UCS Blog - The Equation (text only)

Congress Could Help Farmers, Prevent Pollution, and Reduce Flood and Drought Damage. Will They?

U.S. Department of Agriculture (USDA) Natural Resources Conservation Service (NRCS) Soil Conservationist Garrett Duyck and David Brewer examine a soil sample on the Emerson Dell farm near The Dalles, OR. USDA NRCS photo by Ron Nichols.

The news lately has been full of Congressional battles—healthcare, the debt ceiling, and now tax “reform” (ahem)—and it’s starting to seem like Congress is only interested in blowing things up. But a huge legislative effort is gaining steam on Capitol Hill, one that is likely to have general bipartisan support, though you probably haven’t heard nearly as much about it. I’m talking about the next five-year Farm Bill—which really should be called the Food and Farm Bill, as it shapes that sprawling economic sector worth more than 5 percent of US GDP, and which Congress must reauthorize by September 30, 2018.

In this first of a series of posts on the 2018 Farm Bill, I look at how this legislation could do more to help farmers conserve their soil, deliver clean water, and even reduce the devastating impacts of floods and droughts, all of which would save taxpayers’ money.

Farm conservation works

Since 1985, the Farm Bill has promoted stewardship of soil, water, and wildlife by directing funding to a variety of US Department of Agriculture (USDA) conservation programs. These programs provide financial incentives and technical assistance for farmers and ranchers to protect their soil and store carbon by planting cover crops, reduce fertilizer and pesticide use by rotating a mix of crops, capture excess fertilizer and add wildlife habitat by planting perennial prairie strips in and around vast cornfields, and even take environmentally sensitive acres out of farming altogether.

Recent UCS analysis has shown that farm practices like these lead to positive environmental outcomes while maintaining or increasing farmers’ yields and profits and saving taxpayers’ money.

And our latest report, Turning Soils into Sponges, reveals a surprising additional benefit: growing cover crops and perennial crops can make farmers and downstream communities more resilient to the effects of floods and droughts. The report demonstrates that these practices—which keep living roots in the soil year-round—result in healthier, “spongier” soils that soak up more water when it rains and hold it longer through dry periods. Using these practices, farmers can reduce rainfall runoff in flood years by nearly one-fifth, cut flood frequency by the same amount, and make as much as 16 percent more water available for crops to use during dry periods. But farmers need help to do it.

A changing climate demands more conservation, not less

So it was a real step backward when the 2014 Farm Bill cut the very programs that help farmers build healthy soil and prevent pollution. That bill cut the USDA’s Conservation Stewardship Program (CSP), for example, by more than 20 percent. A USDA official recently told a Senate committee that CSP is “greatly oversubscribed” and must turn away thousands of farmers who want to participate.

(Incidentally, the Senate will hear this week from President Trump’s nominee to lead the USDA’s conservation efforts, whose conservation record as Iowa Secretary of Agriculture has been mixed.)

Meanwhile (surprise!) the problems that on-farm conservation can help solve are not going away by themselves. Midwestern farm runoff has led to deteriorating water quality from Iowa to the Gulf of Mexico. And climate change will only worsen water quality and increase the frequency and severity of floods and droughts.

The latter is particularly bad news for farmers, and for all of us. A new report from the USDA’s Risk Management Agency, which operates the taxpayer-subsidized federal crop insurance program, shows that losses from drought and flooding were to blame for nearly three-quarters of all crop insurance claims paid to farmers and ranchers between 2001 and 2015.

Farmers are adopting conservation practices, and policy support is growing

For example, earlier this year researchers at Iowa State University released the results of their 2016 Iowa Farm and Rural Life Poll, which asked farmers across the state about conservation practices they used between 2013 and 2015. Nearly half (44 percent) reported an increase in the use of practices to improve soil health, with 20 percent reporting they’d increased their use of cover crops.

Meanwhile, the National Farmers Union (NFU), which represents family farmers and ranchers across the country, has become increasingly vocal about the need for USDA programs and research to help farmers build soil health and cope with climate change. And taxpayer advocates have lent their voice to the call for stronger requirements for on-farm conservation as a condition of participating in the federal crop insurance program (so-called conservation compliance). A number of states have undertaken healthy soil initiatives, and some observers expect soil health to get more attention in this Farm Bill, as it should.

Congress: Don’t ask farmers to do the impossible

To recap: farm conservation works, farmers want to do it, and we all need more of it to cope with a changing climate and the floods, droughts, and escalating costs it will bring. So why wouldn’t Congress invest more?

As usual, budget-cutting fever is the problem. The Trump administration’s proposed USDA budget reductions shocked farmers and their allies in Congress last spring, cowing even the powerful Republican chair of the Senate agriculture committee, who warned that the 2018 Farm Bill will need to “do more with less.” That’s a silly thing to say, of course…with most things in life, doing more requires, well, more. For farm conservation, that means financial incentives and technical assistance for more farmers and more acres, along with more monitoring to ensure that it’s getting results.

That’s why UCS joined with NFU and two dozen other organizations in outlining our collective conservation priorities for the 2018 Farm Bill. These include a substantial increase in funding for USDA conservation programs including CSP, along with additional monitoring and evaluation of outcomes, better enforcement of conservation compliance, and improvements in the federal crop insurance program to remove barriers to conservation.

As Congress debates the Farm Bill in the coming months, UCS will be urging them to see farm conservation programs for what they are—critical programs to help farmers stay profitable today while preventing pollution, improving resilience, and avoiding more costly problems down the line.

In short, an excellent investment in our future.

Why Going 100% Electric in California Isn’t as Crazy as it Might Seem

Electric vehicle charging stations line the perimeter of San Francisco's City Hall. Photo: Bigstock.

California’s top air pollution regulator, Mary Nichols, made headlines last week after making comments to a Bloomberg reporter about the possibility of banning gasoline cars in California. Shortly after that, California Assembly member Phil Ting announced he would introduce state legislation to do just that. Skeptics may raise their eyebrows, but if California is going to meet its long-term climate and air quality goals, then nearly all future cars and trucks must be powered by renewable electricity and hydrogen. The good news is the state is already on this path.

Our health and our climate depend on vehicle electrification

It’s no secret that widespread vehicle electrification is needed to meet California’s climate and air quality goals. In 1990, the first Zero Emission Vehicle program was adopted – an acknowledgment that vehicles with zero tailpipe emissions were necessary to ensure healthy air in a state with a growing population and a whole lot of cars.

Climate change has only added to the importance of vehicle electrification, which takes advantage of the efficiency of electric motors and the ability to power vehicles with renewable electricity or hydrogen (fuel cell vehicles have an electric motor and zero tailpipe emissions, similar to battery electric cars).

The state’s recent assessment of the vehicle technologies needed to meet our climate and air quality goals shows the importance of widespread vehicle electrification, suggesting that all new car sales should be electric by 2050 (including plug-in hybrids, or PHEVs). A national assessment, Pathways to Deep Decarbonization in the United States, and a California assessment also point out that a large-scale transition to electric vehicles (EVs) is needed to achieve the emission reductions required to avoid dangerous climate change.

Figure 1: From a presentation by staff to the Air Resources Board in March 2017 showing that by 2050 the majority of cars on the road – and all of new car sales – are powered by electric motors.

Banning gasoline and diesel gains popularity  

In the wake of VW’s Dieselgate, and with the impacts of climate change becoming more and more apparent, banning the sale of internal combustion vehicles is becoming a popular policy choice around the world, with France, Britain, India, and China all making big splashes with recent commitments to eliminate them at some point in the future.

With these strong commitments gathering steam, some might ask if California is somehow losing its leadership on EVs. California isn’t losing its leadership; it’s starting to share it with many more parts of the globe. This is great news, as increased global demand for EVs will help drive down technology costs for everyone and help automakers recoup their investments in EV technology faster.

But is going to 100% electric vehicles practical? It might be hard to imagine a time when every car at your local dealership will be electric. But there are reasons to be bullish on the future of EVs. Battery prices are dropping, with estimates that EVs could reach cost parity with gasoline vehicles sometime in the 2020s. And recent announcements by major manufacturers like Ford, GM, Volvo, VW, and others about expanding electric vehicle line-ups over the next 5 years indicate the industry is betting on growth opportunities.

Figure 2: As recently noted in a blog by my colleague David Reichmuth,  battery costs are declining and approaching the point where EVs achieve cost parity ($125-150 per kWh).

California is taking the right steps to make electric cars an option for more and more drivers

In addition, California is implementing policies to support the deployment of EVs.  There’s a long list, but some of the most critical are direct consumer rebates, incentives targeting low- and moderate-income households, utility investments to support the deployment of EV charging infrastructure, the Low Carbon Fuel Standard, and the Zero Emission Vehicle program, which requires automakers to bring EVs to market. Meanwhile, California’s relatively clean electricity grid means that driving an EV results in global warming emissions equivalent to a 95 mile-per-gallon gasoline car. As California increases its reliance on electricity from renewable sources, emissions will continue to decline.
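The “95 mile-per-gallon equivalent” framing above is just arithmetic on emissions per mile. Here is a minimal sketch of that calculation; the grid intensity and EV efficiency figures below are illustrative assumptions, not official California values.

```python
# Sketch of the mpg-equivalent arithmetic behind claims like the one above.
# The grid and vehicle numbers are illustrative assumptions.

GASOLINE_CO2_G_PER_GALLON = 8887  # EPA estimate for burning one gallon of gasoline

def mpg_equivalent(grid_g_co2_per_kwh, ev_kwh_per_mile):
    """Fuel economy a gasoline car would need to match an EV's
    per-mile global warming emissions on a given grid."""
    ev_g_per_mile = grid_g_co2_per_kwh * ev_kwh_per_mile
    return GASOLINE_CO2_G_PER_GALLON / ev_g_per_mile

# Illustrative inputs: a fairly clean grid (~300 g CO2/kWh) and an EV
# consuming ~0.30 kWh per mile
print(round(mpg_equivalent(300, 0.30)))  # prints 99
```

As the grid intensity falls (California’s renewable portfolio standard pushes it down every year), the mpg-equivalent figure rises, which is the point made above about emissions continuing to decline.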

Long-term goals must be matched with near-term action

Adopting a ban on gasoline and diesel cars would certainly send a strong long-term signal that powering electric vehicles with clean energy is our ultimate destination. It could focus policy makers’ and regulators’ efforts on supporting the transition and give automakers, charging companies, utilities, and entrepreneurs a vision and long-term target for the future to guide their investments.

However, it’s the near-term efforts to make EVs more accessible to all Californians that will accelerate the transition. That means expanding current programs targeted toward individuals and businesses who buy or use new and used cars and increasing access to charging. And it also means supporting electrification for those who rely on other modes of transportation too (see my colleague Jimmy’s blog on electric buses).

A future without internal combustion engine cars is consistent with a future of clean air and minimizing climate impacts. Ultimately, for a transition to a clean, electric transportation system to succeed, the system needs to be better than the one we have today. And it’s the policies we implement today that will drive the investments needed to reach a tipping point, a point where choosing the EV is a no brainer for whomever is shopping for a car.

 

Can Science (and The Supreme Court) End Partisan Gerrymandering and Save the Republic? Three Scenarios

Photo: Wikimedia Commons

On October 3, the US Supreme Court will hear a case concerning the state of Wisconsin’s legislative districts that could resolve a pending constitutional crisis and dramatically improve electoral representation.

At the center of the dilemma is the applicability of a scientific standard to measure discrimination resulting from district boundary manipulation. What’s new in this case is that social scientists have developed a standard. But what the court will do with it is anybody’s guess. So let’s guess.

We have a scientific standard that is discernible and manageable

Social scientists have been hard at work since 2004, when the Supreme Court issued a fragmented, 5-4 decision in Vieth v. Jubelirer holding that “plaintiffs failed to establish a standard” to determine when partisan gerrymandering has gone too far. The analytical tools for estimating various forms of partisan discrimination have dramatically improved since Vieth, as described in one of the many amicus briefs submitted to the court.

Consensus has emerged around partisan asymmetry as a scientific standard that is both discernible (logically grounded in constitutional protections) and manageable (so that courts can apply it). It measures any difference in the percentage of seats that a given percentage of voters (say 50%) receive, depending on what party they vote for. Asymmetries can be easily estimated with actual election results and computationally simulated vote swings across districts, along with measures of statistical confidence.

Similarly, the mean-median test, comparing each party’s actual vote share in its median district to overall mean vote share, is another way of estimating asymmetries between voters. There are important theoretical and methodological differences between various measures, including the efficiency gap, which compares “wasted” votes between parties. But all are empirically accurate at identifying partisan bias where it matters most: in competitive states where voters from one party have a major seat advantage.
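The measures described above are straightforward to compute. Below is a sketch of simplified versions of the mean-median test and the efficiency gap; the five-district vote counts are hypothetical examples, not real Wisconsin data, and real analyses add statistical confidence measures on top of this arithmetic.

```python
# Simplified versions of two gerrymandering metrics discussed above.
# District vote totals below are hypothetical, for illustration only.

def mean_median_gap(vote_shares):
    """Mean-median test: a party's mean district vote share minus its
    median district vote share. A large positive gap suggests the
    party's voters are packed into a few lopsided districts."""
    shares = sorted(vote_shares)
    n = len(shares)
    mean = sum(shares) / n
    median = shares[n // 2] if n % 2 else (shares[n // 2 - 1] + shares[n // 2]) / 2
    return mean - median

def efficiency_gap(district_results):
    """Efficiency gap: the difference in 'wasted' votes between two
    parties as a share of all votes cast. district_results is a list
    of (votes_A, votes_B) tuples, one per district."""
    wasted_a = wasted_b = total = 0.0
    for a, b in district_results:
        total += a + b
        threshold = (a + b) / 2  # votes needed to win (simplified)
        if a > b:  # A wins: all of B's votes wasted, plus A's surplus
            wasted_a += a - threshold
            wasted_b += b
        else:
            wasted_b += b - threshold
            wasted_a += a
    return (wasted_a - wasted_b) / total

# Hypothetical 5-district state: party A is packed into one district,
# so A wins 54% of votes statewide but only 1 of 5 seats.
results = [(90, 10), (45, 55), (45, 55), (45, 55), (45, 55)]
print(round(efficiency_gap(results), 3))                          # prints 0.38
print(round(mean_median_gap([0.90, 0.45, 0.45, 0.45, 0.45]), 3))  # prints 0.09
```

Both numbers far exceed the thresholds floated in the literature (a mean-median difference above 5 percent or an efficiency gap above 7 percent), which is what makes maps like this hypothetical one easy to flag.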

However, the fact that a standard has emerged is no guarantee that it will be adopted. Attention will focus on convincing Justice Anthony Kennedy, who welcomed the discovery of “workable standards” as the swing vote in Vieth. His level of satisfaction with these results is likely to drive the justices toward one of the following three scenarios.

Scenario one: Kennedy keeps the Supreme Court out of the thicket

In a crushing defeat to defendants and electoral reformers in both parties, Justice Kennedy is unpersuaded, leading to another 5-4 decision in which the more liberal justices (Ginsburg, Breyer, Kagan, and Sotomayor) agree that symmetry is a workable standard, but they don’t have the votes. A plurality of the court’s conservatives either dismiss outright the idea that courts ought to be entering the political thicket of partisan competition, or they reassert a version of Antonin Scalia’s Vieth opinion, holding that symmetry is a standard measuring discrimination against parties, not people, with only the latter having constitutional rights (although it has been demonstrated that symmetry reflects individual political equality).

Kennedy writes a concurring opinion with the conservatives, articulating a more nuanced failure on the part of plaintiffs to specify “how much is too much,” as both plaintiffs and most of the scientific briefs submitted explicitly placed responsibility for specifying a threshold of unconstitutional discrimination with the courts. Kennedy could also point to in-fighting among political scientists over our favored measures as a lack of consensus. Talk about a tragedy of the commons.

Scenario two: Wisconsin’s districts are thrown out, but the real work is left for future courts

A focused interrogation by Kennedy results in a majority opinion that overturns Wisconsin’s gerrymandered map. Several measures of bias are incorporated into a multi-pronged test that verifies if 1) the district boundaries caused the observed discrimination (asymmetry), and 2) the extent of asymmetry is not likely to be reduced through changing voter preferences. That is, even a “wave” of public opposition would allow the entrenched party to hold power.

However, the majority does not go so far as to prescribe a general threshold for “how much is too much” gerrymandering. There is no precise level of necessary asymmetry, or responsiveness, or competitiveness specified that constitutes a violation of equal protection or free speech. Standards are left to emerge through future cases, of which there are many. Some version of this outcome seems most likely, given the scientific consensus, the level of extreme gerrymandering witnessed in the 2011 redistricting cycle, and the bipartisan response to it.

Scenario three: A precise standard is adopted with clear direction for lower courts

In this third, and probably least likely scenario, the justices not only establish a multi-prong test to identify unconstitutional partisan discrimination, they also specify the degree of relief that discriminated voters are entitled to. The question of “how much is too much” discrimination is answered precisely through a specific measure, either when asymmetry would result in a single seat change, or change in majority control of a legislative body, or a mean-median difference greater than 5 percent (which is rare) or an efficiency gap greater than 7 percent (which is also rare), etc.

The court could apply the breadth of knowledge that we have to specify thresholds of tolerance, below which any hypothetical districting plan would be invalidated. But because the process of districting involves maximizing numerous conflicting principles, such as geographic compactness and bias, the justices are unlikely to go this far, at this time. And only time will tell if a more cautious approach will be adequate.

Can a constitutional crisis be averted?

If the Supreme Court fails to rein in partisan gerrymandering, the fundamental democratic principle of majority rule is undermined. The Electoral College has enabled minority control over the executive, and majority control of the Senate has been determined by a minority of voters due to the underrepresentation of large states.

In 2018, a majority as large as 56 percent of Americans could vote against the governing party (currently the Republican Party) in the House of Representatives, while they retain control of a majority of (gerrymandered) seats. It is up to the Supreme Court to re-establish a republic “of the people.”

President Trump is About to Give a Speech That Directly Undermines Science

Next week, President Donald Trump is going to deliver a speech highlighting his only major policy “achievement” to date: sidelining science and rolling back critical public health, safety, and environmental protections. You’re probably going to hear a lot about the president’s absurd executive order that requires agencies to cut two regulations (aka public health protections that provide us clean air, safe consumer products, and more) for every new one issued and you’ll probably hear some muddled thinking and misinformation about the cost of regulations and all the rules the administration has reversed.

The president might even frame all this deregulatory talk as “winning.”

But this is not what his speech is about. This speech is about President Trump and his administration sidelining science-based safeguards, stripping away vital public health, safety, and environmental protections from the American people. These are regulations that keep our air and water clean, our food safer to eat, our household products and our kids’ toys safer to play with, and our workers safer at work. And it is these regulations that can and should have the greatest positive impact on low-income communities and communities of color, who are often disadvantaged and facing some of the worst public health and environmental threats.

Deregulation = Real world impacts

We’ve already seen the administration’s deregulatory policies in action. Earlier this year, the administration delayed updates to the Risk Management Program, designed to enhance chemical risk disclosure from industrial facilities and improve access to information for first responders, workers, and fenceline communities, all while encouraging the use of safer technologies.

After Hurricane Harvey hit Houston, Arkema’s chemical plant in Crosby exploded, highlighting the importance of this public protection. People were forced to stay away from their homes, first responders suffered injuries and weren’t informed about the dangerous chemicals being stored there (and are now suing Arkema), and Harris County had to divert critical resources from hurricane recovery efforts to respond to the explosion (and is now also suing Arkema).

This is just one example of how sidelining science and rolling back safeguards can negatively impact communities across the country. In a recently released report, UCS chronicled several examples of how the administration has delayed many science-based rules and weakened protections from hazards at work and home. This is what the president’s speech is about.

Science-based policymaking ¯\_(ツ)_/¯

This administration has shown zero interest in evidence-based policymaking. Even when it comes to rolling back regulations, the administration has used inaccurate information to support its actions. In other instances, it has simply used misleading information to support its delay tactics. The Environmental Protection Agency (EPA), whose mission is to protect human health and the environment, has an administrator who is only interested in meeting with representatives from regulated industries, instead of meeting with independent scientists and communities who need the federal government to step up and implement strong protections.

What the administration is focused on, though, is using any means available to it to invalidate public health protections that took years to develop.

All this flies in the face of how science-based policymaking should happen. You look at the threat, the scientific and technical evidence, and then figure out how to mitigate it and ensure the public is not in danger. You don’t arbitrarily decide which public protections should stay in place and which should be rolled back. Nor should our government only take input from vested interests who favor their bottom line over protecting the public.

But that is the Trump Doctrine on regulations. And for this reason, scientists need to continue to watchdog the administration’s actions and hold agencies accountable to ensure that we have science-based protections in place and policies are based on facts, not politics.

Threats are threats. They cannot be addressed only when another public protection is no longer on the books. In the future, if the Food and Drug Administration were to issue a rule to ensure safe food, should the EPA be forced to roll back standards for clean water?

The bottom line is when it comes to protecting public health, the ideas championed by President Trump make no sense. Regulations matter, and protecting the system of evidence-based policymaking matters. The only thing President Trump’s speech will be good for is to show the American people how many losses we have taken in the first 10 months of this administration.

One Lesson For DOE From Harvey & Maria: Fossil Fuels Aren’t Always Reliable

Photo: Chris Hunkeler/CC BY-SA (Flickr)

The US Department of Energy has proposed that paying coal plants more will make the grid reliable. But last month, three feet of rain from Hurricane Harvey at a coal plant in Fort Bend County, Texas, complicated the messaging around the reliability of fossil fuels in extreme weather. The vulnerability of power grids to storm damage is also on horrible display in Puerto Rico in the aftermath of Hurricane Maria.

Past studies by the Union of Concerned Scientists have highlighted risks from worsening storms and grid issues. The demonstrated risks are in the wires, not the types of power plants.

The damage and hardships in Puerto Rico are expected to exceed past US storm impacts when measured by the number of people without service and the duration of the outage. Those past storms stirred efforts to make the power system more reliable and resilient to extreme weather.

Recently, new debates have arisen over the more contentious but less relevant (and erroneous) argument that “base-load” plants are the single best provider of grid reliability. In a market where coal-burning plants are losing money and closing, coal’s champions argue that a long list of coal’s reliability features are unique and valuable. Now that the owner of the W.A. Parish plant in south Texas has reported shifting 1,300 MW of capacity from coal to gas because rainfall and flooding disrupted power plant operations in the aftermath of Hurricane Harvey, yet another of these claims about the unique advantages of coal for electricity has been muddied by facts.

Plant owner NRG reported to the Public Utility Commission of Texas that W.A. Parish units 5 and 6 were switched to burn natural gas because water had saturated the coal. The subbituminous coal stored on site is supposed to be a reliability advantage, according to those pushing coal. As that debate heats up (the DOE is seeking vague and unspecified changes to electricity market compensation for plants that keep a fuel supply on-site), the too-simple notion that reliability is created by individual power plants, rather than by grid operations that integrate all sources, will be put to the test.

Some policymakers have asserted that solid fuel stored on-site is superior to natural gas, wind, and solar. Oil is a player too: although it’s a very small part of the electricity fuel supply in the mainland US, that’s not the case in places like Puerto Rico, Hawaii, or the interior of Alaska, where it’s the primary fuel.

People in Puerto Rico use oil to fuel private back-up generators. This too is not unique. Hospitals, police stations, and other pieces of critical infrastructure have historically relied on backup generators powered by fossil fuels for electricity during blackouts. But that approach requires steady, reliable access to fuel. Puerto Rico is now experiencing a fuel supply crisis, as breakdowns throughout the supply chain have made it extraordinarily difficult to keep up with demand across the island. After Sandy damaged the New Jersey–New York metropolitan area, many subsequent crises arose because so many back-up generators there failed, in some cases for lack of fuel deliveries.

Fortunately, renewable energy and battery storage technology have advanced rapidly since Sandy and the Japanese earthquake that destroyed the Fukushima nuclear plant. Solar panels combined with energy storage are now a viable alternative to back-up generators. This combination has a great advantage over back-up oil-burning: it provides economic savings all year while also serving in an emergency. Even apartment buildings and low-income housing can gain the benefits of solar-plus-storage as a routine and emergency power supply.

Puerto Rico has a great solar resource, and the sun delivers on schedule without regard to the condition of the harbors or roads. Additional back-up power supplies there should be built from solar-plus-storage, so the people depending on electricity need not worry about fuel deliveries, gasoline theft, or dangers from fuel combustion. In Texas, the grid has already absorbed more wind power than any other US state. The next energy boom in Texas will be solar.

These are real resiliency and reliability improvements.

Photo: Chris Hunkeler/CC BY-SA (Flickr)

Pointless Delay to the Added Sugar Label Keeps Consumers in the Dark

In another frustrating example of undermining science-based protections, the FDA this morning proposed delaying compliance for revisions to the Nutrition Facts label.

Most food companies were supposed to roll out their revised labels by July 2018. Under the delay, larger companies would have until January 2020 and smaller companies until January 2021.

I have been dreading this official announcement all year and hoping—as more and more products I see in stores have updated their labels—that the FDA would acknowledge that its original rule was perfectly reasonable and has already given companies ample time to comply.

In December, food industry leaders proposed two different riders to draft House appropriations legislation that would have delayed the rule. Luckily, those failed to make it into final language.

Then, in April at now-FDA Commissioner Scott Gottlieb’s confirmation hearing, he implied that he might delay the revised nutrition facts label. I urged Gottlieb to keep the compliance dates as a part of the final rule that was issued in 2016.

Once confirmed, Gottlieb faced what I would consider a pretty clear-cut decision: implement a rule grounded in clear science on the public health consequences of excessive added sugar consumption (and supported by the expert-driven Dietary Guidelines recommendations), or kowtow to industry wishes and delay the rule, even though the majority of food companies would have had until 2019 to make the new changes to their labels, and larger food companies like Mars, Inc. and Hershey Co. have already met the deadline or are on track to meet it.

In fact, according to the Center for Science in the Public Interest, at least 8,000 products from a variety of companies already bear the new label.

A few months later, the FDA announced its intention to push back compliance dates, but gave no formal decision or indication of how long the delay would be. I again urged Gottlieb not to take a step backward on food label transparency by delaying the new label.

Despite what some food companies would have you believe, they have had plenty of time to accept the science on added sugar consumption and to give consumers the information they have been clamoring for. The FDA first began its work to revise the nutrition facts label in 2004, and the proposed rule, which included the added sugar line, was issued in 2014. Industry has had over ten years to give consumers the information they want to make informed decisions, and to acknowledge the mounting evidence that excessive sugar consumption can lead to adverse health consequences, including heart disease, obesity, diabetes, and hypertension.

Instead, as we demonstrated in a 2015 analysis of public comments on the FDA’s proposed rule, the majority of unique comments supported the rule (99 percent of supportive comments came from public health experts), while 69 percent of the comments opposing the rule came from the food industry. The companies’ reasons for opposition included flimsy arguments about consumers’ ability to understand nutrition labels.

Last week, we signed onto a letter along with twenty other science, public health, and consumer organizations urging Gottlieb to let the rule move forward. As we wrote in the letter, this delay means that “an entire cycle of the Dietary Guidelines for Americans will have passed without the federal government’s premier public-health regulatory agency taking final action to implement a major recommendation of the Guidelines.”

It also means that consumers will have to continue to guess how much of the sugar in their food is added, gambling on healthy food purchasing decisions. While asking the agency to delay its labeling rules, the sugar industry seems to understand that it’s actually time to reformulate and meet consumer demand for healthier products to win consumers’ trust. A surefire way to win our trust would have been to move forward with the label, not force us to wait another year and a half for information we have the right to know.

The FDA’s failure to follow the science and listen to public health experts, including HHS staff who helped write the most recent Dietary Guidelines, is incredibly disappointing. We will be weighing in on this decision with comments that will be accepted for 30 days after October 2nd and will update you on how you can tell the FDA to rescind its rule to delay the enforcement dates for added sugar labeling.

Pruitt Guts The Clean Power Plan: How Weak Will The New EPA Proposal Be?

News articles indicate that the EPA is soon going to release a “revised” Clean Power Plan. It is very likely to be significantly weaker than the original CPP, which offered one of the country’s best hopes for reducing carbon emissions that cause global warming.

EPA Administrator Scott Pruitt and President Trump have made no secret about their intent to stop and reverse progress on addressing climate change, so there’s every reason to expect that the revised CPP will be fatally flawed and compromised.

Here’s how we’ll be evaluating it.

How we got here

In August 2015, the EPA issued final standards to limit carbon emissions from new and existing power plants, a historic first-ever step to limit these emissions. Those standards, developed under the Clean Air Act, came about as a result of a landmark 2007 Supreme Court ruling and subsequent Endangerment finding from the EPA.

The final Clean Power Plan (CPP) for existing power plants was projected to drive emissions down 32 percent below 2005 levels by 2030, while providing an estimated $26 billion to $45 billion in net benefits in 2030.

In March 2017, President Trump issued an executive order blocking the Clean Power Plan. He claimed to do so to promote “energy independence and economic growth,” despite the fact that the US transition to cleaner energy continues to bring significant health and economic benefits nationwide. The EPA then embarked on a process of implementing the EO, including initiating a review of the CPP.

The US Court of Appeals for the DC Circuit has granted two stays in court challenges related to the CPP, the most recent of which was issued on August 8 for a 60-day period.  These stays were specifically to give the EPA time to review the rule; this in no way changes the agency’s “affirmative statutory obligation to regulate greenhouse gases.”

The EPA is currently expected to issue a revised CPP by October 7, aiming to head off litigation on this issue. Of course, if the plan it issues is a weak one, as is likely to be the case, there is no question that court challenges will continue.

EPA’s most recent status update filed with the DC Circuit confirms that the agency has sent a draft rule to the Office of Management and Budget, and Administrator Pruitt expects to sign the proposed rule in fall 2017. This will begin a comment period on the new draft rule before it can be finalized.

Five Metrics for Assessing the Revised Clean Power Plan Proposal

While we don’t yet know exactly how the proposed rule will look, there are some key things we’ll be watching for:

1. Will the revised plan cut power sector carbon emissions at least as much as the original CPP?

Not likely. Reports indicate that reductions might be limited to what can be achieved through measures at individual power plants, such as efficiency improvements. (Power plant efficiency improvements, known as ‘heat rate improvements,’ reduce the energy content of the fossil fuel consumed per unit of electricity generated at power plants.)

The associated carbon reductions are going to be relatively small compared to what could be achieved through a power sector-wide approach—including bringing on line cleaner generation resources, increasing demand-side energy efficiency and allowing market-based trading—as was adopted in the original Clean Power Plan. For the final CPP, the EPA estimated that on average nationally a fleet-wide heat rate improvement of approximately 4 percent was feasible, which would result in a fleet-wide CO2 reduction of about 62 million tons in a year. (For context, US power sector CO2 emissions in 2016 were 1,821 million metric tons.)
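To put those two figures side by side, here is a quick back-of-the-envelope calculation (a sketch using only the numbers quoted above):

```python
# Back-of-the-envelope comparison of the numbers quoted above.
fleet_co2_2016 = 1821   # 2016 US power sector CO2 emissions, million metric tons
heat_rate_cut = 62      # estimated annual CO2 cut from a ~4% fleet-wide heat rate improvement

share = heat_rate_cut / fleet_co2_2016 * 100
print(f"Heat rate improvements alone trim roughly {share:.1f}% of annual emissions")
```

In other words, plant-level efficiency tweaks address only a few percent of the sector’s annual emissions, far short of the original CPP’s target of 32 percent below 2005 levels by 2030.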

2. Will it promote renewable energy while heading off an overreliance on natural gas?

An approach that’s limited to carbon reductions at current fossil-fired power plants will miss one of the biggest opportunities to lower power sector emissions: ramp up cheap renewable energy!

The original CPP explicitly called out a role for renewable energy in helping to cost-effectively bring down carbon emissions. UCS analysis shows how boosting renewable energy can help cut emissions affordably while bringing consumer and health benefits. Simply switching from coal to gas, while it does lower carbon emissions at the power plant, is just not going to be enough to achieve the deep cuts in power sector emissions we ultimately need from a climate perspective. Boosting the contribution from renewable energy can help limit the climate, economic and health risks of an overreliance on natural gas.

3. Will it lowball the harms posed by climate change?

Administrator Pruitt seems to understand that legally the EPA is required to regulate carbon emissions and he cannot simply do away with the CPP without replacing it. But will the new plan actually recognize the magnitude of the damages that climate change poses?

Earlier this year, President Trump also issued an executive order undercutting the use of the social cost of carbon (SCC), which measures the costs of climate change (and the benefits of cutting carbon emissions). The SCC served as a proxy for measuring the dollar benefits of carbon reductions from the original CPP. If the re-proposed CPP uses an artificially low SCC, that would fly in the face of the latest science and economics.

4. Will it actually help coal miners get their jobs back?

Not very likely, a fact that even coal company executive Robert Murray and Senator Mitch McConnell have admitted. Market trends are continuing to drive a historic transition away from coal-fired power that is unlikely to change just by getting rid of the CPP.

If the Trump administration and Congress are serious about helping coal miners and coal mining communities, they should invest in real solutions—worker training, economic diversification and other types of targeted resources—to help these communities thrive in a clean energy economy, as my colleague Jeremy Richardson writes.

5. Will it increase pollution?

If the revised proposal attempts to maintain or increase the amount of coal-fired power, that will lead to more air, water and toxic pollution.

In addition to being a major source of carbon emissions, coal-fired power plants are a leading source of emissions of nitrogen oxides, sulfur dioxide, particulate matter, and mercury, among other types of harmful pollution. These pollutants cause or exacerbate heart and lung diseases and can even lead to death. Mercury can affect the neurological development of babies in utero and young children. The Clean Power Plan would have delivered significant health benefits through reductions in these co-pollutants.

Clean energy momentum will continue

Despite Administrator Pruitt’s attempts to undermine the CPP, clean energy momentum will continue nationwide. The facts on the ground are rapidly changing. Market trends continue to drive down coal-fired power because coal is an increasingly uncompetitive option compared to cleaner options like natural gas and renewable energy.

That’s why Xcel CEO Ben Fowke recently said, “I’m not going to build new coal plants in today’s environment,” adding, “We’re investing big in wind because of the tremendous economic value it brings to our customers.”

It’s why Appalachian Power’s Chris Beam also said,

At the end of the day, West Virginia may not require us to be clean, but our customers are (…) So if we want to bring in those jobs, and those are good jobs, those are good-paying jobs that support our universities because they hire our engineers, they have requirements now, and we have to be mindful of what our customers want. We’re not going to build any more coal plants. That’s not going to happen.

The pace of renewable energy growth is particularly striking, with new wind and solar installations outstripping those of any other source of power, including natural gas.

And as my colleague Julie McNamara recently pointed out, energy efficiency is one of the top electricity resources in the US, and in fact was the third-largest electricity resource in the United States in 2015.

That’s why more and more states, cities and businesses are doubling down on their commitment to renewable energy and the goals of the Paris Climate Agreement, saying ‘We’re Still In!’

For all of you who care deeply about our nation’s transition to clean energy, please ask your state legislators to push for more renewable energy even as the Trump administration tries to turn back progress.

We still need robust federal policies

Despite the promising market trends, there’s no denying we need robust federal policies to accelerate the current clean energy momentum and cut US carbon emissions faster and deeper to meet climate goals.

The reality is that the original CPP itself was not strong enough, though it was a pivotal step in the right direction. The US will need to do more, both in the power sector and economy-wide to cut emissions in line with the goals of the Paris Agreement.

A weakened CPP would be a sad step back in our efforts to address global warming. At a time when the risks of climate change are abundantly clear—just consider this year’s terrible hurricane and wildfire seasons—this is no time to delay action.

Administrator Pruitt: Do your job

Mr. Pruitt continues to show a blatant disregard for the mission of the agency he heads, while pandering to fossil fuel and other industry interests. Weakening the power plant carbon standards is just the latest in a long string of actions he has taken to undermine public health safeguards that were developed in accordance with laws Congress has passed.

Furthermore, he has repeatedly attacked the role of science in informing public policy. Perhaps most egregiously, he continues to deny the facts on climate change. (If he is genuinely interested in understanding the latest science, he need look no further than the US National Academy of Sciences.)

Administrator Pruitt, stop hurting our children’s health and future. Do your job—and start by setting strong carbon standards for power plants.

 

Photo: justice.gov

Happy 40th, SNAP! Celebrating Four Decades of Effective Nutrition Assistance

Happy birthday to the Supplemental Nutrition Assistance Program as we know it!

SNAP, it’s hard to believe it was only 40 years ago that President Carter made you into a better, stronger safety net by signing the Food Stamp Act of 1977. Of course, you’re grown now, and you know it takes more than one person to make a law. You were really born out of the hard work and bipartisanship of Senators George McGovern and Bob Dole—two legislators who loved effective anti-hunger legislation very, very much, and who improved the Food Stamp Act of 1964 by eliminating required payments for food stamp users and fine-tuning eligibility.

Naturally, some things have changed over 40 years

Like your name. You went through that phase where everybody called you “food stamps,” and we supported you, but “SNAP” really does suit you better.

You’ve also seen a host of changes come and go related to program eligibility, work requirements, and nutrition education funding—many of which continue to be subjects of debate.

And technology keeps barreling forward. You’ve seen the amazing things it can do—watching as schools handily adopt data matching technologies you’d never dreamed of having—and some days you feel like you’re getting the hang of it, like when you finally transitioned from paper stamps to an electronic benefit system. (Other days you’re calling your daughter-in-law because you once saw her set up a Roku in ten minutes and boy could you use her help with this.)

But some things have stayed the same

You’ve been there for the American people, unfailingly, through all the ups and downs of economic recovery and recession, changes in administration and leadership, and even that time Representative Steve King said that mean and totally untrue thing about you right to your face. (Sorry again. No one likes him, if that makes you feel better.)

You were there when the 2008 recession hit and 2.6 million Americans lost their jobs—many unexpectedly—and in the years that followed, as “middle-class” jobs became harder and harder to come by and people really needed you for a while.

And even now, amid the devastation of hurricanes and flooding, you are providing food to those who desperately need it through the Disaster Supplemental Nutrition Assistance Program.

Despite what people say, you’re not just a program for “the poor.” You’re a program for all of us, because we are all vulnerable to the unexpected, economic crises and natural disasters included, and you understand that.

The best thing about getting older?

Take it from an organization that hit 40 a few years ago—the best thing about getting another year older is realizing that the people you’ve supported, through thick and thin, are here to support you too.

And one of the best things about the farm bill is that it gives us a chance to do just that.

On behalf of the 21 million American households you serve, and the millions more who know you’ll be there when they need you: Happy Birthday, SNAP.

Will Scott Pruitt Tap Polluter-Friendly Scientists for Key Advisory Panel?

Photo: Wikimedia

A third of the members of the Environmental Protection Agency’s Science Advisory Board, an influential panel that reviews the science the agency uses in formulating safeguards, could be replaced by climate science-denying, polluter-friendly candidates when their terms expire at the end of this month.

The board, which has been in existence for nearly 40 years, is traditionally populated by bona fide scientists from academia, government, and industry who volunteer to serve three-year terms. This time around, as first reported by E&E News, at least a dozen of the 132 candidates vying for one of the 15 open seats reject mainstream climate science. But that’s not all. There are at least 10 other equally inappropriate candidates on the list, and not all of them are scientists, despite the fact that it’s supposed to be a panel of science advisers.

Among the 12 climate science deniers are Weather Channel co-founder Joseph D’Aleo, who wrongly claims global warming is due to natural oceanic, solar, and volcanic cycles; and former Peabody Energy science director Craig Idso, now chairman of his family’s Center for the Study of Carbon Dioxide and Global Change, who insists “there is no compelling reason to believe that the rise in [average earth] temperature was caused by the rise in carbon dioxide.” D’Aleo, Idso, and six of the other climate-fact-challenged candidates are affiliated with the fossil fuel industry-funded Heartland Institute, which has a long history of misrepresenting science.

The other 10 unsuitable candidates consistently side with industry when it comes to protecting the public from toxic hazards, regardless of the scientific evidence, and falsely accuse the EPA of being unscientific to try to undermine its credibility.

Soot makes you live longer

One of the 10, toxicologist Michael Honeycutt, failed to secure a seat on the EPA’s seven-member Clean Air Scientific Advisory Committee when he was nominated for one last fall—with good reason. Over the last decade, Honeycutt, who heads the toxicology division of the Texas Commission on Environmental Quality, rolled back the state’s relatively weak protections for 45 toxic chemicals, including arsenic, benzene, formaldehyde, and hexavalent chromium, the carcinogen that made Erin Brockovich a household name.

Honeycutt also has attacked EPA rules for ground-level ozone (smog), which aggravates lung diseases, and particulate matter (PM) (soot), which has been linked to lung cancer, cardiovascular damage, reproductive problems, and premature death. In October 2014, Honeycutt argued that there would be “little to no public health benefit from lowering the current [ozone] standard” because “most people spend more than 90 percent of their time indoors” and “systems such as air conditioning remove it from indoor air.” And despite the overwhelming scientific evidence directly linking fine soot particles to premature death, Honeycutt testified before Congress in June 2012 that “some studies even suggest PM makes you live longer.”

Better living through chemistry

Another industry-friendly nominee, Kimberly White, is senior director of chemical products at the American Chemistry Council (ACC), the country’s largest chemical manufacturing trade association. Representing the interests of 155 corporate members, including chemical companies Dow, DuPont, and Olin; pharmaceutical firms Bayer, Eli Lilly, and Merck; and petrochemical conglomerates BP, ExxonMobil, and Shell, the ACC has delayed, weakened, and blocked science-based health, environmental, and workplace protections at the state, national, and even international levels.

For example, the ACC has lobbied against establishing federal rules on silica dust exposure and disclosing the chemicals used in hydraulic fracturing. It has been instrumental in limiting community access to information about local chemical plants. And it has played a key role in quashing government efforts to regulate bisphenol A (BPA), an endocrine-disrupting chemical used in plastics and can linings; flame retardants, which have been linked to birth defects and cancer; and formaldehyde, a known carcinogen. White downplayed formaldehyde’s risks in a September 2016 blog on the ACC website.

The ACC also lobbies to weaken existing environmental safeguards. In written testimony for a House Science, Space and Technology Committee hearing last February, for example, White charged that the EPA uses irrelevant or outdated data and procedures when drafting new regulations.

Who needs a cleaner environment?

Finally, three of the pro-polluter candidates are economists with a distinct corporate tilt: Richard Belzer, whose clients include the American Chemistry Council and ExxonMobil Biomedical Sciences; Tony Cox, whose clients include the American Petroleum Institute, Chemical Manufacturers Association, and Monsanto; and John D. Graham, dean of Indiana University’s School of Public and Environmental Affairs, who is currently doing contract work for the Alliance of Automobile Manufacturers on fuel economy standards and the libertarian Searle Freedom Trust on regulatory “reform.” All three emphasize the cost to industry of reducing pollution, discount scientific evidence of the risks of exposure, and ignore the benefits of a cleaner environment.

Perhaps the best known is Graham, who ran the Office of Management and Budget’s (OMB) Office of Information and Regulatory Affairs (OIRA) for five years during the George W. Bush administration. His appointment to that position was hotly contested because in his previous job, directing the Harvard Center for Risk Analysis, he routinely understated the dangers of products manufactured by the center’s corporate sponsors by using questionable cost-benefit analyses.

As predicted, Graham applied that same simplistic, industry-friendly calculus at OIRA, which oversees all government rulemaking, and at the tail end of his tenure in 2006, he unsuccessfully attempted to standardize risk assessments across all federal agencies. Public interest groups and the scientific community, spearheaded by the American Association for the Advancement of Science, came out in full force against the idea, and a National Research Council (NRC) committee unanimously rejected it as “fundamentally flawed.”

“Economists like Graham are frustrated because the EPA has been conservative about risk,” said Center for Progressive Reform co-founder Rena Steinzor, who wrote a stinging indictment of Graham’s government-wide proposal in a May 2006 issue of Inside EPA’s Risk Policy Report. “The EPA gives more margin to safety. That drives economists crazy. They think it leads to over-protection. But there are not many examples of chemicals that turn out to be less harmful than we thought.”

Foxes advising the foxes in the henhouse?

Putting climate science deniers and industry apologists on the EPA Science Advisory Board (SAB) would not only undercut the panel’s legitimacy, it also would provide cover for the corporate shills now in key positions at the agency, starting with Administrator Scott Pruitt, who has the final say on who is selected, and Nancy Beck, a deputy assistant administrator who most recently worked for the American Chemistry Council, and before that, for Graham at OMB.

“The Science Advisory Board has been providing independent advice to the EPA for decades, ensuring that the agency uses the best science to protect public health and the environment,” said Genna Reed, a policy analyst at the Union of Concerned Scientists. “SAB members have always been eminent scientists who are committed to the often-challenging public service of working through complex scientific topics to help guide EPA decision-making. They are the EPA’s scientific compass. The agency’s mission to safeguard our air and water will be further compromised if Administrator Pruitt winds up selecting these unacceptable candidates.”

Get involved! Submit a comment to EPA by Thursday!

You can submit comments about the EPA Scientific Advisory Board nominees by email to Designated Federal Officer Thomas Carpenter no later than close of business on Thursday, September 28, at carpenter.thomas@epa.gov. (Note that public comments are subject to release under the Freedom of Information Act.)

Tell the EPA that the following candidates are unacceptable for the Science Advisory Board:

Climate-science-denier nominees: Edwin Berry, Alan Carlin, Joseph D’Aleo, Keven Dayaratna, Paul Dreissen, Gordon Fulks, Craig Idso, Richard Keen, David LeGates, Anthony Lupo, David Stevenson and H. Leighton Steward.

Pro-polluter nominees: Richard Belzer, James Bus, Samuel Cohen, Tony Cox, James Enstrom, John D. Graham, Michael Honeycutt, Walt Hufford, James Klaunig and Kimberly White.

Who Not to Pick for the EPA’s Science Advisory Board

In its effort to fill fifteen seats on the EPA Science Advisory Board (SAB), the agency has posted a list of 132 nominees. The SAB is a group of more than forty scientists, experts in a range of disciplines, who provide peer review and expert advice on EPA issue areas.

While many of the nominees are highly qualified and distinguished in their fields, a handful of individuals are deeply concerning due to their direct financial conflicts, their lack of experience, and/or their historical opposition to the EPA’s work in advancing its mission to protect public health and the environment.

The SAB was established by the Environmental Research, Development, and Demonstration Authorization Act of 1978 and operates as a federal advisory committee under the Federal Advisory Committee Act of 1972. Note that board members should be experts in their fields, with the training and experience to evaluate EPA-relevant scientific and technical matters. Source: U.S. GPO

Many of these concerning individuals were nominated by the Heartland Institute—an organization that has actively worked to sow doubt about climate change science—and carry the seal of approval of Trump EPA transition team member and Heartland staffer Steve Milloy. When interviewed about some of the names on the nominee list, Milloy said he is glad that EPA administrator Scott Pruitt is in office, since Pruitt will be brave enough to reconstitute the SAB. A “thumbs up” from Milloy is an immediate red flag for me.

My colleague, Andrew Rosenberg, categorized questionable political appointees in three distinct buckets: the conflicted, the opposed, and the unqualified. The same can be said of nominees for the SAB. You don’t have to dig too deep to find individuals who may appear to be qualified on paper, but have a track record of undermining the work of the EPA and advancing policies that benefit special interests over the general public. Appointing these individuals to the SAB would be in direct opposition to the critical work of the SAB itself and to the EPA’s mission.

Take Dr. Michael Honeycutt, lead toxicologist at the Texas Commission on Environmental Quality (TCEQ), for example. Industry representatives, including at the American Chemistry Council, ExxonMobil, and the Texas Oil and Gas Association, launched a campaign to get Honeycutt appointed to the EPA’s Clean Air Scientific Advisory Committee (CASAC) in 2016, which fortunately was unsuccessful. Now Honeycutt’s name is on the list for the SAB.

He co-authored an article in 2015 that argued that available science did not support the EPA’s assertion that tighter ozone standards would provide significant public health benefits. In criticizing the scientific studies used by the EPA, Honeycutt has cherrypicked studies to exaggerate uncertainty on risks of ozone pollution, including making hay of the argument that ozone pollution isn’t a huge issue because “most people spend more than 90 percent of their time indoors,” which has been picked up and spouted off by climate deniers, like Michael Fumento.

Honeycutt has also served on the steering committee of the Alliance for Risk Assessment (ARA), along with President Trump’s nominee to head the Office of Chemical Safety and Pollution Prevention, Michael Dourson. The ARA was created by Toxicology Excellence for Risk Assessment (TERA), an organization founded by Dourson that does research for industry and maintains a database of risk assessments.

According to its website, about a third of TERA’s funding comes from the private sector, including the American Chemistry Council and Coca-Cola. Rena Steinzor, a professor at the University of Maryland School of Law, has accused TERA of “whitewashing the work of industry.” The TCEQ awarded TERA at least $700,000 in contracts between 2010 and 2014. As a steering committee member, Honeycutt oversaw ARA scientific reviews of TCEQ work. While Honeycutt claims that he recused himself from those projects, the quagmire of ties between TCEQ, ARA, and TERA is hard to dispute, especially when you consider that during those same years, the TCEQ loosened two-thirds of the already-weak protections for the 45 chemicals it chose to reassess between 2007 and 2014. In 2013, the TCEQ paid $1.6 million to another industry-friendly consulting firm, Gradient, to review the EPA’s science on ozone.

Honeycutt has spent his career at the TCEQ politicizing and actively working to obstruct the science used to inform important standards at the EPA, so it seems out of character for him to want so badly to be a member of an EPA science advisory committee. Unless, of course, he is interested in the platform or in the ability to provide formal advice to his personal friend, Michael Dourson.

What does Honeycutt have in common with fellow nominee Dr. John Graham? Under Graham’s leadership, in January 2006 the White House Office of Management and Budget (OMB) released a proposed Risk Assessment Bulletin that would have covered any scientific or technical document assessing human health or environmental risks.

OMB asked the National Academy of Sciences’ National Research Council (NRC) to conduct an independent review of the document. The NRC’s study gave the OMB a failing grade, calling the guidance “fundamentally flawed” and warning that, if implemented, it would have a high potential for negative impacts on the practice of risk assessment in the federal government. Among the reasons for this conclusion was that the bulletin oversimplified the degree of uncertainty that agencies must factor into all of their evaluations of risk. Yet this idea of standardized risk assessment remains of interest to regulatory reform advocates like Graham. It has made its way into the dangerous Regulatory Accountability Act in Congress and into the new toxic substances rules under the Frank R. Lautenberg Chemical Safety for the 21st Century Act, rules that Graham’s protégé and former ACC staffer Nancy Beck is now crafting from her position as Deputy Assistant Administrator at the EPA.

Before his stint at OMB, Graham led the Harvard Center for Risk Analysis, which notably skewed risk analyses in favor of industry, weighing the costs saved by not regulating against the lives saved by regulating. In one case, Graham’s OMB rejected a National Highway Traffic Safety Administration rule that would have reduced the toll of vehicle rollovers by requiring automakers to install tire pressure warning systems. Graham made this decision despite a direct conflict of interest: his Harvard think tank was funded by General Motors Corp., Ford Motor Co., Volvo Car Corp., and the Alliance of Automobile Manufacturers.

Another individual the SAB should steer clear of is Dr. Richard Belzer, an agricultural economist and, like Graham, a cost-benefit-analysis enthusiast, who worked in the OMB’s Office of Information and Regulatory Affairs (OIRA) from 1988 to 1998. In 2000, Belzer criticized the SAB’s role in peer reviewing the EPA’s evaluation of the costs and benefits of the Clean Air Act. Belzer and his co-author called the SAB’s reviews “ineffective” because, in their opinion, they couldn’t force the agency to change the direction of policy.

Belzer appears to misunderstand the purpose of the SAB, which is simply to advise the agency on its science. The EPA has the discretion to heed that advice and apply it to policies. SAB members are not decision-makers; they are esteemed scientists whose expertise is best suited to evaluating scientific considerations, not political ones. In 2010, Belzer participated in a panel on “The EPA’s Ambitious Regulatory Agenda” sponsored by the American Enterprise Institute, the description of which includes the erroneous statement that “all major EPA decisions are contentious.” According to his bio, his clients include ExxonMobil and the American Chemistry Council. And speaking of the American Chemistry Council…

Kimberly White, senior director of chemical products and technology at the ACC, is among those nominated to serve on the SAB. She was summoned by House Science Committee Chairman Lamar Smith to testify earlier this year at a hearing called “Making EPA Great Again,” where she spoke about the need to improve the SAB’s transparency and peer review methods and accused the EPA of being too involved in the SAB’s peer review process: “conversations that are happening in that peer review get stymied by [the] EPA’s input during the peer-review process so it’s not as independent as it should be.”

She also agreed when one member of Congress suggested that the SAB was not truly balanced and that there should be a devil’s advocate on the committee. Perhaps Dr. White wants to fill that very role. The problem, however, is that the American Chemistry Council and her previous employer, the American Petroleum Institute, actively work to spread disinformation about a range of scientific topics to thwart the EPA’s efforts to keep us safe. Dr. White has criticized an EPA assessment of formaldehyde, for example, claiming it wasn’t inclusive enough of the science. Formaldehyde is a known carcinogen, and thanks in large part to the ACC, the EPA’s emissions standard for wood products, set to take effect in December, has been delayed by at least four months.

Who Pruitt appoints to the fifteen open positions will be a test of whether he will continue seeking exclusive counsel from polluters. A handful of qualified scientists have served only one term and can easily be reappointed for a second, which is common practice for the board; for the sake of continuity, it would behoove Pruitt to keep those experts on. For the other positions, it would be in the agency’s best interest for Pruitt to choose a balanced roster of new members from the dozens of well-qualified scientists on the list, rather than stack the committee with people who have spent their careers working to undermine the mission of the EPA and weaken the policies that are supposed to keep us safe.

All members of the public can submit comments encouraging the EPA to appoint independent and qualified scientists as advisors. You have until Thursday, September 28th at 11:59pm to email your comment to Thomas Carpenter, the Designated Federal Officer of the SAB, at carpenter.thomas@epa.gov.

 

 

Illinois is Expanding Solar Access to Low-Income Communities—But It Didn’t Happen Without a Fight

Installing solar panels in Pennsylvania. Photo: used with permission from publicsource.org

When the Future Energy Jobs Act (FEJA) passed the Illinois General Assembly and was signed by Governor Rauner in early December last year, a key component of the legislation was to expand solar access for low-income communities. To get a sense of how the legislation came about, I caught up with Naomi Davis, president and founder of the Chicago-based non-profit Blacks in Green (BIG). She has been on the front lines of developing this innovative program and is excited to finally see it coming together.

Illinois Solar for All

The Illinois Solar for All Program, a key piece of FEJA, provides funding to train and employ residents of low-income and economically disadvantaged communities, residents returning from the criminal justice system, and foster care graduates, in the solar installation industry. It’s a comprehensive solar deployment and job training program that will open access to the solar economy for thousands of Illinois residents.

For Naomi Davis, who has been advocating for renewable energy in a variety of platforms since BIG’s founding 10 years ago, Solar for All is a dream come true.

“[Solar For All] means the realization of a fundamental aim of BIG, which is to build an earned income business model for our non-profit,” Davis says. “We are launching BIG SOLAR in partnership with Millennium Solar and SunSwarm and creating a social enterprise for education and outreach, household subscriptions, workforce training and placement, design, installation, and maintenance of systems – residential, commercial, industrial, and are also exploring the development of a light solar pv assembly facility in West Woodlawn.”

The Solar for All program is a solar deployment and job training initiative under FEJA.

The path to solar

The path to Solar for All hasn’t been easy. “Not talked about is the sausage-making chaos of building a market almost from scratch, and the incredibly detailed and exhaustive examination of details and scenarios required,” admits Davis. She recalls the camaraderie created when “folks who never talk to each other are huddled over time to understand the roles of the other and how to create economic harmony,” and notes “that tiny organizations like BIG have to carry an incredible weight to stay at that table and ensure the interests of our constituents are represented.”

Although many had a hand in the legislation’s success, she highlights communities of color as its unsung heroes. Her organization’s membership in the Chicago Environmental Justice Network was pivotal in having their needs considered. Among the organizations in the network is the Little Village Environmental Justice Organization (LVEJO).

Juliana Pino, policy director for LVEJO, made sure the direction and content of the Future Energy Jobs Act took the needs of their community into consideration. It’s through this work, Davis says, that many of the benefits to communities of color will now be realized.

Solar growth benefits communities

According to the Low Income Solar Policy Guide, the growth of solar in the United States provides a significant opportunity to address some of the greatest challenges faced by lower-income communities: the high cost of housing, unemployment, and pollution. Solar can provide long-term financial relief to families struggling with high and unpredictable energy costs, living-wage employment opportunities in an industry adding jobs at a rate of 20 percent per year, and a source of clean, local energy sited in communities that have been disproportionately impacted by fossil fuel power generation.

Davis points to Chris Williams, owner of Millennium Solar Electric, as exactly the kind of person this training should fund: a third-generation African American IBEW electrician, founder of the now-reviving South Suburban Renewable Energy Association, and a go-to ComEd solar youth educator. Training and education are key.

Still, the work is hardly over. In fact, it’s just begun.

“As with any industry poised for enormous market share – in this case, energy – strategic tech training is essential,” says Davis. “Not just African Americans historically discriminated against, but also coal region towns desperately need the re-education this legislation can provide. Market forces are already finding cheaper sources than coal and without public dollars. Coal towns across Illinois and around the country all need what Solar for All provides – a better way forward.”

Community partnerships

Under the Illinois Solar for All Program, developers of community solar projects need to identify partnerships with community stakeholders to determine location, development, and participation in the projects. Communities will play a pivotal role in this program, and continuing to build partnerships is critical to its success.

Thanks to the Illinois Solar for All Program, Illinois is poised to bring more solar power to homes, communities, places of faith, and schools in every part of the state.


Mesothelioma Awareness Day: Our Past Must Dictate the Future

It shouldn’t come as a surprise that asbestos isn’t good for you. The mineral is a known carcinogen and has been tied to thousands of deaths from mesothelioma, asbestosis, and other asbestos-related diseases. On average, close to 3,000 people in the United States are diagnosed with mesothelioma each year. And for those unfortunate enough to be diagnosed with this rare disease, the outlook is often poor: patients are usually given a grim prognosis averaging somewhere between 12 and 21 months.

Asbestos-related diseases are rarely quick to present themselves, often taking decades before symptoms finally show. When you breathe in or accidentally ingest the invisible fibers, they enter the lungs and may lodge themselves deep into the lung lining, known as the mesothelium. The area becomes irritated and over the years tumors begin to form. Mesothelioma is often difficult to diagnose, which means the resulting cancer is caught later and treatment options are more limited.

Breaking down barriers

Armed with that kind of information, one would assume it would be a slam dunk to phase out asbestos use in the United States. Unfortunately, that isn’t the case. Last year, roughly 340 tons of raw asbestos were imported into the US, primarily for use in the chlor-alkali industry, and some types of asbestos-containing materials can still be imported as well. The Environmental Protection Agency tried to ban asbestos use nearly three decades ago, but many of the rules the agency established were overturned by a court decision two years later. Today there is hope that things could change in the coming years, including renewed interest from the EPA.

In 2016, Congress approved the Frank R. Lautenberg Chemical Safety for the 21st Century Act, amending the 40-year-old Toxic Substances Control Act (TSCA) and giving the EPA more power to regulate dangerous chemicals as they are introduced, in an effort to more effectively remove those posing an unnecessary risk to public health. Chemicals deemed to pose an unreasonable risk during the evaluation process will be eliminated based on a safety standard, rather than the cost-benefit balancing standard used under the previous TSCA requirements. Under the old TSCA, finding an unreasonable risk required a cost-benefit analysis, and any restrictions had to be the least burdensome way of addressing the risk. Under the Lautenberg Act, the “least burdensome” requirement is removed, though the EPA still needs to take the costs of regulatory actions and feasible alternatives into consideration.

The amendment also requires the agency to perform ongoing evaluations of chemicals to determine their risk to public health. In December, asbestos was included on a list of ten priority chemicals slated for evaluation and a scoping document for the mineral was issued in June. Problem formulation documents for each of the first ten chemicals are expected in December.

Drowning in red tape

Despite what the Lautenberg Act does to unshackle the EPA and allow it to properly regulate chemicals, the White House and Congress have taken actions that run counter to it. In January, for example, President Donald Trump signed an executive order known as the “2-for-1 Order,” forcing agencies to remove two existing rules for every new one they create. The risk is that agencies like the EPA will have to pick which rules to enforce, creating a new set of public health concerns, and may be slower to react to new hazards because of the added budget calculus. While the order could help the agency identify overlapping rules, it creates the risk of money taking precedence over public health.

In addition, the Senate’s recently introduced Regulatory Accountability Act (RAA), known in some circles as the “License to Kill” bill, poses a similar set of issues. If passed, the RAA could resurrect much of the red tape that the Lautenberg Act removed, once again making it difficult to regulate or ban chemicals, whatever dangers they may pose. For example, the EPA would have to prove that a full asbestos ban is the best option available to the agency compared with any other, more cost-effective option. The bill would also allow anyone to challenge these decisions, which could delay a potential rule for years or even halt the process entirely.

The EPA is also constrained by the people who have been appointed to several high-level positions within the agency itself. As Oklahoma attorney general, Administrator Scott Pruitt sued the EPA 14 times, challenging rules he believed overstepped the agency’s authority. Deputy Assistant Administrator Nancy Beck, previously with the American Chemistry Council, lobbied for years against the very rules she has now sworn to uphold. In 2009, Beck was criticized in a House report for attempting to undermine and create uncertainty about the EPA’s chemical evaluations while serving in the Office of Management and Budget during the Bush administration. The latest person nominated for an EPA position is Michael Dourson, who has at times proposed much less protective standards for chemicals than those in use by the federal government.

Where we stand now 

This Mesothelioma Awareness Day, we find ourselves one step closer to seeing asbestos banned in the US. Today, while we honor those who’ve lost their struggle against this disease, we also show support for those still fighting mesothelioma and refusing to give in.

The EPA has, once again, taken the first steps toward a potential ban, but until that day comes, raising awareness remains a never-ending battle. Mesothelioma is a misunderstood disease, and asbestos isn’t something people think about at work or at home, which is why educating others is so important. Mesothelioma is largely avoidable, but remaining vigilant to prevent exposure is paramount.

Asbestos exposure isn’t something that will come to a screeching halt overnight. Hundreds of thousands of homes, buildings, and schools still harbor the mineral and that is likely to be the case for years to come. But stopping the flow of raw and imported asbestos into the US is a great first step to combating the issue at large.

About the author: Charles MacGregor is a health advocate specializing in education and awareness initiatives regarding mesothelioma and asbestos exposure. To follow along with the Mesothelioma Cancer Alliance and participate in a MAD Twitter chat on September 26, find them at @CancerAlliance

Rebuilding Puerto Rico’s Devastated Electricity System

Photo: endi.com

Over the last few days, I’ve been glued to social media, the phone, and ham radio-like apps trying to find out more about the fate of family members in the catastrophic situation in my native Puerto Rico following Hurricane María. (Fortunately, I was able to confirm on Friday that everyone in my immediate family is accounted for and safe).

My family is among the lucky few. My childhood home is a cement suburban dwelling built on well-drained hilly soils, some eight kilometers from the coast and well outside flood zones. But many of my 3.4 million fellow Puerto Ricans have not been so lucky and are experiencing, as I write this, catastrophic flooding. Further, tens of thousands had already been without electricity since Hurricane Irma downed many of the distribution lines. In addition, more than 170,000 people have been affected in the nearby US Virgin Islands and Dominica, Caribbean islands that have also suffered catastrophic damage.

In Levittown, the largest suburban community in Puerto Rico, on the north coast, hundreds had to be evacuated on short notice in the early hours of Thursday when the gates of the Lago La Plata reservoir were opened and the alarm sirens failed to warn the population. The next day, a truly dramatic emergency evacuation followed as the Guajataca Dam in the northwest failed and 70,000 people were urged to leave the area. At least ten deaths have been confirmed so far.

The government of the Commonwealth has mounted a commendable response, but it has been hampered by the lack of power and communications facilities, which are inoperable at the moment except for those persons, agencies, and telephone companies that have generators and the fuel to keep them running. This has been one of the main impediments both for Puerto Ricans abroad trying to communicate with loved ones and for the Rosselló administration’s efforts to establish communications and coordination with the many towns that remain unaccounted for.

Chronic underinvestment and neglect of energy infrastructure increases human vulnerability to extreme weather

Why has Puerto Rico’s energy infrastructure proven so vulnerable in recent weeks? The ferocity of Irma and María would stretch the capacity of even well-funded and well-maintained energy production and distribution systems. In Florida—where the power grid had received billions in upgrades over the last decade—Irma still left two-thirds of the population without power (though the state was able to bounce back within a few weeks).

But years of severe infrastructure underinvestment by the Puerto Rico Electric Power Authority (PREPA) have led to a fragile system that completely collapsed after these two hurricanes. Irma’s indirect hit damaged distribution lines but not generation; María’s eye made landfall on the southeast and exited through the central north, putting it right on the path of four of the high-capacity plants that burn heavy fuel oil and diesel. These plants are also located close to, or within, flood zones.

The reconstruction of Puerto Rico’s power infrastructure is a monumental task, and it is critical to the well-being of Puerto Ricans. More than 3.4 million US citizens are now in a life-threatening situation, and getting electricity up and running in the near term is critically important to support rescue and recovery efforts.

Wherever possible, these immediate efforts should align with a broader rebuilding mission that points Puerto Rico toward a more economically robust and climate-resilient future, not repairs that repeat the mistakes of the past. Rebuilding must also address the extreme weather and climate vulnerability that Puerto Rico is so brutally facing right now.

There is also a great need to alleviate the high cost of energy in Puerto Rico: electricity prices for all sectors (residential, commercial, and industrial) are much higher there than in the mainland United States. Reliance on imported fossil fuels for generation is one driver of the high cost: in 2016, nearly half of energy production came from petroleum, nearly one-third from natural gas, and 17 percent from coal. Only 2 percent came from renewables.

While there is quite a bit of clean energy momentum in the United States, that impetus has not been transferred to Puerto Rico, for many reasons, including lack of support from PREPA. But Puerto Rico has strong solar and wind resource potential, and renewable energy has been proposed as a way to help PREPA pare down its $9 billion debt, reduce reliance on fossil fuels and exposure to fuel price volatility, lower costs to consumers, and contribute to an economic recovery for the Commonwealth.

This unprecedented catastrophe affecting millions of US citizens requires the intervention of the federal government

To ensure a safe and just economic recovery for Puerto Rico, Congress and the administration need to commit resources to help the territory recover. President Trump has declared Puerto Rico a disaster zone, and FEMA director Brock Long will visit the island on Monday. The priority right now is to save lives and restore basic services. To aid these efforts, Congress and the Trump administration should:

  • Direct the Department of Defense to provide helicopters and other emergency and rescue resources to Puerto Rico.
  • Provide an emergency spending package to the US territory.
  • Increase the FEMA funding level for debris removal and emergency protective measures in Puerto Rico.
  • Temporarily suspend the Jones Act. The Jones Act, which requires that cargo shipped between US ports, including those in its territories, be carried on US-flagged vessels, significantly increases the cost of importing goods to the island.

Once the state of emergency ends, Governor Rosselló needs to be very vocal that Puerto Rico’s energy infrastructure reconstruction should put the Puerto Rican people and economy on a path to prosperity and resilience to climate impacts. The 2017 hurricane season is not over yet, and the situation in Puerto Rico right now is catastrophic. Decisions about energy infrastructure will be made in the coming days, weeks, and months. Those decisions need to take into account both the short- and long-term needs of the Puerto Rican population and help make Puerto Rico more resilient to the massive climate and extreme weather dislocations we are facing.

Want to help?


Science Triumphs Over Disinformation in Initial Flame Retardant Victory

In a stunning victory for consumer safety and a powerful display of the ability of independent science to spur policy change, the Consumer Product Safety Commission (CPSC) voted this week to ban a class of chemicals, additive organohalogen flame retardants (OFRs), that are present in many consumer products. Last week, I was one of many individuals who testified before the CPSC, urging the body to grant a petition to ban organohalogen flame retardants as a class from four categories of consumer products: mattresses, children’s products, furniture, and electronics casings.

Of the 31 individuals who testified last week, only two advised the CPSC not to ban OFRs: representatives from the American Chemistry Council (ACC) and the Information Technology Industry Council. As Commissioner Marietta Robinson pointed out during the hearing, the only comments opposing the ban “represent those with a financial interest in continuing to have these potentially toxic, and some of them definitively, toxic, chemicals in our environment.” She also noted that the presentations by those opposed to the petition were not transparent and relied on materials about chemicals irrelevant to the petition, a stark contrast to the numerous scientists and scholars whose heavily footnoted statements provided evidence supporting the well-bounded petition.

Scientific information trumps corporate disinformation

Commissioner Robert Adler, who submitted the motion to grant the petition, compared the chemical industry’s talking points at the hearing on why OFRs should not be banned to the tobacco industry’s denial of the health impacts of smoking. His statement read, “if we took the tobacco industry’s word on cigarette safety, we would still be waiting. Similarly, we have waited for years for our friends the chemical industry to provide us with credible evidence that there are safe OFRs. I have little doubt that we will still be waiting for many years, to no avail.” Sadly, he’s probably right.

We have seen this trend time and time again. Whether it was the tobacco industry, the asbestos industry, the sugar industry, the PCB industry, the agrochemical industry, the pharmaceutical industry, or the oil and gas industry, corporate bad actors have known about the risks of their products and have chosen not to act to protect the public for years, sometimes decades. Not only do they deny that there is harm, but they actively push for policies that allow them to conceal the truth for even longer. As Oxford University’s Henry Shue wrote about fossil fuel companies like Exxon in a recent Climatic Change article, these “companies knowingly violated the most basic principle of ‘do no harm.’” It is unethical and unacceptable that the public is not afforded the information we deserve about the harms of products we are exposed to every day in the air we breathe, the water we drink, the food we eat, and everything in between.

A 2008 EPA literature review on polybrominated diphenyl ethers, one type of OFR, found that 80 percent of total exposure to the chemical by the general population is through ingestion and absorption of house dust containing these chemicals. (Photo: Flickr/Tracy Ducasse)

Case in point: the ACC’s statement after the CPSC’s vote stuck to its talking points, pivoting from whether OFRs are safe to whether they reduce fire risk. During the hearing, the ACC representative argued that the petition was overly broad and that there was insufficient data on each OFR to ban them as a class. Yet when asked by Commissioners for evidence that certain OFRs did not cause harm, he was unable to point to a specific chemical or cite relevant research. At a certain point, there is no place left to pivot when the facts are stacked against you.

Dust is something I never gave much thought to growing up. If anything, “dusting” was always my favorite chore when faced with the options of vacuuming or washing the dishes. I never really gave much thought to what that elusive substance was composed of. I certainly wouldn’t have guessed that within those seemingly innocuous dust bunnies hiding behind bookshelves were a mix of chemicals that could impact my health. Dusting has taken on new meaning for me since conducting research on flame retardants.

For decades now, consumers have been left powerless and at the whim of manufacturers who have decided for us what chemicals go into our homes and end up in our dust.

The result? Most Americans have at least one type of flame retardant present in our blood, young children have higher levels than their mothers, and children of color and those from low income communities bear disproportionately high levels of these chemicals in addition to a host of other chemical burdens.

Shue writes,

To leave our descendants a livable world is not an act of kindness, generosity, or benevolence…it is merely the honoring of a basic general, negative responsibility not to allow our own pursuits to undercut the pre-conditions for decent societies in the future.

This ban is long overdue. Moving away from these chemicals toward safer alternatives is a win for all, this generation and the next.

Product safety is not a political issue

During the vote, Commissioner Adler said that he holds strong to the belief that “product safety is not a partisan issue and should never be politicized,” responding to a statement from one of the two Republican Commissioners that granting the petition on a party-line vote would turn the issue into a political football. Commissioner Robinson defended Adler, stating that she was “absolutely flummoxed” and had “absolutely no clue what the science of this petition and these flame retardants has to do with whether you’re a Democrat or Republican nor what it has to do with my term being potentially up.” Granting a petition rooted in rigorous science is not a political action. Obstructing this science-based rulemaking process, however, would be.

While the CPSC has voted to begin the rulemaking process to ban OFRs under the Federal Hazardous Substances Act and to convene a Chronic Hazard Advisory Panel, the Commission’s composition will shift as Marietta Robinson’s term ends in September. It is possible that this scientific issue could become politicized once President Trump nominates a Republican to join the CPSC and retake the majority. In fact, Chairwoman Buerkle has even suggested that the ban be overruled once Republicans regain the majority. President Trump intends to nominate corporate lawyer Dana Baiocco, who has defended companies facing charges over the safety and misleading advertising of consumer and industrial products and medical devices.

We urge the Commission to continue the progress begun during yesterday’s vote to educate the public about the risks of OFRs and to create the policy that will ban these chemicals in consumer products for good. Let’s let science, not politics, have the final word. Our children will thank us someday.

 

 


Puffins, Politics, and Joyful Doggedness in Maine

Puffins were nearly extinct in Maine in the early 1900s, hunted for their eggs and meat. Their re-introduction to Eastern Egg Rock in Maine in the 1970s became the world's first successful restoration of a seabird to an island where humans killed it off. Photo: Derrick Jackson

Eastern Egg Rock, Maine — Under bejeweled blackness, the lacy string of the Milky Way was gloriously sliced by the International Space Station, the brightest object in the sky. Matthew Dickey, a 21-year-old wildlife and fisheries senior at Texas A&M, grabbed a powerful bird scope and was able to find the space station before it went over the horizon. He shouted: “I think I can make out the shape of the cylinder!”

The space station gone, Dickey and four other young bird researchers settled back down around a campfire fueled with wood from old bird blinds that had been blown out of their misery by a recent storm.

They were alone six miles out to sea on a treeless six-acre jumble of boulders and bramble.

44 years of Project Puffin

On this seemingly inconspicuous speck in Maine waters, a man once as young as they were, Steve Kress, began restoring puffins. He was part of the world’s first successful effort to restore a seabird to an island where they had been killed off by human activity. The experiment began in the spring of 1973 by bringing 10-day-old chicks down from Newfoundland, feeding them to fledging size by fall, and hoping that after two or three years out at sea, they would remember Maine and not Canada, where decades of management have maintained a population of about 500,000 pairs.

Tonight it was a celebratory fire, flickering off faces with crescent smiles. Besides Dickey, there was team supervisor Laura Brazier, a 26-year-old science and biology graduate of Loyola University in Maryland with a master’s degree in wildlife conservation from the University of Dublin in Ireland. There was Alyssa Eby, 24, an environmental biology graduate of the University of Manitoba; Jessie Tutterow, 31, a biology graduate of Guilford College; and Alicia Aztorga-Ornelas, 29, a biology graduate from the Universidad Autonoma de Baja California, Mexico.

In the two days prior, their routine count of burrows with breeding pairs of puffins surpassed the all-time record. The previous mark was 150, set last year. During my four-night stay with them in late July, the count rose from 147 to 157. The summer would end with 173 pairs.

“We did it. We are awesome. You guys are awesome,” Brazier said. “Puffins are cool enough. To know we set a new record and we’re part of puffin history is incredible.”

As the fire roared on, celebration became contemplation. As full of themselves as they had a right to be, they know their record is fragile. From a low of no more than four puffins left in Maine in 1902, decimated by coastal dwellers for eggs and meat, Kress and 600 interns over the 44 years of Project Puffin have nursed the numbers back to 1,300 pairs on three islands. The techniques used in the project—including the translocation of chicks and the use of decoys, mirrors, and broadcast bird sounds to make birds think they had company—have helped save about 50 other species of birds from Maine to Japan and China. (I have the distinct pleasure of being Kress’s co-author on the story of his quest, “Project Puffin: The Improbable Quest to Bring a Beloved Seabird Back to Egg Rock,” published in 2015 by Yale University Press.)

Interns (Left to right) Alyssa Eby, Matthew Dickey, Alicia Aztorga-Ornelas, and Eastern Egg Rock Supervisor Laura Brazier hold an adult puffin they banded. Also on the team but not pictured is Jessie Tutterow.

In the crosshairs of American politics

But in the last decade, the Atlantic puffin, which breeds in an arc up from Maine and Canada over to Iceland, Scandinavia, and the United Kingdom, has become a signal species of fisheries management and climate change.

On the positive side, Maine puffins are bringing their chicks native fish, such as haddock and Acadian redfish, that have rebounded under strict US federal rules. On the negative side, the last decade has also brought the warmest waters ever recorded in the Gulf of Maine. A study published in April by researchers from the National Oceanic and Atmospheric Administration (NOAA) predicts that several current key species of fish “may not remain in these waters under continued warming.” Last month, researchers from the University of Maine, the Gulf of Maine Research Institute, NOAA, and others published a study in the journal Elementa finding that longer summers in the Gulf of Maine may have major implications for everything from marine life below the surface to fueling hurricanes in the sky.

For puffins, there already is significant evidence that in the warmest years, the puffin’s preferred cold-water prey like herring and hake are forced farther out to sea, while some of the fish that come up from the mid-Atlantic, such as butterfish, are too big and oval for small puffin chicks to eat. The new fish volatility is such that while puffins thrived last year on tiny Eastern Egg Rock, their counterparts could not find fish off the biggest puffin island in the Gulf of Maine, Canadian-administered Machias Seal Island. Last year brought a near-total breeding failure among its 5,500 pairs of puffins.

The Atlantic puffin, from Maine to the United Kingdom, has rapidly become a signal bird for climate change via the fish the parents attempt to bring to chicks. The Gulf of Maine is among the fastest-warming waters in the world, and as a result, more puffins are bringing in more southerly species, such as the butterfish pictured here. Butterfish are too large and oval for chicks to eat, leading to starvation. Photo: Derrick Jackson

In the European part of the Atlantic puffin’s range, warmer water displacing prey, overfishing, and pollution have hammered breeding success. According to an article this year in the journal Conservation Letters, co-authored by Andy Rosenberg, the director of the Center for Science and Democracy at the Union of Concerned Scientists and a former regional fisheries director for the National Oceanic and Atmospheric Administration, the north Atlantic realm of the puffin is one of the most over-exploited fisheries in the world, as evidenced by the crash of several fisheries, most notably cod.

On the Norwegian island of Rost, for instance, the 1.5 million breeding pairs of puffins of four decades ago were down to 289,000 in 2015. A key reason appears to be voracious mackerel moving northward, gobbling up the puffin’s herring. Even though there are an estimated 9.5 million to 11.6 million puffins on the other side of the Atlantic for now, BirdLife International two years ago raised the extinction threat for puffins from “least concern” to “vulnerable.”

Much of that was on the minds of the Egg Rock interns, because the very puffins they were counting are in the crosshairs of American politics.

Incessant attacks on environmental accomplishments

Puffins are on land only four months a year to breed, so Kress and his team a few years ago put geolocators on some birds to see where they migrate during their eight months at sea. Two years ago, the team announced that in the fall and early winter, many Maine puffins go north to the mouth of the St. Lawrence River. In late winter and early spring, they come south to forage in fish-rich deep water far south of Cape Cod. That area of ocean is so untouched by human plunder that the corals in the deep are as colorful as any on a Caribbean reef.

The Obama administration was impressed enough to designate the area as the Northeast Canyons and Seamounts National Marine Monument, protected from commercial exploitation. While vast areas of the Pacific Ocean under US jurisdiction earned monument status under Presidents Obama and George W. Bush, the canyons are the first US waters in the Atlantic to be so protected.

Yet President Trump, as part of his incessant attack on his predecessor’s environmental accomplishments, ordered Interior Secretary Ryan Zinke to review Obama’s monument designations for possible reversal. Even though the Canyons and Seamounts account for a tiny fraction of New England’s heavily depleted waters, the fishing lobby bitterly opposed monument status. This week, the Washington Post reported that Zinke has recommended that the Canyons and Seamounts be opened to commercial fishing.

The researchers on Egg Rock mused around the fire over the concerted attempt, led by the Republican Party and often aided by Democrats in top fossil-fuel production states, to roll back environmental protections for everything from coral to coal ash and broadly discredit science in everything from seabird protections to renewable energy. Some of the divisions of NOAA that are directly involved in studying waters like the Gulf of Maine are targeted for massive budget cuts by the Trump administration.

Maine’s puffins are direct beneficiaries of strict federal fishing management since the 1970s. In recent years, puffins have supplemented their traditional diet of herring and hake with species that have rebounded in the Gulf of Maine, such as the haddock pictured here. Photo: Derrick Jackson

Fighting against a stacked deck

“It’s funny how in the business world and the stock market, no one questions the numbers and facts,” said Brazier, who marched in April’s March for Science in Washington, DC. “They’re taken as facts and then people use them to decide what to do. But now it’s ok to question science.”

“I think it’s because if you can deny science, you can deny what needs to be done,” Eby said. “It’s too hard for a lot of people in rich countries to get their heads around the fact that if we’re going to deal with climate change, we’re going to have to change the way we live and the way we use energy. That’s so hard, a lot of people would rather find ways to skip the science and live in their world without thinking about the consequences.”

Tutterow, who hails from North Carolina, where the General Assembly in 2012 famously banned state use of a 100-year-projection of a 39-inch sea-level rise, added, “If I was offered a state or federal job, I’d take it. I’d like to believe there’s a lot of career professionals who work hard to get the job done. But it used to be the main thing you worried about was red tape. Now you have to worry about censorship.”

Dickey said simply, “Sometimes it feels like the deck is stacked against us. But we just have to keep working as hard as we can until someone realizes we’re just trying to deliver facts to help the world.”

Puffins in Maine breed in burrows that wind crazily underneath boulders that rim their islands. That tests the ability of interns to reach for chicks to band for future study. Photo: Derrick Jackson

Joyful doggedness

The stacked deck is unfair, given the joyful doggedness displayed by this crew. On two days, I followed them around the perimeter of Egg Rock as they wrenched their bodies to “grub” under the boulders, contorting until they could reach an arm into the darkness to grab puffin chicks to band for research.

The simple act of banding has revealed the puffin’s extremely high site fidelity: birds come back to the same island and burrow year after year despite migrating hundreds of miles away. One Project Puffin bird was in the running for the oldest known puffin in the world, making it to 35 before disappearing in 2013. A Norwegian puffin made it to 41 before being found dead.

On other Atlantic puffin islands, the birds can nest in shallower cavities of rocks and mounds in grassy cliffs, within wrist and elbow reach. Researchers on those islands are able to band scores of puffin chicks and adults.

But the massive size of the jagged boulders on Eastern Egg Rock makes it so difficult to grub that the summer record was only 14. On my visit, the crew went from 9 to 17 chicks, with Brazier constantly saying, “Oh no, we’re not giving up. We got this. The next crew’s going to have to work hard to beat us.”

No face was brighter than Aztorga-Ornelas’ when she took an adult puffin they banded and lowered it between her legs like a basketball player making an underhanded free throw. She lifted up the bird and let it go to fly back to the ocean to get more fish for its chicks. “I’ll never forget that for the rest of my life,” she said.

On another day, with the same enthusiasm displayed for puffins, they grubbed for another member of the auk family, the black guillemot. At one point, they caught four chicks in separate burrows within seconds of each other. They gleefully posed with birds for photographs.

“I wish people could feel why I’m in this,” Tutterow said. She talked about a prior wolf study project in Minnesota. “We tracked in the snow what we thought was one wolf,” she said. “Then, at a junction, what we thought was one single wolf, the tracks split into five different sets of tracks. Your jaw drops at the ability of these animals to perfectly follow each other to disguise the pack.”

Eastern Egg Rock went from the 1880s to 1977 with no resident puffins. This year, the number of breeding pairs hit a record 173. Where there were no more than four birds left in the entire state of Maine in 1902, there are 1,300 pairs today. Photo: Derrick Jackson

Getting it right

My jaw dropped at how bird science is making world travelers out of this crew beyond Egg Rock. Brazier has worked with African penguins in South Africa, loggerhead turtles in Greece, snowshoe hares in the Yukon, and this fall is headed to Midway Atoll for habitat restoration in key grounds for albatross.

Eby has worked with foxes in Churchill, Manitoba; oystercatchers, murres, auklets, gulls, and petrels in Alaska; and ducks in Nebraska. Besides wolves, Tutterow has helped manage tropicbirds and shearwaters in the Bahamas, honeybees and freshwater fish in North Carolina, loons in the Adirondacks, and wolves in Minnesota. Aztorga-Ornelas has worked with oystercatchers and auklets on Mexican islands and Dickey has helped restore bobwhite, quail, deer, and wild turkey habitat in Texas.

Brazier said a huge reason she helped rehabilitate injured endangered African penguins in South Africa was because of her experience tending to them in college at the Maryland Zoo. “I actually didn’t get the section of the zoo I applied for,” she said. “I got the African penguin exhibit and when all these little fellas were running around my feet, it was the best day of my life.”

Though he is the youngest of the crew, Dickey said his quail and bobwhite work gave him self-sufficiency beyond his years. “My boss lived two miles away and my tractor had a flat four times. It was on me to fix it and I figured it out, even though it was hotter than hell every day, sometimes 110.”

Tutterow, the oldest, originally earned a bachelor’s degree in nursing at Appalachian State University, but found far more satisfaction in an outdoor career. Among her fondest childhood memories was her parents allowing her to wander in the local woods to read at a spot on a rock by a creek. “You can build a lifestyle around any amount of income, but you cannot build happiness into every lifestyle,” she said. “Working with these animals, I’m building happiness for them and me.”

No myopic set of politics and denial of science should ever get in the way of this level of career happiness. Aztorga-Ornelas and I, despite her limited English and my stunted Spanish, shared a dozen “Wows!” sitting together in a bird blind, watching puffins zoom ashore with fish.

Eby said, “It’s strange for me. We just came out of a conservative government in Canada (under former Prime Minister Stephen Harper) where they stopped lake research for acid rain, fisheries, and climate change and government scientists did not feel the freedom to speak out. And now that we’re getting more freedom, I’m here. I hope the US can get it right soon.”

 

What’s My State Doing About Solar and Wind? New Rainbow Graphic Lets You Know

[With costs dropping and scale climbing, wind and solar have been going great guns in recent years. Shannon Wojcik, one of the Stanford University Schneider Sustainable Energy Fellows we’ve been lucky enough to have had with us this summer, worked to capture that movement for your state and its 49 partners. Here’s Shannon’s graphic, and her thoughts about it.]

Do you ever wonder how much energy those rooftop solar panels in your state are contributing to renewable energy in our country? How about the wind turbines you see off the highway?

Our new “rainbow mountain” graphic lets you see your state’s piece of solar and wind’s quickly growing contribution to the US electricity mix. It shows how much of our electricity has come from wind and solar each month for the last 16 years. Just click on your state in the graph’s legend or roll your mouse over the graphic to see what’s been happening where you live.


At first glance, this graphic looks like a disorderly rainbow mountain range. Keep staring though (try not to be mesmerized by the colors) and you can start to see patterns.

Both the peaks and the dips in the mountain range follow a pattern. The peaks, where the most electricity is supplied by wind and solar, come in spring, when demand (the denominator) is lower due to moderate temperatures and generation (the numerator) is high due to windy and sunny days. The crevasses, in July and August, happen because demand for electricity is high at those times thanks to air conditioning, increasing the overall load on the US grid and driving up our calculation’s denominator. If you were to look just at monthly generation of wind and solar, this variation would be smaller.
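To make the denominator effect concrete, here is a minimal sketch of the ratio the graphic plots. The numbers are hypothetical round figures chosen for illustration, not actual EIA data: the point is that identical wind and solar output yields a smaller share in a high-demand month.

```python
def wind_solar_share(wind_solar_mwh: float, total_demand_mwh: float) -> float:
    """Percent of total electricity demand met by wind and solar."""
    return 100.0 * wind_solar_mwh / total_demand_mwh

# Same renewable output in both months (hypothetical figures)...
april_share = wind_solar_share(30_000, 300_000)   # mild spring month, low demand
august_share = wind_solar_share(30_000, 400_000)  # AC-driven summer peak demand

print(round(april_share, 1))   # 10.0
print(round(august_share, 1))  # 7.5
```

The spring "peak" and the July–August "crevasse" fall out of the arithmetic alone, before any seasonal change in wind or sun is considered.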

Another, much more obvious thing about the mountains is that they’re getting taller. In fact, we passed a notable milestone in March of 2017, when, for the first time, wind and solar supplied 10% of the entire US electricity demand over the month. In 2012, solar and wind had only reached 4.6% of total US generation, so the recent peak meant more than a doubling in just 5 years.

That’s momentum.

Climbers and crawlers

Being able to see the different states lets you see where the action is on wind and solar—which are the climbers and which are the crawlers.

You know the saying about how everything is bigger in Texas? Well, that certainly holds true here. Texas is the bedrock of this mountain range, never supplying less than 14% of the wind and solar for the entire US after 2001, and supplying as much as 35% in some months. Texas hosts the largest wind generation of any state and doesn’t seem to be in danger of losing that title anytime soon.

California is another crucial state in this mountain range, and has been from the beginning. A trendsetter, California was building solar and wind farms years before the other states; in 2001, it was supplying up to 75% of all the wind and solar electricity in the US. California is still the second-largest supplier of wind and solar.

Other notable states that are building this solar and wind mountain are Oklahoma, Iowa, Kansas, Illinois, Minnesota, Colorado, North Dakota, Arizona, and North Carolina. Most of these states are rising up due to wind, but Arizona and North Carolina, along with California, are leading with solar.

Not all states with strong solar and wind performances by some metrics show up here. South Dakota is #2 for wind as a fraction of its own generation, though on this graphic it’s barely visible.

What does this mean?

This graphic shows that the momentum of solar and wind growth in the United States is undeniable. It can be seen on rooftops, in windy valleys and on windy plains, and even in states where coal has been king. All 50 states are involved: every one generates at least some electricity with wind and solar.

There are many ways for your state to increase its overall percentage. It can either decrease its denominator with energy efficiency or increase its numerator with wind and solar installations.

Not satisfied with where your state shows up on this graph? Check out what more your state can do.

Free Lunches in New York City Public Schools Are a Win for Kids—and Technology

Photo: USDA

It’s so good to share good news.

This month, the New York City Public Schools announced that, starting with the current school year, all students can receive free lunch with no questions asked. That means less stigma for kids facing food insecurity, less worrying for families, and less paperwork for school districts. And it might surprise you to learn that at the heart of this victory—carried across the finish line by a group of dedicated advocates—is a fairly common application of technology.

The underlying policy at play here is called the “Community Eligibility Provision,” or CEP. It was authorized by the Healthy, Hunger-Free Kids Act of 2010 to help schools and local educational agencies with a high percentage of low-income students. As a colleague wrote on this blog in 2016, CEP helps school systems (like New York City Public Schools) reduce paperwork and poverty stigma while making sure that free and reduced-price meals are available to all kids who might need them. Instead of asking each family to fill out an application, CEP allows schools to determine student eligibility through household participation in programs like SNAP (the Supplemental Nutrition Assistance Program, commonly referred to as food stamps) and TANF (the Temporary Assistance for Needy Families program). If over 40 percent of students are deemed eligible, schools receive additional federal reimbursement dollars to cover free meals for more students beyond those who qualify, ensuring that even those whose families are not enrolled in federal assistance programs can still get meals if they need them.

So how is New York City able to cover free meals for all students?

Here’s the math answer: the CEP multiplier is 1.6, which means that if 50 percent of students at School X are eligible for free meals, School X can actually serve free meals to (50 percent) * (1.6) = 80 percent of students using federal reimbursement dollars. If New York City Public Schools are now receiving federal reimbursement for 100 percent of students, it would mean they have demonstrated that at least (100 percent) / (1.6) = 62.5 percent of students are eligible through CEP.
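The arithmetic above can be sketched as a tiny function. The 1.6 multiplier and the 80 percent and 62.5 percent figures come straight from the text; the function name and the cap at 100 percent of enrollment are my framing.

```python
CEP_MULTIPLIER = 1.6  # federal multiplier cited in the text

def cep_covered_share(identified_pct: float) -> float:
    """Percent of students whose free meals are federally reimbursed,
    given the percent identified (directly certified) under CEP.
    Coverage cannot exceed 100 percent of enrollment."""
    return min(identified_pct * CEP_MULTIPLIER, 100.0)

print(cep_covered_share(50.0))  # 80.0 -- the School X example
print(100.0 / CEP_MULTIPLIER)   # 62.5 -- identified share needed for full coverage
```

Working the formula backward, as the article does, shows why universal free lunch implies New York identified at least 62.5 percent of its students through data matching.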

Which brings us to the real-world answer: New York is able to cover free meals for all students because it got smart about its use of technology to better reflect true student need. The New York Department of Education website describes the new data matching engine it has developed to identify eligible students:

“This new matching system provides a more efficient and accurate process for matching students across a range of forms that families already complete. This new matching process yielded an increase in the number of students directly certified – or matched to another government program – and increased the direct certification rate, allowing the City to qualify for the highest level of reimbursement in the federal CEP program. The number of families living in poverty has not increased; the changes to the matching process allow the City to better identify families.”

Why the technology matters

I know what you’re thinking. It’s awesome that all kids in New York City Public Schools can eat for free! But why make such a big deal about this technology? It doesn’t seem like rocket science.

Bingo.

New York City Public Schools is not using a particle accelerator to improve data matching among students. They haven’t even used a 3-D printer. The data integration and management systems they’re employing, while complex, are actually fairly commonplace. It’s the same sort of technology banks use to combine different databases of credit scores and application information to make credit offers, which is the same technology Netflix uses to deduce that because you watched Good Burger, you might like Cool Runnings. (Hypothetically speaking.)

Yet when it comes to the use of technology in the administration of nutrition assistance programs, we have fallen remarkably behind. The transition from actual paper food stamps to electronic benefit cards officially concluded in 2004, nearly fifty years after the introduction of the first major credit card. Even now, some states (looking at you, Wyoming!) require SNAP applications to be faxed, mailed, or returned in person.

To be clear, I’m not claiming technology is a silver bullet. For one, implementing new technology often comes with a price tag—and a steep learning curve. (Just ask Kentucky.) In particular, the use of data matching raises ethical concerns related to privacy and security, and these are not to be overlooked. But in many cases, these are arguments to improve, rather than disregard, the technology and the policies that guide its use. Because when our public assistance programs fall behind, so do the people who rely on them, and so does our ability to deliver maximum public benefit with increasingly limited resources. It is critical (and just plain sensible) to use the tools at our disposal to help realize the potential of current technological systems to enhance the strength and efficiency of the federal safety net. 

Carrying the momentum in the 2018 farm bill

Keep an eye on this issue. There is reason to suspect that the advancement of technology in public assistance programs will be addressed in the 2018 farm bill, and even reason to hope for a bipartisan effort. In fact, I’ll take the opportunity to quote Glenn Thompson, chairman of the House Agriculture Nutrition Subcommittee, who opened a June hearing on SNAP technology and modernization with this sentiment: “We need to get the policy right. As we approach the upcoming farm bill, it is critical we understand opportunities to amend and improve the program to properly account for the changes that come with our evolving, technological world.”

Bringing Down the House: A Hostile Takeover of Science-Based Policymaking by Trump Appointees

The Trump administration is slowly filling positions below the cabinet officer level in the “mission agencies” of the federal government (e.g., EPA, NOAA, Interior, and DOE), whose job it is to implement a specific set of statutory mandates. The appointed individuals are leading day-to-day decision-making on policies from public health and safety to environmental protection to critical applied science programs. In other words, the decisions these appointees make will affect everyone in the country.

The job of the agencies and their political leadership is to represent the public interest. It is not to serve the private interests of particular industries and companies, or even to push political viewpoints, but to implement legislative mandates in the interest of the American public. After all, who else but government can do this? Our laws call for the water and air to be clean, our workers and communities to be safe, our environment to be healthy and our science to be robust and fundamental to better policy and decision-making. That is what mission agencies are tasked to do.

So, what have we seen so far? To be sure, the Administration has nominated and appointed some qualified individuals with good experience and few apparent conflicts of interest. But unfortunately, that is not the norm. In my mind, most of the key appointments with responsibility for science-based policymaking fall into three categories:

  • The conflicted: Individuals who have spent a significant part of their careers lobbying the agencies they are now appointed to lead for policies that benefit specific industries or companies—and who will likely do so again once they leave the government. These individuals have a conflict of interest because of these connections. Despite President Trump’s call to “drain the swamp,” these appointees are well-adapted, key species in that very swamp (sorry, my ecologist background showing through).
  • The opposed: Individuals who have spent much of their careers arguing against the very mission of the agencies they now lead. This group is not entirely separate from the first, because often they made those arguments on behalf of corporate clients pushing for less accountability to or oversight from the American public. But further, they have opposed the very role played by the federal agencies they are appointed to serve. While they may have conflicts of interest as in (1), they also have an expressed anti-agency agenda that strongly suggests they will work to undermine the agency’s mission.
  • The unqualified: Individuals who are wholly unqualified because they lack the experience, training, or credentials requisite for the job. Again, these appointees may also have conflicts of interest, and political agendas opposed to the missions of the agencies, but they also have no real place leading a complex organization that requires specific expertise.

With more than 4,000 possible political appointments to federal agencies, I of course cannot cover them all. In fact, scanning through the list of the roughly 600 appointments requiring Senate confirmation, fewer than one-third have even been nominated for Senate action. But here is a disturbing set of nominees and appointments that undermine science-based policymaking.

The conflicted

William Wehrum is a lawyer and lobbyist nominated to lead the EPA Office of Air and Radiation (OAR). He previously worked at EPA during the G.W. Bush Administration. UCS opposed his nomination then. Mr. Wehrum’s corporate clients include Koch Industries, the American Fuel and Petrochemical Manufacturers, and others in the auto and petrochemical industries. He has been a vocal spokesperson against addressing climate change under the Clean Air Act, which would be part of his responsibility as OAR director. While he has advocated for devolving more authority to the states for addressing air pollution generally, he also opposed granting California a waiver under the Clean Air Act to regulate greenhouse gas emissions from vehicles. Mr. Wehrum has also been directly involved, both as a lobbyist for industry and during his previous stint at EPA, in efforts to subvert the science concerning mercury pollution from power plants, restrictions on industrial emissions, as well as lead, soot and regional haze regulations.

Dr. Michael Dourson has been nominated to be EPA Assistant Administrator for Chemical Safety and Pollution Prevention. He is well known by the chemical industry, having spent years working as a toxicologist for hire for industries from tobacco to pesticides and other chemicals. Dr. Dourson has argued that the pesticide chlorpyrifos is safe despite a large body of science to the contrary. He has advocated for the continued use of a toxic industrial chemical called TCE, which the EPA determined was carcinogenic to humans by all routes of exposure. [TCE was the chemical linked to leukemia in children in the 1998 film “A Civil Action.”] When asked about his controversial chemical risk assessment company, TERA, receiving funding from chemical companies, Dourson responded: “Jesus hung out with prostitutes and tax collectors. He had dinner with them.”

Dr. Nancy Beck, appointed to the position of EPA Deputy Assistant Administrator, now leads the agency’s effort to implement the Lautenberg Chemical Safety Act, which was signed into law last year. Dr. Beck was previously senior staff with the American Chemistry Council, the trade organization that worked very hard for years to weaken the rules protecting the public from toxic chemicals. The result? The new EPA rules are far weaker than those developed by the agency’s professional staff and remarkably similar to the position the industry favored, dismissing input from other members of the public and from organizations including UCS. Previously, Dr. Beck worked in the G.W. Bush Administration at the Office of Management and Budget. During that part of her career, Dr. Beck was called out by the U.S. House Science and Technology Committee for attempting to undermine EPA’s assessment of toxic chemicals, and her draft guidance on chemical safety evaluations was called “fundamentally flawed” by the National Academy of Sciences.

Lest you think that the conflicted are all at EPA, consider David Zatezalo, nominated to be Assistant Secretary of Labor for Mine Health and Safety. He was formerly the chairman of Rhino Resources, a Kentucky coal company that received two letters from the Mine Safety and Health Administration for patterns of violations. Subsequently, a miner was killed when a wall collapsed, and the company was fined.

David Bernhardt has been confirmed as the Deputy Secretary of the Interior. He was DOI Solicitor under the George W. Bush administration. In 2008, weeks before leaving office, Bernhardt shifted controversial political appointees who had ignored or suppressed science into senior civil service posts. While at his law firm, Brownstein Hyatt Farber Schreck, he represented energy and mining interests and lobbied for California’s Westlands Water District. His position in the firm (he was a partner) and the firm’s financial relationship with Cadiz Inc. (which is involved in a controversial plan to pump groundwater in the Mojave Desert and sell it in southern California) have led one group to call him a “walking conflict of interest.” Bernhardt also represented Alaska in its failed 2014 suit to force the Interior Department to allow exploratory drilling at the Arctic National Wildlife Refuge.

The opposed

Susan Combs has been nominated to be the Assistant Secretary of the Interior for Policy, Management, and Budget. She was previously Texas’s agriculture commissioner and then the state’s Comptroller, roles in which she often fought with the U.S. Fish and Wildlife Service over Endangered Species Act issues. Notably, she has a history of meddling in science-based policy issues like species protections. She has been deeply engaged in battling for property rights and against public interest protections; she once described proposed Endangered Species Act listings as “incoming Scud missiles” aimed at the Texas economy. Of course, protecting endangered species, biodiversity, and public lands is a major responsibility of the Department of the Interior.

Daniel Simmons has been nominated to be the Principal Deputy Assistant Secretary for the Office of Energy Efficiency and Renewable Energy, the office charged with fostering development of renewable and energy-efficient technologies. He was previously Vice President at the Institute for Energy Research, a conservative organization that promotes fossil fuel use, opposed the Paris Climate Accord, and opposes support for renewable energy sources such as wind and solar. He also worked for the American Legislative Exchange Council (ALEC) as director of its natural resources task force. ALEC is widely known for advocating against energy efficiency measures.

The unqualified

Sam Clovis, the nominee for Undersecretary of Agriculture for Research, Education and Economics (effectively the department’s chief scientist), is neither a scientist nor an economist, and he has no expertise in any scientific discipline relevant to his proposed position at USDA, such as food science, nutrition, weed science, agronomy, or entomology. He does, however, deny the evidence of a changing climate. He was also a talk radio host with a horrendous record of racist, homophobic, and otherwise bigoted views, which should be disqualifying in themselves.

Albert Kelly has been appointed a senior advisor to EPA Administrator Scott Pruitt and the Chair of the Superfund Task Force. He is an Oklahoma banker with no experience with Superfund or environmental issues, but he was a major donor to Mr. Pruitt’s political campaigns. So far the task force has focused on “increasing efficiencies” in the Superfund program.

Over at NASA, the nominee for Administrator is Rep. James Bridenstine (R-OK). While he certainly has government and public policy experience (a plus), he has no science background, no management background, and no experience with the space program. He has called aggressively for NASA to focus on space exploration and returning to the moon rather than on its earth science mission. In addition, he has been a strong advocate for privatizing some of the agency’s work. He has questioned the science on climate change and accused the Obama Administration of a “gross misallocation of funds” for its spending on climate research.

Michael Kratsios is the Deputy Chief Technology Officer and de facto head of the Office of Science and Technology Policy in the White House. He is a former aide to Silicon Valley executive Peter Thiel and holds an AB in politics from Princeton with a focus on Hellenic Studies. He previously worked in investment banking and at a hedge fund. How this experience qualifies him to be Deputy Chief Technology Officer is beyond me.

Can we have science-based policies?

This is by no means a full list of egregious nominees for positions that will have a big impact on our daily lives. So, the question remains, is science-based policy making a thing of the past? Will the conflicted, the opposed, and the unqualified be the pattern for the future?

Fortunately, we can and should fight back. As scientists, concerned members of the public, and activists, we can call on our elected officials to oppose these nominees. And once they are in place, they can be held to account by Congress, the courts, and yes, the court of public opinion. Handing the fundamental job of protecting the public over to champions of regulated industries and political ideologues is wrong for all of us. After all, if industry reliably protected the public from health and environmental harms, regulatory controls would be superfluous.

We can’t just wring our hands and wish things didn’t go this way. Conflicted, opposed and unqualified they may be, but they are now in public service. Let’s hold them to account.

How Freight Impacts Communities Across California

Photo: Luis Castilla

Today, UCS and the California Cleaner Freight Coalition (CCFC) released a video highlighting the impacts of freight across California. This video – and longer cuts of individual interviews here – touch on the many communities across California affected by freight.

Freight is a big industry in California. Nearly 40 percent of cargo containers entering and leaving the United States pass through California ports. California is also the largest agricultural state, supplying nearly one-fifth of the country’s dairy, one-third of its vegetables, and two-thirds of its fruits and nuts.

Truck traffic on I-5 heading north towards the Central Valley near Castaic, CA.

Farm in Shafter, CA.

This means California is home to many ports, rail yards, warehouses, distribution centers, farms, and dairies – all of which are serviced by many trucks. Despite the latest (2010) engine standards and significant financial investments by the state and local air districts, air quality in California remains among the worst in the United States, due in large part to truck emissions.

The most polluted cities in the United States. Source: American Lung Association, State of the Air 2016.

Communities impacted by freight are often burdened by other sources of pollution

In the Central Valley, a trash incinerator opposed by community groups is nonetheless classified by the state as a source of renewable energy. Biomass power plants emit significant amounts of particulate matter, and oil drilling operations contribute to air pollution as well as water contamination of unknown extent.

Dairies in the Valley contribute not only methane emissions but also other health hazards, including particulate matter (from reactions of ammonia in excrement with nitrogen oxides (NOx) from cars and trucks), smog/ozone (from reactions of NOx with volatile organic compounds produced by decomposing animal feed), and contamination of aquifers. And just as real estate prices drove dairies from the Inland Empire to the Central Valley, warehouses and distribution centers are now following, despite being 150 miles from the Ports of Los Angeles and Long Beach.

Silage (animal feed) pile near Shafter, CA.

Two views of a large Ross Distribution Center in Shafter, CA (measures over 1 mile around the building and 2 miles around the entire lot).

In the Los Angeles region, not only are roadways and the two ports major concerns for communities, but so are oil refineries and over 1,000 active oil drilling sites.

Most of these urban oil sites are within a few football fields of homes, schools, churches, and hospitals. Despite all of the “green” accolades bestowed on California, it is the 3rd largest oil producer in the United States after Texas and North Dakota.

Pumpjacks in California can be found next to farms, hospitals, and even In-N-Out.

So what’s the solution?

For trucks, we need stronger engine standards for combustion vehicles, commitments to and incentives for zero-emission vehicles, and roll-out of battery charging stations and hydrogen fueling stations with electricity and hydrogen from renewable energy.

Just last week, the California legislature passed bills (1) to integrate zero-emission trucks into fleets owned by the state and (2) to allocate $895 million in cap-and-trade revenue to cleaner heavy-duty vehicles. The California Cleaner Freight Coalition is working on a range of solutions from the state to the local level, and UCS is proud to be a member of the coalition. Watch and share the video!

