UCS Blog - The Equation (text only)

Proposed Legislation Would Prevent Census Tampering

In anticipation of the Commerce Department’s plan, announced this week, to add a citizenship question to the regular form of the 2020 Census, a group of legislators has proposed a bill designed to protect the scientific integrity of the Census from late additions.

“The 2020 Census Improving Data and Enhanced Accuracy (IDEA) Act has unfortunately become a necessary safeguard against this administration’s clear desire to politicize and compromise the 2020 Census,” said Rep. Maloney. “This bill will protect the integrity of the census, our nation’s largest peacetime undertaking, by making sure that topics and questions included in the census are properly vetted and not added at the last minute—endangering the accuracy of the census, response rates, and cost to the taxpayer.”

The 2020 Census IDEA Act would:

  • Prohibit last-minute changes or additions to the census without proper research, study, and testing;
  • Ensure that subjects, types of information, and questions that have not been submitted to Congress according to existing law are not included;
  • Require biannual reports on the U.S. Census Bureau’s operation plan, including the status of its research and testing, and require that this report be publicly available on the Bureau’s website;
  • Direct the U.S. Government Accountability Office to determine and report to Congress that the subjects, types of information, and questions on the decennial census have been researched, studied, and tested to the same degree as previous decennial censuses; and
  • Apply the provisions of this bill only to the decennial census, and not the mid-decade census or the American Community Survey.

This is not just a concern for demographers. Businesses rely on regional information about age, income, education, family structure, occupation, and commuting patterns to define market segments. This sort of demographic and behavioral data, captured in full only once every 10 years, provides the best picture we have of how American consumers live.

Former Census Director Dr. Kenneth Prewitt attended the announcement of the IDEA Act. Prewitt stated that he could not offer “any scientific reason to add a question at this late time to the census form, but I can assure you that this untested, last minute change will introduce great risk to the accuracy and cost to the people’s census.” He added that Thomas Jefferson, who, along with James Madison, initiated the first census, would be “turning over in his grave.” Scientists and citizens alike need to stay vigilant about the integrity of the Census, which is, after all, the story of who we were, who we are, and who we are becoming as a people.

Why the Senate Should Reject Pompeo as Secretary of State

After ousting Rex Tillerson as secretary of state, Donald Trump has decided to replace him with Mike Pompeo, the current CIA director. Pompeo’s views on Iran and North Korea, and more generally his lack of diplomatic experience, make him a terrible choice for secretary of state—especially given the international challenges the United States is now facing.

Checking Trump’s impulses


As former CIA Director Michael Hayden noted, “Secretary Tillerson was a counterweight to some of the instantaneous, spontaneous, instinctive decisions that the president was prone to make. And I think we’re going to miss the counterweight.”

A key concern is Trump’s impulses on international affairs. He has focused heavily on military power and has shown a clear disregard for international agreements and for the importance of maintaining close relations with US allies.

Tillerson was a strong voice against pulling out of the Paris climate accords and the Iran nuclear deal. Tillerson also argued that it was important to maintain US credibility as a negotiating partner, both regarding past agreements and the possibility of future negotiations with North Korea.

He and others, including the general who commands US forces in the Middle East and Central Asia, argued that the Iran deal is not perfect—it was the result of a negotiation, after all—but the United States is better off with the deal than without it.

In contrast, Pompeo’s worldview is very similar to Trump’s, which may embolden Trump to act on his impulses.

This issue is even more important now that John Bolton is to become Trump’s national security advisor, since he seems likely to encourage some of Trump’s most dangerous impulses.

The future of the Iran Deal

Pompeo has echoed Trump’s view of the Iran nuclear deal, calling it “disastrous” and saying he wants to see it ended.

And that could happen soon. Trump has said he will reimpose economic sanctions on Iran in mid-May if the other partners to the accord don’t agree to take steps to “fix the terrible flaws” of the deal. That seems unlikely to happen.

Pompeo’s disdain for the deal, and for diplomacy more generally, is clear. As a congressman, he was one of a group of people—including John Bolton—who advocated stopping the negotiations with Iran and instead bombing its nuclear facilities. In a 2014 meeting with reporters, he told them this could be done with “under 2,000 sorties” and that “This is not an insurmountable task for the coalition forces.”

Scuttling the deal would be a disaster. It would end the current strict limits on, and intrusive verification of, Iran’s nuclear program. It would also sow discord with a number of our close allies, who remain committed to the deal, and likely inflame anti-US sentiment in the region. And it would undermine US credibility in future negotiations.

Negotiations with North Korea

The credibility issue is important as the United States moves toward talks with North Korea about its nuclear and missile programs.

North Korean leader Kim Jong-Un recently offered to meet with Trump, who jumped at the opportunity. This is significant since Pyongyang said it is willing to talk about denuclearization, a long-standing US precondition for talks.

Moreover, the North said it will halt nuclear and missile tests while talks continue. Because a testing freeze can be readily verified, this is an important step: it provides ongoing evidence that North Korea is serious about the talks. This sets about as good a stage as one can imagine for talks that could lead to meaningful changes in North Korea’s nuclear and missile programs.

But how much will Pyongyang be willing to put on the negotiating table if it sees the United States walk away from the Iran deal despite international inspectors confirming that Iran is carrying out its side of the bargain?

And once talks start, will the US approach be negotiation or confrontation? Who at a high level in the administration is supporting diplomacy?

Pompeo advocates regime change in North Korea. As with Iran, his statements on North Korea seem to support military action against the country—something Bolton argued in favor of as recently as last month.

The administration has ramped up international sanctions against the country. Is it willing to negotiate an easing of sanctions for steps that lower hostility between the two countries and pave the way toward further steps? Or will it demand the North “denuclearize” before it is willing to reward its behavior? If the US does not negotiate seriously, but instead uses the talks as a forum to castigate Pyongyang for bad behavior, it will throw away an important opportunity and reignite hostilities.

The wrong choice

The United States needs a secretary of state who is a strong supporter of diplomacy as a means of improving US security. Mike Pompeo is not that person.

The Senate should reject his nomination and insist that the president choose someone who respects the benefits of diplomacy, which is a vital component of US security.

Taking Action for Public Science: Re-Imagining Iowa’s Leopold Center for Sustainable Agriculture

On a snowy February morning at the Iowa state capitol in Des Moines, students, farmers, community members, scientists, food system employees, and advocates gathered for a press conference and advocacy day. Their efforts came almost one year to the day after the state legislature voted to defund and shut down the Leopold Center, for 30 years the state’s pre-eminent institution for research, learning and practice on sustainable agriculture. Constituents from across the state and beyond had responded with grassroots organizing to reframe discussions about public agricultural science in Iowa. And now they were calling for a re-imagined Leopold Center to lead a bold new vision for Iowa’s agricultural future:

“Supporting a socially just, environmentally sound agricultural system goes beyond simply providing food, fiber, and fuels—it means revitalizing rural communities, and turning Iowa into a shining example of how a resilient, locally focused agricultural system can make a large difference in individual communities and throughout the world.”

—Kristine Neu, Iowa State University graduate student

Lawmakers founded the Leopold Center at Iowa State University through Iowa’s 1987 Groundwater Protection Act and in so doing created an institution that benefited farmers, students and community members through research and educational programs. Yet, in the spring of 2017, the state legislature voted to defund and shut down the Leopold Center.

Sustainable agriculture scientists and advocates sprang into action immediately, writing a petition decrying the cuts that garnered national attention and more than 600 signatures over the first weekend it was available. Alumni and allies drafted memos and collected data for reports to share with legislators, wrote press releases and editorials, and organized turn-out to the state budget hearing.

This grassroots advocacy succeeded in securing a veto that saved the Leopold Center in name only—its funding was redirected to a research center created in 2014 dedicated to “nutrient management.” State legislators claimed the Leopold Center’s work was “accomplished.” The public mourned its loss, and stories in the press read as eulogies rather than rallying cries for its rebuilding. But we saw an opportunity to push forward a new vision for Iowa’s agricultural future—one of regeneration and healing (Carter, Chennault, and Kruzic 2018).

Iowa’s agricultural history is one of extraction. Iowa State University sits on land occupied by white settlers following the Black Hawk “Purchase” of 1833, a forceful removal of the Sauk and Meskwaki people following the Black Hawk War. The extractive economy continues today, with Iowa second only to California in the value of agricultural goods and boasting more hogs (22.4 million) and chickens (60 million) than people (3 million). This production system comes at a cost to the health of Iowa’s soil, water, and human communities as the state is literally washing away at the rate of 20 tons of soil per acre and more each year, and nitrate loading from agricultural landscapes pollutes the drinking water (Naidenko, Cox and Bruzelius 2012; Rundquist and Cox 2018). Clearly, the Leopold Center’s work is far from over.

A Science for Public Good grant from the Union of Concerned Scientists helped us create an advocacy video communicating our collective’s new vision for the Leopold Center and agriculture in Iowa. In partnership with farmers, students, emeritus faculty, community leaders, and members of the Iowa Farmers Union, Women, Food and Agriculture Network, Center for Rural Affairs, Practical Farmers of Iowa, Lutheran Services of Iowa, and Iowa State University Sustainable Agriculture Student Association, we brainstormed, debated, revised, and shared new visions. The collective vision that emerged from these efforts celebrates diversity and prioritizes care, which are necessary components of agrifood systems change in Iowa and beyond. We launched this vision through a series of op-eds at the Des Moines press conference in February 2018, and used it to rally supporters to attend the Leopold Center’s advisory board meeting in March 2018.

These are hard times for public science and scientists studying ecological and social changes. Our refusal to mourn and eulogize the Leopold Center’s loss—and our work to envision and work toward a boldly re-imagined agriculture in Iowa instead—reframed a debate while envisioning new paths forward. The Leopold Center’s future remains uncertain, yet we know the challenges our agrifood system faces will require the kind of collaboration, creativity, innovation, and transparency reflected in our collective vision. A re-imagined Leopold Center must transform what has become a monoculture of ideas with a polyculture of thought, experience, scientific approach, and innovative agricultural practices. A monoculture is weak and vulnerable; it fails to provide for the coming decades. We have adopted the prairie as our guide for the work ahead—deep roots, diverse, hardy through times of drought, and resilient through times of change.

Angie Carter is an environmental sociologist and assistant professor of environmental and energy justice at Michigan Technological University in Houghton, MI. She earned her PhD in Sustainable Agriculture and Sociology at Iowa State University. Twitter: @angielcarter

Ahna Kruzic is a community organizer turned communicator from rural southern Iowa. Ahna is Pesticide Action Network North America’s Communications Director and is based out of Berkeley, CA. Ahna is also a Food First / Institute for Food and Development Policy Fellow, and holds a Master of Science in Sustainable Agriculture and Sociology from Iowa State University. Twitter: @ahnakruzic

Carrie Chennault is a doctoral candidate in Sustainable Agriculture at Iowa State University, and a graduate research assistant with the Local Foods and SNAP-Education programs at ISU Extension & Outreach.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Experts Warn: Census is Under Attack

Photo: National Archives

Last night, US Commerce Secretary Wilbur Ross released a notification that the 2020 Census will include a citizenship question on the regular form, which goes to every household in the United States. The decision was prompted by a December request from Attorney General Jeff Sessions and the Department of Justice, which claimed the question was necessary for enforcing the Voting Rights Act.

Conducting the Decennial Census is literally one of the first responsibilities of our government. Article I, Section 2 of the U.S. Constitution requires that, for the purposes of apportioning seats in the U.S. House of Representatives, “the whole number of persons in each state” be counted, including children, all foreign-born residents, and, for much of the country’s history, slaves.

As soon as the request was made, several former Census directors voiced their concern. Steve Murdock, who directed the Census Bureau under George W. Bush, stated that he was “fearful that it would reduce participation of Hispanics, particularly those who are undocumented,” and the most recent director, Kenneth Prewitt, similarly warned that the addition of the question could have “huge, unpredictable consequences,” given “an atmosphere of mistrust of government and the media, deep anxieties among immigrant groups and inadequate testing of Census Bureau procedures.”

The question has not been asked on the full Census since 1950. However, it is regularly asked on the American Community Survey, which samples the U.S. population and provides accurate aggregate data on citizenship. Concerns about non-response bias emerged early, when preliminary field tests from Census interviewers came back with reports that immigrant populations were exceedingly concerned about their safety and the confidentiality of the data. A significant undercount would have massive repercussions for the quality of data that businesses, policy advocates, and the government rely on.

Just this week, a large number of the nation’s leading social scientists and users of Census data, joined by the Union of Concerned Scientists, sent a letter to Secretary Ross, explaining the importance of maintaining the scientific integrity of the nation’s most valuable data resource.

The response to last night’s announcement was swift and negative. Political science professor and director of the US Elections Project Michael McDonald put it bluntly: “The decision to add a citizenship question is obviously about political power. Trump is counting on noncitizens to refuse to answer the census, thus affecting apportionment of congressional seats to states and block grant formulas of federal money to states and localities.”

Vanita Gupta, President of The Leadership Conference on Civil and Human Rights, and former head of the Civil Rights division of the Department of Justice, responded that the Department of Justice has effectively enforced the Voting Rights Act using citizenship estimates from the American Community Survey. Former Attorney General Eric Holder, now head of the National Democratic Redistricting Committee, responded that “The attacks on the census process go beyond politics—they represent a major assault on representative American democracy” and that “Donald Trump and his Attorney General are rooting decisions in ideology, instead of expertise.”

As of this morning, Holder’s group and California Attorney General Xavier Becerra have filed lawsuits against the Department of Commerce to stop the addition of the question. New York will also lead a multi-state lawsuit against the federal government. “Including a citizenship question on the 2020 census is not just a bad idea,” stated Becerra, “it is illegal.”


Solar and Wind Need a Larger Electric Grid—and California Might Just Create One

A grid operation control room. Photo: CAISO

Over the past decade, thousands of megawatts of clean renewable energy have been installed in the West thanks to the declining cost of wind and solar power and state policies like the Renewables Portfolio Standard (RPS). Since solar and wind power are by their nature intermittent, large quantities of weather-dependent generation require new solutions to maintain grid reliability while keeping costs low.

Right now, California relies heavily on natural gas to back up the grid during the times of day or seasons of the year when solar and wind power are not as readily available. But that’s not a sustainable solution, both for climate reasons and because ramping the gas plants up and down frequently is bad for air quality and the people breathing that air.

In California, one better option to improve grid reliability, which is getting much attention from proponents and opponents right now, is regional energy market integration—also known as the creation of a western grid.

Who keeps the lights on today?

Grid reliability across the country is managed by entities called balancing authorities. These balancing authorities keep the lights on by balancing electricity supply and demand every four seconds. Regional reliability in the West is currently managed by 38 separate balancing authorities (Westerners like their independence, and that is reflected in the fragmentation of western balancing authorities). Compare the West to the Eastern Interconnection, where the majority of the power east of the Rockies is managed by six regional transmission organizations (RTOs).

There are 38 balancing authorities just within the western interconnection. Source: Western Electricity Coordinating Council

If balancing authorities can pool their resources together into one western grid, we will need fewer power sources to keep the system reliable. This has the added benefit of keeping costs low and making it easier to integrate more renewables into the electricity grid.

It’s not that there is zero coordination among the balancing authorities in the West today. California currently receives about 30% of its electricity from outside the California Independent System Operator (CAISO), the largest balancing authority in California. But as we rely more on solar and wind, the CAISO will need more tools in its toolbox to make sure we don’t lean too heavily on the California gas fleet to keep the lights on.

A regional market looks good from several angles

The tool currently being discussed at the California Legislature is the integration of the CAISO with other balancing authorities in the West to create a western independent system operator. Combining the resources of several western balancing authorities into a larger, more integrated regional energy market will make it possible to go farther and faster with renewable deployment in three important ways:

  1. Bigger, cleaner supply: A regional grid makes it easier to access flexible generation from a larger pool of resources, which will make it easier to meet electricity demands when renewables are not as plentiful (like when solar generation declines in the evening).
  2. Cheaper options: A regional grid helps California take advantage of low-cost renewables in other states where solar and wind make environmental and economic sense to build. More states equal more clean energy.
  3. Less waste: A regional grid gives California access to a larger energy market that can absorb the excess solar generation that California customers cannot consume and that is not cost-effective to store.
But is it a silver bullet?

There are lots of different ways California can create its low-carbon electricity future. There are efforts underway now to shift more electricity demand to times when renewables are most abundant, build more energy storage and local distributed resources to reduce congestion, make the grid more resilient, and reduce the need to rely on natural gas peaker plants, especially ones in disadvantaged communities. Efforts to establish a western regional energy market should not detract from these important efforts. We need all of it. And in some ways these strategies are even more important than a regional market for accomplishing certain climate and clean air objectives, like reducing the need to rely on specific in-state natural gas plants that provide local capacity.

Regional integration is not a silver bullet for our energy woes. But as we add more solar and wind power and move closer to our climate goals, we need more flexibility on our grid to show the rest of the country and the world that a clean energy transition can happen.

None of those other clean energy strategies can provide the level of flexibility that a regional energy market can.

What would a western energy market mean for coal?

California enacted a policy in 2006 called the Emissions Performance Standard, which has helped to dramatically reduce the amount of electricity California receives from individual coal plants in the West. But coal generation also makes it into California’s power supply by way of unspecified market purchases of bundled electricity, which is different from purchasing electricity from specific plants. Unfortunately, today there’s not much we can do about that through direct California regulations.

However, one thing we can do, which has been the most important driver for coal plant retirements across this country, is expose that generation to more market competition from cheaper resources, like renewables. Having access to a western regional energy market is going to make it much easier for California to buy and build more renewables and help drive dirty coal off the market.

A regional grid also has the benefit of added transparency. Coal that currently makes it to California as unspecified power would have to be disclosed if the plants were located in a western ISO. Right now, our cap and trade program is forced to assign unspecified market purchases a carbon cost that reflects a lower carbon content than coal because we can’t actually see which plants are providing that generation. If we can see the coal, we can accurately assign its carbon value if that electricity is serving California load.

It’s complicated

I don’t want to give the impression that integrating western balancing authorities is an easy step. The CAISO board of governors is appointed by the governor of California and confirmed by the State Senate. If a western ISO is created, the board of governors will not all be California political appointees and that makes some people nervous.

But regional integration is a tool the CAISO needs. It will further expand the market for renewables and will help push California closer to our 100% clean energy goals.

Will Michigan’s Public Service Commission Stand Strong and Reject DTE’s Billion-dollar Natural Gas Gamble?

Photo: Walter/CC BY (Flickr)

Months of technical analysis have wrapped up. Thousands of pages of documents, testimony, and legal briefs have been submitted. And the entity charged with protecting ratepayer interests—Michigan’s Public Service Commission (MPSC)—has the proof it needs to stand up for ratepayers and the environment by rejecting DTE’s proposed $1 billion natural gas gamble. But will the Commission rise to the challenge?

Wind power in Michigan

As DTE looks to transition away from coal, clean energy—including renewables, efficiency, and demand response—is proving to be cheaper and cleaner. It is also creating more jobs and economic development than natural gas. Photo credit: Flickr user RTD Photography

As DTE, Michigan’s largest utility, laudably looks to transition away from its historic overreliance on coal-fired power plants, it has regrettably turned its eye towards natural gas, figuring it can keep building large—and expensive—power plants that look good on its balance sheets while saddling ratepayers with the costs.

This appears to be DTE’s thinking as it has come to the MPSC seeking approval to build a 1,100 megawatt (MW) natural gas power plant with a cost to ratepayers of $989 million, including about 10 percent profit for its shareholders.

To get approval, DTE must prove that the plant is the “most reasonable and prudent” option for meeting customer demand. However, DTE fell flat in its attempt to do so. In fact, a counter-analysis submitted to the MPSC clearly showed that a combination of energy efficiency, demand response to reduce peak demand, and renewable energy could replace DTE’s retiring coal plants for less money, less pollution, and more jobs.

Michigan’s energy future shouldn’t be decided by shoddy, biased analysis

I wrote back in January about DTE’s analysis and the multitude of errors, inconsistencies, and biases that were discovered by parties (including UCS) that intervened before the MPSC to review DTE’s proposal. It’s a lengthy list that, by any objective standard, undermines DTE’s purported conclusion that its proposed natural gas plant is the best option for ratepayers.

We also used DTE’s own modeling tools to show how a combination of clean energy alternatives—renewables, efficiency, and demand response—could meet DTE’s power needs at lower cost while providing a more diverse, cleaner, and lower-risk portfolio of energy resources. Then last month a study completed by BW Research Partners (commissioned by UCS and Vote Solar) showed how this portfolio of clean energy resources would create more jobs and more tax revenues than DTE’s proposed project.

Other intervenors agreed with our assessment of DTE’s analysis. Even the MPSC’s staff agreed that DTE’s analysis of the proposed plant and alternatives fell short, finding they would have “preferred a more robust analysis”, that they “had concerns with DTE’s risk analysis”, that they are “concerned with DTE’s renewable energy and distributed energy portfolio”, and that DTE’s energy efficiency program is “bare bones.”

Yet, despite all those concerns, MPSC staff still recommended that Commissioners approve DTE’s application because DTE has “minimally complied” with the law.

Is this how we’re going to make billion-dollar investment decisions that will impact ratepayers and the environment for decades to come? I hope not. DTE, and all of Michigan’s monopoly utilities, should be held to far higher standards.

DTE on clean energy: I’ll gladly pay you Tuesday for a hamburger today

Actions speak louder than words. DTE’s proposal to spend the next 10 years building natural gas plants doesn’t square with its talk about addressing climate change or investing in renewable energy. Photo: used with permission from publicsource.org

You never want to be compared to Popeye’s friend Wimpy, but that’s exactly DTE’s approach to clean energy—give me what I want now and I “promise” to take care of you later. That simply isn’t good enough when it comes to clean energy and addressing climate change. DTE has made strong statements on the growing threat of climate change and pledged to cut its carbon emissions by 80 percent by 2050. It has pushed back on a looming ballot initiative that would increase the state’s renewable energy standard to 30 percent by 2030, claiming that the company is already planning to invest heavily in renewable energy.

But here’s the rub: the plan DTE put before the MPSC would keep the company in a constant state of building. Not renewables, but natural gas. First the $1 billion plant it wants approval to start building now, to be completed in 2023. Then another $1 billion plant it wants to start building in 2024, to be completed in 2029.

The majority of DTE’s renewable energy investments wouldn’t happen until after 2030. In fact, under DTE’s plan, only 11 percent of its energy would come from renewables by 2025. If historically low natural gas prices go up, ratepayers will bear the burden—not DTE. If we as a nation get serious about climate change and these plants must be dialed down or shuttered, that would be on ratepayers too. But DTE is asking for its hamburger today and promising it will build lots of renewables on Tuesday.

Time for the MPSC to step up and set a strong precedent

When utilities come to the MPSC asking for $1 billion in ratepayer dollars (including a guaranteed 10 percent rate of return), the MPSC must hold them to a high standard of proof that the investment is a smart one for ratepayers. DTE failed to meet that standard and should be sent back to the drawing board to take a more serious look at all the reasonable alternatives.

Robust analysis and careful planning are at the heart of ratepayer protection. Particularly when we are in the midst of an energy system transformation, the MPSC’s role in holding utilities accountable for sound decision-making is as important as ever. Let’s hope they take that role seriously.


How Did Climate and Clean Energy Programs Fare in the 2018 Federal Budget?

Late last night the Senate passed the FY18 omnibus spending package to keep the federal government running through September. The bill is a complete repudiation of President Trump’s budget priorities, especially on climate change and clean energy.

In fact, I’d argue that the “art of the deal” approach the administration took in negotiating with Congress over the budget numbers (pushing overly draconian cuts in the hope that Congress would move slightly closer to the administration’s position) had the opposite effect. It galvanized Congress in opposition to the president’s budget priorities and solidified bipartisan coalitions in support of specific programs and agencies, proving once again that bullying Congress on funding is not an effective strategy for the executive branch.

The administration’s interests would have been better served working in partnership with Congress—a lesson this president clearly has not learned given his FY19 budget request.

Here’s how some important climate and clean energy programs fared in the FY18 omnibus:

What does this tell us?
  • Clean energy R&D (and energy efficiency) still matter to both Republicans and Democrats. It’s not so much a climate thing as it is a local thing, an energy security thing, and a “pro-growth” strategy.
  • Climate change (climate science) has become so politicized on the Hill that Congress doesn’t want to touch it and instead defaults to continued funding without increases or cuts. While some may see level funding as a victory (especially in this political environment), we know climate change is a growing and serious threat to our economy and national security, and climate science therefore warrants increased priority and federal support.
  • People are feeling the impacts of a changing climate (especially extreme weather). Both Democrats and Republicans see the logic in investing up-front to be more prepared and save cost and heartache on the back end.

The president signed the bill shortly after a brief veto threat. The budget reflects the fact that science advocacy matters, but it’s also a reminder that we need to be vigilant in our work to depoliticize the issue of climate change and continue to work in strong bipartisan fashion to advance shared goals around clean energy deployment and innovation, as well as community resilience to extreme weather and other climate impacts.

The New Government Omnibus Spending Bill Shows that Science Advocacy Matters

After a long wait, late last night, Congress posted a spending agreement for the rest of the 2018 fiscal year. For the most part, we achieved significant victories, especially given the challenging political environment, in repelling proposals that would have directly undermined the role of science in public health and environmental policymaking.

Among the highlights:

  • Proposed: Eliminate the Chemical Safety Board, which investigates chemical accidents and issues recommendations to protect the public. Actual: The Chemical Safety Board is fully funded.
  • Proposed: Prohibit science-based Endangered Species Act protections for the lesser prairie chicken and for wolves in Wyoming and the Great Lakes region. Actual: No new prohibitions were included.
  • Proposed: Legislatively weaken the most recent science-based ground-level ozone pollution standard. Actual: The ozone standard was left intact.
  • Proposed: Exempt clean water protections from scientific, public, and legal scrutiny. Actual: The EPA will be required to follow the normal process as it works to withdraw the Clean Water Rule.
  • Proposed: Eliminate EPA’s Integrated Risk Information System (IRIS), which evaluates the health impacts of toxic chemicals and produces vital scientific assessments for federal, state, international, and community groups to help assess risks due to exposure. Actual: IRIS remains fully funded and a part of EPA’s research division, the Office of Research and Development.
  • Proposed: Slash EPA’s budget by up to 30%, including significant reductions for EPA’s Office of Environmental Justice, Office of Research and Development, and enforcement programs. Actual: None of these programs received any cuts.
  • Proposed: Significantly cut other science agency budgets, including NASA, NOAA, USDA, and the Department of Energy. Actual: None of the agencies received reduced overall funding, and some saw modest, and in some cases significant, increases.

You made this possible

UCS, along with our coalition partners, has been working for the better part of the last year to ensure that the final spending agreement protects the budgets of federal science agencies and excludes any anti-science “poison pill” riders, or policy provisions that have no business being in spending bills. But we are only a few people.

We needed the support of Science Network members, Science watchdogs, and Science Champions across the country to make this possible by bringing home the local impacts of the harmful proposals. Together, we were able to thwart all the harmful policies, cuts, and program eliminations listed above.

We even made some small strides forward. For example, Congress clarified that CDC scientists can conduct gun violence research—something they have effectively been prohibited from doing for more than 20 years. Separately, Congress will now be required to post Congressional Research Service reports (reports on policy issues that are completed by Congress’s research arm) on the Internet. This will mean better public access to nonpartisan, taxpayer-funded research and ensure transparency, something UCS has long been advocating for.

With support from UCS, scientists and science advocates explained to their elected officials and local press how these dangerous cuts and anti-science riders would negatively impact their state through sign-on letters and a steady drumbeat of meetings and conversations with local staff. Furthermore, they worked hand in hand to ensure their representatives in Congress understood that science-based public protections are a priority in the federal budget. Supporters took action more than 47,500 times through emails, letters, social media, meetings, op-eds, and phone calls expressing their strong opposition to any final spending deal that cut federal funding of science agencies and/or included harmful anti-science poison pill riders.

In the current environment, we don’t often get great news from Washington. We have frequently seen the EPA and other science agencies roll back or weaken science-based safeguards, or Congress try their best to weaken evidence-based decision-making. Last night, however, our allies in Congress protected the federal science budget, fought off some of the worst anti-science proposals that were on the table, and made additional policy improvements because you persuaded them that these issues should be top priorities.

We need to keep pushing

While this funding agreement is a good first step, the battle continues. It is still unconscionable that Congress was unable to include protections for Dreamers but included some funding for a border wall. And some other anti-science poison pill riders snuck through, including continuing a prohibition on protecting the imperiled sage grouse and continuing to legislate science by declaring that biomass is inherently carbon-neutral (it isn’t).

But we have another opportunity to fight just around the corner. Congress will soon begin work on funding for the 2019 fiscal year, and you can be sure many of these same issues will find their way into negotiations.

We know that politicians want to go down the road of least pain. When constituents speak up for science, lawmakers do listen. The squeaky wheel does get the grease.

That’s why it’s critical to make your voice heard and keep a steady drumbeat going. Lawmakers will return to their districts next week to begin a two-week in-district work period. Take that opportunity to engage with your representatives and senators and tell them what you liked about this spending agreement and what you want to see continue, or improved upon, in the next funding deal. Let’s keep the momentum going by continuing to call, email, write letters, and tweet your elected officials!

How do big oil companies talk about climate science? Four takeaways from a day in court.

Photo: WClarke/Wikimedia Commons

In front of a standing room only courtroom audience, the case of The People of California vs. B.P. P.L.C. et al. took an important step forward yesterday. In this case, the cities of San Francisco and Oakland, CA, are aiming to hold five major fossil fuel companies responsible for climate damages, particularly with respect to sea level rise. In a federal court in San Francisco, the presiding Judge William Alsup had specifically asked both sides to present a “tutorial on climate science” and to address eight questions he had posed. So how did the big oil company defendants present their version of climate science? And how did it compare to the scientific consensus? Together with my UCS colleague Deborah Moore, Western States Senior Campaign Manager, I was lucky enough to get a seat in the courtroom. Here are four of our takeaways from the day:

1. Judge Alsup was highly engaged with the presenters from each side

The plaintiffs had three renowned scientists present their tutorial: Dr. Myles Allen of Oxford University, Dr. Gary Griggs of the University of California at Santa Cruz, and Dr. Don Wuebbles of the University of Illinois. The defendants had one representative–Chevron lawyer Theodore Boutrous–presenting. Alsup interrupted each presenter many, many times to get clarification, to dissect a chart or graph, or to ask additional questions. I came away with the sense that Alsup truly wanted to understand the causes and consequences of climate change, and it is great to see such engagement.

2. The IPCC’s Fifth Assessment Report no longer fully reflects the most current scientific consensus on climate change

For the defendants’ presentation, Mr. Boutrous relied almost entirely on results from the 2013 Intergovernmental Panel on Climate Change (IPCC) report. He opened his portion by stating that Chevron accepts the scientific consensus on climate change represented in the IPCC report, including that humans are the primary cause of observed warming in recent decades. This, he said, has been the company’s position for about ten years. He then walked Judge Alsup through a series of slides highlighting conclusions, as well as uncertainties, from the IPCC report. By relying so exclusively on the IPCC report, he bolstered his claim that Chevron’s views are in line with mainstream science, but also exposed just how much climate science has progressed since the report was released in 2013. Among the many major scientific developments of the last five years are:

  • A greater understanding of the potential contribution of the Antarctic Ice Sheet to sea level rise this century; and
  • The ability to rigorously attribute virtually all observed warming since the mid-1900s to human activity, and a portion of the observed warming and sea level rise to the products of specific fossil fuel producers.

As Mr. Boutrous went through his presentation, I was struck by how much the consensus view has sharpened in the last five years. A few minutes later, Dr. Wuebbles began his presentation, the final one for the plaintiffs, by explaining, from the perspective of a lead author for both the 2013 IPCC Fifth Assessment and the US Fourth National Climate Assessment issued in 2017, that “science didn’t stop in 2012.” He then proceeded to highlight results from the 2017 report, which is the latest and most comprehensive assessment of the state of climate science for the U.S.

In UCS’s 2016 Climate Accountability Scorecard, Chevron scored “poor” on acknowledging climate science. So it was a big step for Chevron to state, on the record, that it accepts the scientific consensus on climate change. But since the IPCC Fifth Assessment Report, the trends have become clearer and our ability to attribute climate change to human activity has progressed. So accepting the consensus view as of five years ago is simply not sufficient.

Lawyers, reporters, scientists, and others lined up at 7 am to get into the courtroom for Judge Alsup’s climate science tutorial

3. Chevron continues to highlight uncertainties and cherry-pick information

While Mr. Boutrous did rely almost entirely on information and graphics from the 2013 IPCC report, many of those graphics were chosen carefully to highlight uncertainty or sow seeds of doubt about the reliability of the underlying scientific studies and the severity of the predicted impacts.

For instance, Mr. Boutrous showed projections for future warming from a suite of climate models and highlighted that some models are overly sensitive to changes in carbon dioxide concentrations and likely overestimate future warming. Later, when showing sea level rise trends globally, Mr. Boutrous highlighted a roughly decade-long period when sea level in the San Francisco area was relatively unchanging, which deliberately ignores the consistent long-term rise here and around the globe.

In addition to this questionable presentation of the data, Chevron repeatedly tried to downplay the role the fossil fuel industry plays in exacerbating climate change—by pointing to language in the IPCC report that states that population and economic growth are the drivers of increasing carbon emissions. Yes, as the world’s population grows, emissions rise because fossil fuel use increases.

But multiple investigations have uncovered evidence showing that the fossil fuel industry funded a decades-long climate science disinformation campaign to block policies that would reduce carbon emissions, and actively promoted its products to ensure fossil fuels would remain central to global energy production. Chevron continues to dismiss and deny climate risks and fund trade associations and other industry groups that still spread climate disinformation or block sensible climate policies.

When questioned by Judge Alsup on the finer points of the graphics he was showing, Mr. Boutrous was often forced to admit that his scientific understanding of the issue was limited and that he could not answer. It was striking that the oil companies chose a lawyer to present their scientific narrative, and the choice contrasted sharply with the deep scientific knowledge that the plaintiffs brought to the table.

4. Science has a key role to play in public nuisance cases, and scientists are stepping up to the plate

As graduate students, many of us climate scientists were told to be wary of wading too close to the politics of climate change, that we’d best stick to the science. Yesterday, three very prominent scientists stuck to the science, but used scientific information to establish that fossil fuel burning has already and will increasingly harm public well-being. Rather than putting “climate science on trial,” Judge Alsup’s climate science tutorial provided the case with a strong scientific underpinning that can help support making a determination, based on a set of legal standards and precedents, about the liability and responsibility of big oil companies.

Deborah Moore, Western States Senior Campaign Manager at the Union of Concerned Scientists, contributed significantly to this post.

Deborah Moore

This One Policy Would Provide Billions to Protect Massachusetts from Climate Change

Photo: MBTA

As Massachusetts residents dig themselves out of the fourth Nor’easter in the past three weeks, policy leaders on Beacon Hill are beginning to dig in to some of the critical questions that will determine the future of the Commonwealth in an era of climate change.

Questions like:

  • How do we protect ourselves from the impacts of more intense storms, sea level rise, and increasing flooding from storm surges that are certain to continue to plague our state over the coming decades?
  • How do we build a transportation system that is clean, resilient to the impacts of climate change, fiscally and ecologically sustainable, equitable, and capable of handling the exploding growth in the Boston metro area?
  • And bottom line: how are we going to finance the kind of investments in infrastructure and technology that will be necessary to protect our state and achieve the requirements of Massachusetts climate law?

The good news is that there is a bill in the Massachusetts legislature that has a lot of great ideas for how to move the Commonwealth forward. Many of these ideas have already been covered well by others, including our own John Rogers, as well as David Ismay at Conservation Law Foundation and Ben Hellerstein at Environment Massachusetts.

In this post, I want to talk about one of these ideas, a policy that, if enacted, could represent one of the most profound changes in Massachusetts climate policy in a decade: the requirement that the state enact “market-based compliance mechanisms” to address climate change.

If you’re like most people, you are probably asking yourself: what the heck does that mean?

It means cap and invest. And this provision could unleash over $750 million per year in funding to address some of the state’s critical transportation, energy, and infrastructure needs.

A brief overview of the GWSA

Let me explain.

In 2008, the Massachusetts legislature unanimously passed a law called the Global Warming Solutions Act (“GWSA”). This law requires the state to reduce emissions to at least 80% below 1990 levels by 2050. It also required the state to set a limit for 2020: in 2009 the state set a limit of 25% below 1990 levels by 2020.

The GWSA, which clocks in at about six pages, does not specify exactly what policies should be enacted to reach these limits. Instead, the GWSA requires executive agencies to figure it out. This strategy, known as cap-and-delegate, is a common approach to addressing climate change. It allows executive agencies to take advantage of their superior technical knowledge and expertise in crafting energy policy. Indeed, Massachusetts’ GWSA is very similar to California’s cap-and-delegate statute, also entitled the Global Warming Solutions Act, although California’s law is more commonly referred to by its Assembly Bill number, AB 32.

One obvious question has dogged the GWSA from the beginning: what happens if our plan isn’t good enough and we fail to achieve our limits? The question is particularly vexing because, given how slowly emissions data become available, we will not know whether we met the 2020 limit until 2023. And it’s important to address, because as we look to 2030 we are going to need to make progress in areas such as transportation and heating that have proven challenging thus far.

Market-based compliance mechanisms

The GWSA provides one tool that could help ensure compliance with the statute: the state could enact market-based compliance mechanisms. That means doing three things:

  • Establishing a limit on pollution;
  • Requiring companies that pollute to purchase allowances from a limited pool made available by the state; and
  • Investing the money we generate from these auction sales in efficiency and clean energy.

This is the strategy that the GWSA calls “market-based compliance mechanisms,” the world calls “cap-and-trade,” and we call, most accurately, cap-and-invest. It represents a simple, elegant solution to the challenge of reducing aggregate emissions across broad sectors of our society. It has been used around the world by countries, states, and provinces looking to reduce emissions and raise money for climate solutions.
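The three steps above can be sketched as a toy program. This is only an illustration of the mechanism, not any actual program design; the firm names, cap, bids, and the $15/ton clearing price are all hypothetical assumptions, and real auctions use bid-based clearing rather than the simple pro-rata allocation shown here.

```python
# Toy sketch of the three cap-and-invest steps, with hypothetical numbers.

ALLOWANCE_PRICE = 15.0   # assumed auction clearing price, $ per ton CO2e
CAP_TONS = 1_000_000     # step 1: a statewide limit on pollution

# Step 2: polluters must buy allowances from a limited pool sold by the state.
demand = {"UtilityA": 600_000, "RefineryB": 500_000, "FleetC": 200_000}

def run_auction(cap, demand, price):
    """Allocate allowances pro-rata up to the cap and tally auction proceeds."""
    total_demand = sum(demand.values())
    scale = min(1.0, cap / total_demand)  # shrink all bids if oversubscribed
    allocation = {firm: int(tons * scale) for firm, tons in demand.items()}
    proceeds = sum(allocation.values()) * price
    return allocation, proceeds

allocation, proceeds = run_auction(CAP_TONS, demand, ALLOWANCE_PRICE)
# Step 3: the proceeds are invested in efficiency and clean energy programs.
```

The key property is that total allowances sold never exceed the cap, so aggregate emissions are limited no matter how demand is distributed across firms.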

We have a cap-and-invest program in Massachusetts. It’s called the Regional Greenhouse Gas Initiative, or RGGI. It’s the funding source for many of our most popular and important climate programs, such as MassSave and the Green Communities Act. It has helped save consumers $600 million on their energy bills, produced over $1 billion in health benefits for our state, and created over 2,000 jobs.

But RGGI only applies to power plants. Today, the largest source of emissions in the state is transportation, with heating homes and businesses close behind. Other jurisdictions, including California, Ontario, and Quebec, have expanded this cap-and-invest model economy-wide, and the result has been billions in new funding for clean transportation and energy projects. It’s time for Massachusetts to do the same.

What would economy-wide cap-and-invest do for Massachusetts?

The bill in the legislature would allow the administration to consider a couple of different approaches to expanding cap-and-invest to transportation and heating.

One possibility, suggested in a recent op-ed by Senator Stan Rosenberg, would be to create a “state-based, market-driven approach to the use of carbon.” Another possibility is that the state could join with the other RGGI states in launching new cap and invest programs modeled after RGGI covering transportation and heating fuels. A third possibility would be for Massachusetts to join with California, Ontario, and Quebec’s program covering transportation and heating fuels.

Whichever path the state chooses, cap-and-invest could be a funding source for climate solutions on a scale that we have never seen before in Massachusetts.

For example, if Massachusetts were to take the California-Ontario-Quebec path, at current auction values it would raise over $750 million that we could invest to reduce emissions and protect the state from climate change. Over $450 million of that would be from transportation fuels, which we could use to fund projects that improve public transportation, encourage electric vehicles, and make our transportation infrastructure more resilient. $300 million would be from heating fuels and other industrial uses, which could be invested in efficiency and new technologies such as heat pumps.
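The arithmetic behind figures like these is straightforward: annual revenue is covered emissions times the allowance price. A minimal sketch follows; the $15/ton price and the sector emissions totals are illustrative assumptions chosen to reproduce the article’s rough figures, not official Massachusetts data.

```python
# Back-of-the-envelope cap-and-invest revenue: emissions times allowance price.
# All inputs are illustrative assumptions, not official data.

ALLOWANCE_PRICE = 15.0  # $ per metric ton CO2e (assumed)

sector_emissions_mmt = {            # million metric tons CO2e per year (assumed)
    "transportation_fuels": 30.0,
    "heating_and_industrial": 20.0,
}

def annual_revenue(emissions_mmt, price):
    """Estimated auction revenue per sector, in dollars per year."""
    return {s: mmt * 1_000_000 * price for s, mmt in emissions_mmt.items()}

revenue = annual_revenue(sector_emissions_mmt, ALLOWANCE_PRICE)
total = sum(revenue.values())
print(f"Total: ${total / 1e6:,.0f} million/year")  # Total: $750 million/year
```

With these assumed inputs, the transportation share comes to $450 million and heating/industrial to $300 million, matching the proportions described above.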

By the way, we could do this without legislation

The proposal in the legislature would require the administration to enact an economy-wide cap-and-invest program. But if executive agencies want to move forward with market-based solutions to climate change, there is no reason to wait for new legislation: the GWSA already gives the administration authority to implement market-based solutions, either on its own or together with the RGGI states and Canadian provinces.

Achieving the limits of the GWSA means ending an era where fossil fuel companies can produce unlimited quantities of pollution for free. Bringing all pollution sources under a market-based cap is a critical next frontier of climate policy in Massachusetts. With the state increasingly focused on how we can make investments in infrastructure to support our transportation system and protect our state from climate impacts, now is the time to take this step and protect the Commonwealth from climate change.


Scott Pruitt Will Restrict the EPA’s Use of Legitimate Science

The EPA is reportedly on the verge of restricting the science that EPA can use in decision-making and I’m livid.

This is a move that serves no purpose other than to prevent the EPA from carrying out its mission of protecting public health and the environment. If Pruitt’s proposal looks anything like House Science Committee Chairman Lamar Smith’s HONEST Act or its predecessor, the Secret Science Act, we know it will be nonsensical and dangerous for our nation’s ability to use science to protect people. Those bills required that all raw data, models, code, and other materials from scientific studies be made available to the public before the EPA could use them, and they had sweeping scope over EPA actions, covering “risk, exposure, or hazard assessment, criteria document, standard, limitation, regulation, regulatory impact analysis, or guidance.”

Here are the top ways EPA Administrator Scott Pruitt’s Trojan horse “transparency” proposal is fundamentally flawed:

It fundamentally misrepresents how science works

You might not need a refresher on how science works, but it’s clear that Administrator Pruitt does. Here’s a quick run-down: in order to be published in a scientific journal, research must pass peer review, in which two or three experts familiar with the field critique the scientific merits of the study. When a study has passed peer review, we know it has met a standard set by scientists in that field. Federal agencies like the EPA then use that peer-reviewed science to issue science-based rules.

Nowhere in this process do decisionmakers need to see raw data that went into studies in order to trust scientific evidence. Scientists conducting the peer review don’t even typically see the raw data of studies. They do not need to. They can look at the methods, design, and results in order to assess the quality of the science. The peer review process—conducted by those with scientific expertise—provides the necessary scrutiny here; the scrutiny of Congress would insert politics into what should be a scientific discussion.

It solves a problem that doesn’t exist

Let’s be clear. The decision-making process at the EPA is already exhaustively transparent. There are thousands of pages of documents and hours of phone calls and meetings of scientific experts discussing technical details of those documents—and the public has full access to these discussions! I know. I’ve listened to hours and hours of meetings and read hundreds of pages of documents. I would never say that a problem at the EPA is a lack of access to the details of agency decision-making.

For example, the EPA claims that “EPA has primarily relied on two 1990s studies linking fine particulate pollution to premature death. Neither studies have made their data public, but EPA used their findings to justify sweeping air quality regulations.”

This is ludicrous. On the contrary, the latest Integrated Science Assessment for particulate matter (i.e. the summary of the scientific basis for the latest air pollution protections from soot), cites more than 800 studies—including the two studies referenced, which were peer-reviewed and have since been re-assessed to further confirm their scientific validity.

Further, the EPA already painstakingly collects scientific data and other details from the studies that it relies on to make policy decisions. I know because they asked me for it. The EPA’s 2015 decision on a revised ambient ozone standard relied on many studies of ozone pollution and its relationship with health outcomes, including work that I did as a doctoral student at Georgia Tech looking at exposure measurement in ambient air pollutants.

Even though I had conducted the study several years earlier as a graduate student, EPA scientists tracked me down and got me to dig through my files and find the original data that supported the figures and conclusions of my study so I could share it with the agency. If that isn’t dedication to scientific integrity in science-based policy, I don’t know what is.

It wastes taxpayer dollars and adds red tape

Ironically, the proposal is directly at odds with the Trump administration’s stated desire to create a more efficient government. It adds unnecessary and burdensome redundancy to the process of keeping our air, water, and land clean. Pruitt is adding red tape to the federal government, not reducing it.

It is also wasting taxpayer dollars. Last year, the Congressional Budget Office estimated that it would cost the EPA an additional $250 million per year to comply with Chairman Smith’s HONEST Act. But the administration wasn’t even honest about that.  It was revealed that EPA leadership claimed the move would pose no additional burden, burying the comments of EPA staff, who asserted the tremendous cost of implementation and also noted that the bill would threaten EPA expertise, jeopardize personal and confidential business information, and “significantly impede EPA’s ability to protect the health and the environment of Americans.” The greater scientific community backed up this assertion. A letter signed by 23 scientific societies and academic institutions also raised concerns the bill would “constrain the EPA from making a proposal based on the best available science.”

Pruitt is taking a page from the tobacco industry playbook

This was never an honest proposal. Pruitt’s move is just another tactic dreamed up to attack the science behind public health protections. In fact, it was first laid out in internal documents from the tobacco industry. In a 1996 memo from R. J. Reynolds Tobacco Company, industry consultant Chris Horner (later part of the Trump administration’s landing team at EPA) described the “secret science” strategy as a way to fend off tobacco regulations as the science increasingly showed the harms of secondhand smoke. The goal, he wrote, was “to construct explicit procedural hurdles the agency must follow in issuing scientific reports.” This has never been about transparency in science-based decision-making.

It’s dangerous

Administrator Pruitt claims to be worried about “secret science” at EPA, but in reality, he’s squashing the science that protects Americans from air, water, and land pollution. When the EPA can’t rely on scientific evidence to make decisions about public health protections, we are all left in the dark.

This post originally appeared in Scientific American.

Pseudoscience on Trial: The Spectacular Fall of President Trump’s Voter Fraud Thesis

On January 3, 2017, President Trump claimed that there was “substantial evidence” of voter fraud in the 2016 election, enough to have denied him a popular vote victory. The substance of this now infamous claim, that millions of non-citizens committed voter fraud, was examined closely in the just-concluded trial of Kris Kobach, Kansas Secretary of State, current gubernatorial candidate, and co-chair of the Election “Integrity” Commission that the president established, then abruptly dissolved when it faced legal accountability.

The Kansas trial that just concluded concerned a 2013 law initiated by Kobach that required prospective voters to provide proof-of-citizenship documents. Over the course of the two-week trial, Kobach presented all available evidence of the extent of voter fraud in an effort to justify the law, which has prevented as many as 33,000 eligible Kansas residents from registering to vote.

The voter fraud thesis fell apart in truly spectacular fashion under examination, and the result could very well be the overturning of the law, the disqualification of one witness as an “expert” in future testimony, and even a finding that Mr. Kobach be held in contempt of court. To understand how things could have gone so badly for Mr. Kobach, consider some of the highlights of the trial, in which the “science” used to claim that voter fraud is rampant dissolved before our eyes, much like the Kobach Commission itself:

  1. Hans Von Spakovsky, a fellow member of the Kobach Commission, had to acknowledge early on that his research on voter fraud has not been subjected to peer review, and further acknowledged that all of his inferences about voter fraud in Kansas were based on a spreadsheet provided by Mr. Kobach.
  2. Regarding the frequent comment that known accounts of voter fraud are “just the tip of the iceberg,” lead counsel for plaintiffs Dale Ho asked “You don’t have any estimate of the size of the iceberg, is that right Mr. Von Spakovsky?” Von Spakovsky: “That’s correct.”
  3. Cross-examination also revealed that Von Spakovsky’s submitted court report contained incomplete information that made it possible for him to inflate estimates of non-citizen registration. Subsequently, plaintiffs asked Judge Julie Robinson to make a finding that Von Spakovsky is not an objective expert, having offered incomplete and misleading testimony.
  4. Professor Jesse Richman, who did co-author a peer-reviewed publication in Electoral Studies on non-citizen voting, took the stand and stated that “Trump and others have been misreading our research and exaggerating our results to make claims we don’t think our research supports.” (Note: Subsequent analyses of the Richman et al. research have shown that response error accounts for nearly all of their estimated frequency of non-citizen voting. That’s how peer review and science work.)
  5. During Richman’s testimony that up to 18,000 non-citizens have registered or tried to register to vote in Kansas, he acknowledged that one of the methods he used was to flag “foreign-sounding” names. When asked if he would flag the name “Carlos Murguia” Richman said yes. When informed of the fact that Carlos Murguia is a Kansas-born federal judge who sits in that courthouse, Richman said that he was not aware.
  6. It gets better. Pat McFerron, a “pollster” hired by Kobach to survey public opinion about the difficulty of meeting the law’s requirements and support for the law, had to acknowledge under cross-examination that claiming that the law was based on “evidence of non-citizens registering to vote” introduced bias to survey respondents.
  7. Further, when asked to rate the difficulty of getting the documents necessary to register, respondents were told that this was required by law. When McFerron was then asked if he understood the effects of social desirability bias in question wording, he could not provide an answer. Remember, this is the survey expert.
  8. Actual pollster and professor of political science Matthew Barreto summarized many of the other problems associated with McFerron’s methodology to Judge Robinson, including the fact that if you want to know how hard it is for unregistered people to register, you need a representative sample of people who are not already registered.
  9. On cross-examination, Barreto also explained to the defense that you should not use the number of registered voters from one year and the eligible population from another year if you want to accurately estimate registration rates. The judge, referring to Kobach’s estimates, explained “I can tell you if that’s the methodology, I’m giving that number absolutely no weight. That’s ridiculous.”
  10. In the end, Kobach could only show that his analysis of Sedgwick County revealed 18 non-citizens who had registered to vote, five of whom had voted, over a 20-year period.

The last day of the trial addressed one question: whether Mr. Kobach failed to comply with a 2016 order issued by Judge Robinson to fully register thousands of voters who had registered through the DMV but not provided proof of citizenship. When Judge Robinson discovered that Kobach had failed to ensure that these voters received the same postcard about their registration status, her frustration was clear: “I honored and trusted what you told me, Mr. Kobach. If you tell me you’ve done something, I trust that. That’s why lawyers are licensed.”

Mr. Kobach may try to keep the voter fraud myth alive through appeal, but the pseudo-scientific claims that voter fraud is rampant have been thoroughly discredited.



WIN: Congress Cracks Open Door for Gun Violence Research to Resume

Photo: MarylandGovPics/CC BY 2.0 (Flickr)

Just three days before the March for Our Lives, Congress has opened the door for federal research into gun violence to resume. In a spending bill to provide funding for the federal government through the rest of the fiscal year, Congress has clarified that the CDC is able to pursue research to help stop gun violence.

Legislative language in place since 1996 has effectively prevented CDC and NIH researchers from exploring questions that would help us make more informed decisions about ways to reduce gun-related suicides, domestic abuse, and yes, mass murder (see the National Academies’ list of research priorities here). CDC scientists and public health advocates have been pleading for years for the ban to be lifted. The original sponsor of the amendment, the late Republican Congressman Jay Dickey, also fought the ban in recent years.

After the Parkland killings, both Republicans and Democrats publicly recognized the need to lift the research ban. In particular, Secretary of Health and Human Services Alex Azar said the department will be “proactive” on gun violence research.

He will need to be. After the Sandy Hook massacre, President Obama directed the CDC and NIH to resume research through an executive order. Yet with the Dickey Amendment still in place, nothing happened.

Although the Dickey Amendment will remain in place, we are now a step closer to lifting the federal ban on gun research, as lawmakers have acknowledged that federal scientists can and should investigate this public health crisis. Congress’ next move is to provide dedicated funding for this research. In the meantime, Secretary Azar should ensure that the next CDC director commits fully and expeditiously to developing and implementing a research agenda that helps our country address gun violence.

Setting everything aside—the failure to provide a solution for Dreamers, the fact that most members of Congress won’t have time to read the bill before voting on it, and the inclusion of a select few anti-science poison pill riders—at least there’s something to celebrate.


What to Look For in Tomorrow’s DOE Budget Hearing

On Thursday morning, the House Subcommittee on Energy and Water Development and Related Agencies (within the Committee on Appropriations) will hold a hearing on applied energy funding for the FY 2019 budget. (The FY 2018 budget, which goes until the end of September, is being finalized this week in order to avoid a government shutdown on Friday.) We’ll see a parade of undersecretaries and assistant secretaries of the applied energy technology offices within the Department of Energy (DOE)—all of whom are political appointees—attempt to justify their boss’s proposal seeking to gut R&D funding for clean energy and low carbon technologies.

The President’s budget takes aim at applied energy programs

It’s pretty clear that the president’s proposed budget, like last year’s, seeks to drastically reduce funding for applied energy research, development, and demonstration (RD&D). And the cuts extend beyond the nearly two-thirds reduction in funding for Energy Efficiency and Renewable Energy—the budget slashes funding by 26 percent for advanced fossil technology like CCS and by 26 percent for advanced nuclear R&D. The Office of Electricity Delivery is hit hard, with its Energy Storage program facing a 74 percent reduction to its already paltry budget.

For the second year in a row, the administration is proposing to ax ARPA-E, something even Secretary Perry opposes. ARPA-E enjoys strong bipartisan support because the agency is advancing transformational energy projects with the potential to radically improve U.S. economic strength, national security, and environmental outcomes. And finally, the president’s budget eliminates the Loan Guarantee Program, even though the program has generated more than $1.79 billion in interest repaid to the US Treasury.

Program | FY 2017 Enacted | FY 2018 Senate | FY 2018 House | FY 2018 President’s Request | FY 2019 President’s Request | Proposed FY18 cut | Proposed FY19 cut
Basic Energy Sciences | $1,872 | $1,980 | $1,872 | $1,555 | $1,850 | -16.9% | -1.1%
EERE | $2,035 | $1,937 | $1,104 | $636 | $696 | -68.7% | -65.8%
Office of Electricity | $150 | $213 | $219 | $120 | $61 | -20.2% | -59.2%
Energy Storage Program | $31 | $31 | $31 | $8 | $8 | -74.2% | -74.2%
ARPA-E | $305 | $330 | $- | $20 | $- | -93.4% | -100.0%
Fossil R&D | $682 | $573 | $635 | $335 | $502 | -50.9% | -26.4%
Nuclear Energy | $1,016 | $917 | $969 | $703 | $757 | -30.8% | -25.5%

All dollars in millions.

The table above shows the current funding levels (FY 2017 Enacted refers to the final numbers passed by Congress through continuing resolutions); the House and Senate committee reports for FY 2018 (likely to be out of date by the end of the week); and the president’s requests for both FY 2018 and FY 2019.
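The “proposed cut” columns appear to be simple ratios of the president’s request for each year against the FY 2017 enacted level. A minimal Python sketch, using the table’s figures (in millions of dollars), reproduces several of them:

```python
# Reproduce the "proposed cut" percentages from the budget table above.
# Figures (in $ millions) are FY 2017 enacted levels and the president's
# FY 2018 / FY 2019 requests, taken directly from the table.
enacted = {"EERE": 2035, "ARPA-E": 305, "Energy Storage": 31}
fy18_request = {"EERE": 636, "ARPA-E": 20, "Energy Storage": 8}
fy19_request = {"EERE": 696, "ARPA-E": 0, "Energy Storage": 8}

def pct_cut(request, base):
    """Percent change from the enacted level (negative = a cut)."""
    return round((request / base - 1) * 100, 1)

for program in enacted:
    print(program,
          pct_cut(fy18_request[program], enacted[program]),
          pct_cut(fy19_request[program], enacted[program]))
# EERE comes out to -68.7 (FY18) and -65.8 (FY19), matching the table;
# ARPA-E to -93.4 and -100.0; the Energy Storage program to -74.2 both years.
```

(A few entries, such as the Office of Electricity, differ by a tenth of a percentage point, presumably because the table’s dollar figures are rounded.)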

Grid scale energy storage has the potential to change the way the grid functions—with positive benefits for society. But the technologies aren’t quite commercial scale, yet. Most of DOE’s funding for energy storage RD&D comes through the Energy Storage program within the Office of Electricity, various programs and cross programmatic initiatives within EERE, and ARPA-E. And yet, the president is proposing a 74 percent reduction in the Energy Storage program, complete elimination of ARPA-E, and a 66 percent cut to EERE.

Why such steep cuts for all these applied technology programs? As I’ve written previously, the administration continues to drive an ideological wedge between basic and applied research, based on the false premise that the private sector can pick up these technologies and move them to commercialization.

Is there any evidence that the private sector will pick up the slack?

In a word, no.

Funding for energy RD&D is a wise investment for the federal government—to maintain (or regain) US competitiveness in innovation, and to ensure that new technologies are able to reach the market instead of withering on the vine. This piece explains why; in a nutshell:

  • Venture capital is not flowing into energy projects (only 2 percent of venture capital went to energy projects in 2016) partly because these projects often require larger up-front costs;
  • Established energy institutions are risk averse, and in the case of regulated utilities, have a guaranteed rate of return—utilities, for example, generally spend only 0.1% of revenue on R&D.

By investing in energy RD&D (both basic and applied), the federal government unlocks private funding, creating a healthy innovation ecosystem for energy technologies. From solar power, to wind power manufacturing, to the shale gas revolution, and more—DOE has been an “indispensable partner in American energy innovation.”

The President cedes leadership to China

Let’s take our federal investments in energy storage as an example. As the president backs away from RD&D for energy storage, other countries are stepping up. China, India, Germany, the UK, Canada, and Australia have dedicated policies and strategies to advancing energy storage. Elsewhere around the world, policymakers see the value of energy storage and want to be part of a global market that is set to double six times by 2030.

China, for example, is making enormous government investments in storage. Chinese officials see these investments as strategic, and the country is poised to be the clean energy leader of the next decade and beyond. Last fall China published a national plan on the development of the storage industry. Chinese companies already control global markets for key battery components, and China is set to be a global superpower in storage technologies in the 2020s.

The president is proposing that the U.S. simply fold—and the stakes are high: nothing less than who will hold the jobs of the future.

The Administration’s budget will hurt our National Laboratories

According to DOE’s website, our 17 national laboratories “have served as the leading institutions for scientific innovation in the United States for more than seventy years.” They also serve as anchor institutions critical to local economic development, while simultaneously training the next generation of scientists and engineers. Secretary Perry said in January, “DOE’s 17 laboratories are the crown jewels of American science.” For all these reasons, people across the ideological spectrum agree that our National Labs are critical to innovation and to our nation’s competitiveness.

The thing is, the national labs are funded by government agencies—primarily DOE, including both Basic Energy Sciences within the Office of Science and the applied energy technology offices. This means that federal budget cuts can translate into cuts—and job losses—at the national labs.

In Colorado, the National Renewable Energy Laboratory (NREL) receives about three-quarters of its funding from EERE, according to a report last year on the State of the DOE National Laboratories. The uncertainty created by the administration’s proposed budget is leaving NREL’s 1,700 employees, plus hundreds of contractors, interns, and visiting researchers, in limbo about their future.

In Tennessee, Oak Ridge National Laboratory eliminated 350 jobs in 2017, although it’s unclear how much of that had to do with proposed budget cuts.


Any appropriator will tell you that a president’s budget is basically meaningless, because Congress holds the purse strings. Let’s hope these appropriators continue to recognize the value of applied energy technology and hold the administration’s feet to the fire.

Energy Storage is the Policy Epicenter of Energy Innovation

A darkened Manhattan after Hurricane Sandy. Photo courtesy of David Shankbone.

Right now, the reliability and economics of the electric power grid are changing. A major player in this change might be energy storage. Utilities have always known that storing electricity is valuable, but other than building dams to hold water, it wasn’t a real option. But battery advances—some from government-funded R&D for vehicles, some from laptops and cellphones—have opened a door.

How will utilities and regulators know what to do with battery energy storage?

Once the utility industry has seen enough research, testing, safety standards, and performance assurances, it will have the confidence to adopt this new technology. Everyday experience with batteries for storing energy is not perfect—think about your cellphone’s battery. But where national interest is translated into policies or funding, the remaining gaps are getting attention.

Where is this happening?

China, India, Germany, Japan, the UK, Canada, and Australia all have dedicated policies and strategies to capture the reliability and efficiency benefits of grid energy storage.

And in the US?

The US Department of Energy has run a small and effective R&D program that leverages the funding of states (Alaska, California, New York, and Washington in particular), utilities, and private companies, relying on the expertise and staff of national labs in Idaho, New Mexico, Tennessee, and Washington. For a look at the budget of this effort, see my colleague’s blog.

Our national interest in security, quality of life and economy based on electricity, and the growth from technology innovation will all benefit from success in energy storage. However, federal funding has lagged far behind that of other countries, and our own needs.

Policy decisions supporting energy storage of many kinds, in many states. Courtesy Energy Storage Association.

To give just one example: the US has not funded the 2014 DOE recommendations for an Energy Storage Safety Strategic Plan.

Meanwhile, the states push policies

Through initial regulatory approvals or bigger plans, states actively supporting adoption of energy storage include Alaska, Arizona, California, Florida, Hawaii, Indiana, Massachusetts, New Mexico, New York, North Carolina, Oregon, Texas, Vermont, and Virginia. (There may be more, given the widespread interest in grabbing this opportunity.)

So how far have we come?

Batteries for the grid are still exceedingly rare. The first time a utility company in the US added a battery to its grid was in the late 1980s. Two demonstrations were built in 1987. One was built by the co-operative electric company based in Statesville, North Carolina, which accepted a test battery from Public Service Electric & Gas in New Jersey. The other, much larger, was installed by Southern California Edison east of Los Angeles.

For comparison, the first nuclear power plant for a utility company started running in 1957, 30 years earlier.  That plant was built from a core that was intended for a Navy ship, a reminder of the national security interest in energy technology innovation.

To shorten power blackouts, strengthen our military bases, reduce our electric bills, and reduce pollution from power plants, we need federal government program commitments that are up to the opportunity and the challenge posed by our international rivals.

Spate of Nor’easters Rips Down Wires, Sparks Calls to Do Better

Photo: Western Area Power/CC BY (Flickr)

Over the first two weeks of March, three separate storms raged their way through my home state of Massachusetts. Each triggered life-threatening emergencies and what are certain to be costly, long-lasting cleanups.

They also resulted in massive and widespread power outages.

Thundering wind, crashing trees, and roiling floodwaters led to a significant number of homes and businesses getting thrust into the cold and dark; all told, each storm resulted in the loss of power for hundreds of thousands of customers across the state.

Two weeks, and three jarring reminders of just how dependent we are on the grid—and how vulnerable that grid is to failure.

And now, another storm is barreling on down the Pike.

Let’s make sure these outages aren’t suffered in vain.

Grid’s year in review

Our electricity grid is at once an incredible modern marvel and a staggeringly vulnerable piece of critical infrastructure.

And it’s not just been Nor’easters reminding us of that.

Indeed, this past year has been something of a master class in highlighting all the many ways the natural world can bring our electricity system to its knees, from flooding, to hurricanes, to wildfires, and more.

And it’s not just weather. Last week, the US government released an alert regarding Russian government cyber activity relating to energy and other critical infrastructure. Cyber threats are real, and growing.

At the same time, our society is rapidly tipping toward wholesale dependence on an interconnected world, one entirely reliant upon uninterrupted power. Which means that as incredible as these advances have been, it’s also increasingly true that everything stops when the power goes out.

So how do we make sure the lights stay on?

Complex problem, complex solu…zzzzzz

The challenges facing the grid are many, and there’s no one clean fix to solve them all. Worse, there’s no way that we’ll ever stop all power outages from occurring. Which all too often means that lawmakers and regulators find it’s easier to simply leave the problem alone.

But as the hundreds of thousands of homes and businesses across Massachusetts that lost power can attest, willful ignorance in the face of complex problems is an entirely unacceptable solution.

Here, a quick consideration of the multiple parallel vulnerabilities that exist along each part of the power system—vulnerabilities that must be overcome to make the grid more reliable and resilient throughout:

  • Generation. There’s no electricity without a power source, which means that ensuring our power plants can keep on generating is of first-order importance when trying to maintain our power supply. Threats facing generators are many and varied, though they rarely result in customer outages: rising seas overtaking coastal sites, warming waters and droughts decreasing the reliability of thermal generators like coal and nuclear plants, an onslaught of cyber attacks looming over complex generator controls, and dependence on a system that’s predominantly reliant upon large central generators rather than a multitude of decentralized sources.
  • Transmission. To get electricity from power plants to end users, our system relies upon the transmission grid, a complex network of high-voltage wires that help convey electricity long distances. If these lines go down—whether from trees, fires, extreme weather, or mismanaged operational controls—large disruptions can result.
  • Distribution. Transmission lines bring electricity the majority of the way, but it’s the distribution system that actually delivers power to the end user. And here’s where so often the outages occur. From falling branches to floodwaters to squirrels and more, threats to the distribution network are many and varied.
  • Operations. Improving operations is perhaps most important of all, as it’s a near certainty that the power will go out. If utilities don’t have a plan to return power to the system as expeditiously as possible—while minding the particular and compounded threats facing vulnerable populations and critical services—power failures can quickly cascade into far worse disasters. An operations plan that is centered on system resilience enables rapid bounce-back in the face of inevitable blackouts.

One of the things that makes boosting grid resilience and reliability so challenging is that different areas of vulnerability require different solutions. Many, though, are rooted in the fundamental principle that strength flows through diversity, from generator sizes and types to network pathways and redundancies.

Some solutions are straightforward, like tree-trimming to keep snow-laden branches off power lines, and flood planning to keep critical assets out of the water. Others are more complex, like developing renewables-based microgrids to ensure critical services and vulnerable populations stay powered even if the broader grid goes down.

But the two things all solutions absolutely must have? A commitment to forward-looking perspectives, where climate impacts are considered over the full lifetime of infrastructure investments, and sustained diligence to see the solutions to a complex problem through.

Embracing the slog of incremental solutions

Slowly, we’re seeing the power of painful repetition to eventually, eventually activate the search for solutions.

In the aftermath of Superstorm Sandy, Massachusetts developed a $40 million initiative to bolster community electricity resilience projects served by clean energy technologies, alongside a series of additional resilience-supportive programs. Utilities have also been improving operations response plans reflecting learning from storms past. And finally, the state is also working to drive down its carbon emissions, staving off the worst of climate impacts, through clean energy commitments large and small.

And now, with shovels still scraping away at the mess of the last storm, Massachusetts Governor Charlie Baker has filed legislation that would authorize over $1.4 billion in investments to make the Commonwealth more resilient in the face of climate impacts.

It would be another critical step in the right direction. But still, it cannot be the last step.

We need to make sure not only that such a conversation keeps progressing in Massachusetts, but also that it takes place all across the country. As these Nor’easters have shown, there’s work to do to meet the challenges of today, let alone the rapidly evolving threats of tomorrow.

For long-lived infrastructure upon which we all so heavily rely, we need a system ready and able to face conditions now and in the future. And what’s more, we need leaders who are ready, willing, and able to do the hard work of steadily chipping away at solving a complex problem.


Automakers Turn to Climate Deniers in Quest to Lower Fuel Economy Regulations

Last month, the Alliance of Automobile Manufacturers submitted a report to the National Highway Traffic Safety Administration/Department of Transportation calling into question the impacts of climate change and tailpipe pollutants in an effort to undercut the need for fuel economy regulation.  The Alliance is the trade group for Chrysler, Ford, General Motors, and Toyota, among others.  The report funded by the Alliance was written by industry shills with ties to the Heartland Institute and General Motors, and it flies in the face of claims by automakers like Ford and Toyota that they are taking climate change seriously.

Taking a page straight out of the Disinformation Playbook

The group the Alliance funded to put together the report has a long history of working against environmental regulations—that’s pretty much their schtick.  Past clients include the American Petroleum Institute, the American Coal Council, the U.S. Chamber of Commerce, Monsanto, the American Enterprise Institute, and, of course, the Alliance.

The report follows a familiar pattern, generally calling into question the science behind the health impacts of [insert pollutant here], frequently based on a convoluted and biased modeling effort masquerading as science.

If you’re familiar with the Disinformation Playbook, then what’s in the Alliance’s paid-for report will sound familiar:

Automakers are running plays straight out of the Disinformation Playbook in order to try to weaken consumer and environmental protections.

  • “The Fake”—The papers cited to support weakening environmental protections are often paid for by industry and/or published in journals with weak peer-review standards and disclosure policies. For example, the Alliance report cites studies by Tony Cox which were directly funded by the American Petroleum Institute in order to cast doubt on the proven health impacts of soot.  Furthermore, the journals in which the articles were published are known homes for questionable industry-funded research, such as Regulatory Toxicology and Pharmacology.
  • “The Fix”—Two authors cited by the Alliance (Stanley Young and Tony Cox) are now in advisory roles for the EPA as part of the administration’s move toward soliciting their advice from industry-funded scientists. And the Alliance already has strong support within the Department of Transportation for its pitch—Deputy Secretary Jeff Rosen defended GM in liability litigation and fought the EPA’s regulation of carbon dioxide while at the Office of Management and Budget, and his previous employer (Kirkland and Ellis) was employed at least twice by the Alliance in suits to prevent California from regulating global warming emissions from vehicles.
  • “The Diversion”—Rather than summarizing the most recent body of research on climate impacts, as would be done by anyone genuinely interested in ensuring policy is based on the best science, the report cherry-picks studies to weaken the case for acting on climate and reducing emissions from vehicles, either by selecting outliers or misconstruing the findings of the research. For example, the Alliance selected “key points” from a paper on drought variation that seem to diminish the role of climate change on drought and flood, ignoring the paper’s other findings related to increasing temperatures and a “substantial intensification of the global hydrologic cycle [that] is likely in a warming world.” More recent studies citing this work build upon these ignored findings based on the most current data and find evidence for the increasing role of this temperature trend, including work by the authors of the cited study.

History repeating itself

Part of me is not surprised that the automakers have adopted this strategy to mislead on the science, because it’s a tactic they’ve used time and time again, as I outlined in the UCS report Time for a U-Turn:

These tactics are par for the course for the Alliance – our report Time for a U-Turn details more than six decades of industry interference with state and federal safeguards, encompassing all facets of protections: fuel economy, safety, and air quality.

  • 1950s: Automakers denied that smog was a problem and colluded to delay deployment of pollution controls in an effort to forestall regulation.  They also ran the same play on seatbelts, claiming that “nobody knows” if they save lives despite a decade of definitive research.
  • 1980s: Automakers claimed that there would be no health benefits from stronger pollution standards under the Clean Air Act.  This delay, of course, eventually led to state action and amendments to the Act in 1990 because of the nationwide problem with smog that finally even Congress couldn’t ignore.
  • 1990s: The lead voice for the automakers on revisions to “soot and smog” air quality requirements claimed that a temporary 20 to 30 percent reduction in lung function wasn’t a health effect.  It was also in the 1990s that automakers began their climate skepticism, claiming that climate models were too uncertain to act upon.  Chrysler CEO Robert Eaton even penned a Washington Post op-ed opposing ratification of the Kyoto Protocol, claiming action on climate was “unwise and unnecessary.”

In our report, we called for the industry to make a U-turn on its behavior. But the sad fact of the matter is that the auto industry does not seem to want to change, and this most recent submission to the government shows just how mired it remains in its questionable tactics of old.

It’s time for automakers to lead

The vehicle efficiency standards set back in 2012 were the result of support from the automakers, who worked closely with regulators to design the regulations.  For an industry with a decades-long history of fighting regulation, this about-face was the result of a harsh confrontation with reality.

Just before the Great Recession, the Detroit Three had fallen on hard times as a result of neglecting to invest in the efficiency of their passenger car fleet.  When gas prices rose, the companies’ sales and profits dropped like a rock, requiring massive loans.  In this context, the industry seemed to finally recognize the value in industry-wide efficiency standards that hadn’t been raised in decades.

Today, however, that leadership is nowhere to be found.  Automakers are urging the current administration to weaken standards, and Pruitt’s EPA and Chao’s DOT seem dead set on doing exactly that.  While suppliers recognize the harm that pulling back on these standards will do to the industry, automakers are stuck in the same mindset that cost the country, and our environment, so deeply in the past.

Now, not only are automakers trying to weaken the standards—they are calling into question the need for regulations at all.  NHTSA would do well to ignore this rubbish in order to make sure its decisions are based on the best available science. But if automakers like Ford and Toyota truly think that climate change needs to be addressed, then it is incumbent upon them to keep their trade association from putting out this kind of anti-science drivel.

Automakers need to stand up for science and call out this nonsense, or they stand complicit on the side of rhetoric and lies in weakening our environmental protections just to pad their profits.


Unseasonably Warm Arctic Winter is Thawing Alaska and May Be Linked to Nor’easters

Winter just isn’t the same these days in the North Pole region.  At the time of year when we expect to see the maximum Arctic sea ice area for the winter season, all indications are that the 2018 winter maximum will rank among the lowest on record (#1: 2017, #2: 2015, #3: 2016). Nick Mailloux calculated that the historical average maximum Arctic sea ice area is slightly larger than twice the size of the contiguous United States, but in 2017 the area lost was greater than California, Nevada, Utah, and Arizona combined. The northernmost city in the U.S. – Utqiaġvik (formerly called Barrow), Alaska – had record warm temperatures this winter.  Perhaps the biggest shocker, though, was that the North Pole went above freezing this winter, 20 to 30 degrees Celsius (36 to 54 degrees Fahrenheit) above average.  Records are being broken across the Arctic in the winter of 2017/2018, consistent with a larger trend.
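A note on the unit conversion above: temperature anomalies (departures from average) convert between Celsius and Fahrenheit using only the 9/5 scale factor, because the 32-degree offset in the usual formula cancels when two temperatures are subtracted. A one-function sketch:

```python
# Temperature *anomalies* are differences between two temperatures, so the
# 32-degree offset in the usual F = C * 9/5 + 32 formula cancels out.
def anomaly_c_to_f(delta_c):
    return delta_c * 9 / 5

print(anomaly_c_to_f(20), anomaly_c_to_f(30))  # 36.0 54.0 -- the range quoted above
```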

Zonal mean 1880-2017 temperature change

Figure 1: The Arctic zone is warming at more than twice the rate of the global average. Source: NASA GISS

A warmer Arctic is part of a trend since 1990 that scientists refer to as “Arctic Amplification”: an amplified regional response to global warming.  The red colors in the NASA GISS plot of zonal average temperature change over time indicate that the Arctic zone has warmed at more than twice the rate of the global average temperature rise (see Figure 1).  Two new studies add to the mounting evidence of correlations between Arctic Amplification and the changing severity of weather patterns in North America and Eurasia.  Time to check in on the implications of these studies as we witness the extraordinary string of Nor’easters pounding New England this winter season.

Warmer Arctic correlated with more frequent cold outbreaks elsewhere in winter

The Arctic used to be so cold during the winter that it was as if a fence surrounded it, keeping the coldest air trapped within the North Pole region.  A weak spot in the fence could, on rare occasions in the past, let cold air break out southward into the continental U.S. or Eurasia – and these weak spots are becoming more frequent.  The fence can also break the other way, letting warmer air from the south penetrate farther north.  Hence the many records mentioned above are being broken across the Arctic this winter while parts of the continental U.S. and Eurasia log colder than expected temperatures. This “fence” in the high atmosphere is the stratospheric polar vortex.  Unlike a fence that takes time to repair and marks a fixed boundary, the stratospheric polar vortex is dynamic: its position changes over time, and it interacts with other dynamic parts of the atmosphere and ocean.

Science suggests the behavior of the polar vortex is changing with a warming climate, tipping the scales in both directions. During any given winter, parts of mid-latitude North America and Eurasia experience periods of cold and warm weather, but in recent years the differences between them have typically grown: the cold spells are colder and the warm periods are warmer.

Building upon many studies over recent years, Kretschmer and colleagues found that when the stratospheric polar vortex was weak (when the fence breaks), northern Eurasia and Canada were colder than normal. A strong stratospheric polar vortex (when the fence holds) was associated with warmer temperatures in northern Eurasia and the eastern United States and colder temperatures over Alaska and Greenland. Their study, published in January, advances our understanding of a winter (January through February) pattern observed from 1990 through 2015. Furthermore, the researchers suggest that seasonal forecasts could be greatly improved by paying attention to the conditions observed before a weakened polar vortex event occurs in the winter.

What does this mean for the 2018 Nor’easter season?

[Figure: Land and ocean temperature departures, January 2018. Source: NOAA]

Figure 2: A strong 2017/2018 winter and spring Nor’easter season for New England (i.e., colder-than-normal eastern U.S. surface temperatures and warmer-than-normal ocean waters off the U.S. east coast).

A recent study by Cohen, Pfeiffer and Francis found that the period of Arctic Amplification (1990-2015) correlates with an increased occurrence of heavy snowfall in the northeastern US, but a decreased occurrence in the western US, compared with the preceding period (1950-1989). Research is ongoing to better understand the physical mechanisms behind these observations. For now, we can examine the wacky weather that helped create ideal conditions for the Nor’easter storms happening in rapid succession over New England this season.

Northeastern storms typically occur when there is a seasonal contrast between a relatively warm ocean and the adjacent colder land. The first condition was met: ocean temperatures off the U.S. east coast were warmer than normal this winter season (see Figure 2). The second condition was also met: the eastern U.S. in January 2018 was colder than normal for that time of year. This is not surprising, given the record-breaking sea ice loss this winter, coupled with other strong signals of Arctic warming as well as a polar vortex so weak it split in two.

What does this mean for Alaska and other high northern latitude regions?

Recall that when the stratospheric polar vortex is weak, not only can cold outbreaks penetrate further south, but warm air from the south can also penetrate further north than would have been typical in the past. These warm air incursions into the high north are part of why record warmth occurred this winter season in northern Alaska, northern Greenland, and the North Pole (remarkably, above freezing). Alaska had its fourth warmest December through February on record (Figure 3). My colleague, Tosin Fadeyi, pointed to some grave consequences for Alaskans grappling with warmer-than-expected conditions. Alaskans who depend heavily on subsistence hunting face severe challenges from the lack of ice and mild winter weather. There have been so many holes in the ice highway along the Kuskokwim River this winter that the community ran out of reflective tape to mark the dangers. At the end of January a family fell into a hole in the Kuskokwim River ice highway; five survived and, tragically, one perished. This warm winter season was preceded by an unseasonably warm autumn. The northernmost town of Utqiaġvik (formerly Barrow) historically depended on sea ice to protect it from autumn storms, since sea ice dampens the ability of a storm to generate waves; little or no ice leaves the coastal community exposed. Utqiaġvik sustained around $10 million in damage from a storm at the end of September.

[Figure: Alaska temperature time series]

Figure 3: Alaska is warming up. Source: Alaska Climate Research Center, UAF

Families in New England and Alaska Bearing the Costs of this Winter Season

Perhaps the New England residents confronting a string of Nor’easters have more in common with Alaska residents than it appears at first glance. Some have compared these Arctic cold outbreaks to a freezer door being left open for a while: frigid air escapes and chills the kitchen while warm air from the room fills the freezer. More than two million in 13 northeastern states suffered power loss after just one nor’easter this season; in Massachusetts, three times in two weeks a storm left hundreds of thousands of homes and businesses without power.

Families suffering multiple days without power may have to throw out an entire freezer- and refrigerator-load of food and may need to seek warm shelter. Alaska residents who don’t want to risk traveling on ice highways with dangerous holes may not reach traditional hunting grounds to feed their families and communities. Families in both Alaska and the eastern U.S. are now bearing the costs of disruptions to activities that were once well adapted to the seasons of the past. Now ice highways can have dangerous holes, and the infrastructure supplying power to homes may not be up to the task of withstanding a string of intense storms.

More and more communities are taking note, discussing tradeoffs, and finding creative solutions for more resilient communities, for the sake of our families, friends, and neighbors.


Sources: NASA GISS; NOAA NCEI; Alaska Climate Research Center, Geophysical Institute, University of Alaska Fairbanks

Rep. Lamar Smith Misunderstands Science

Photo: Ryan J. Reilly

Congressman Lamar Smith (R-TX) states that as Chairman of the House Committee on Science, Space, and Technology he seeks facts about climate change, and that his Committee follows “the scientific method.” These are welcome and vitally important positions for a powerful Congressman to take on a topic of such vital national interest. It is essential that scientific evidence be the foundation for legislative action about climate change.

Unfortunately, in his article Mr. Smith does not seek facts or apply the scientific method. Instead, he makes claims that are contrary to established facts, and provides no evidence or analysis to support his assertions as the scientific method requires.

For example, Smith claims the “United Nations Intergovernmental Panel on Climate Change has affirmed that they have ‘low confidence’ in climate change contributing to extreme weather.” Actually, the IPCC stated that “a changing climate leads to changes in the frequency, intensity, spatial extent, duration and timing of extreme weather and climate events, and can result in unprecedented extreme weather and climate events.”

Smith notes that “U.S. wildland fires are decreasing in frequency,” which is a trend under investigation using the scientific method of proposing and testing alternate hypotheses. For example, the decrease could represent an impact of a change in climate, but it could easily be the result of successful fire prevention strategies. But Mr. Smith does not consider alternate hypotheses as the scientific method requires. Instead, he concludes that reduced fire frequency proves climate change does not increase the frequency of such extreme events.

Meanwhile, Mr. Smith completely ignores the data demonstrating that the total acreage burned in wildfires is going up, the fact that constitutes the major threat to people around the world. The most recent assessment for the US states, “The incidence of large forest fires in the western United States and Alaska has increased since the early 1980s and is projected to further increase in those regions as the climate changes, with profound changes to regional ecosystems.” Ignoring data is a luxury that only politicians can indulge in, as any scientist who does won’t get manuscripts through peer review.

This is reminiscent of when Mr. Smith accused NOAA scientists of “altering the data,” calling their published scientific analyses of atmospheric temperatures “skewed and biased,” a claim made with no accompanying analysis or evidence. In fact, NOAA scientists were using the scientific method to identify the bias that exists in temperature measuring instruments and making their data more accurate by taking this bias into account. We all apply this same process when we compare the results of different bathroom scales, timepieces, meat thermometers, or fuel gauges in cars. This is an example of Mr. Smith practicing intimidation, not science, as noted by the American Meteorological Society.

Smith claims in his article that he is called a “climate denier” because he “questions assertions,” which again demonstrates a misunderstanding of the scientific method. The reason Mr. Smith is called a “climate denier” is because he questions scientific conclusions without providing an alternate explanation for existing observations. A true skeptic would propose an alternative, testable hypothesis for observations.

For example, our release of greenhouse gases has raised the average temperature of the ocean by a little over half a degree Fahrenheit, which represents an amount of energy (10²³ joules) that is 10 billion times the amount of energy (10¹³ joules) released by the atomic bomb that destroyed Hiroshima.
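As a quick arithmetic check on the comparison above, using the text’s round figures of 10²³ and 10¹³ joules:

```python
# Round figures from the text: ocean heat gain vs. Hiroshima bomb energy.
ocean_heat_gain_joules = 1e23  # energy added to the ocean (cited above)
hiroshima_bomb_joules = 1e13   # approximate energy released by the bomb (cited above)

ratio = ocean_heat_gain_joules / hiroshima_bomb_joules
print(f"{ratio:.0e}")  # 1e+10, i.e. ten billion times the bomb's energy
```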

A true skeptic of global warming must propose an alternative explanation for how all of this energy has accumulated if greenhouse gases are not responsible. Skeptics have suggested changes in the sun’s energy output as an alternate explanation, but it is now well established that the sun’s energy output has actually been declining over the last few decades. In fact, the brightness of the Sun is at a record low right now.  True skeptics would also propose an explanation for why greenhouse gases are not causing the earth to heat up, given that we know these gases trap heat.

Mr. Smith cannot offer such explanations because they do not exist. Instead, as we see from the examples above, he attempts to misrepresent or ignore existing evidence. This is deeply unfortunate given Mr. Smith’s position in Congress. Those who seek nonpartisan, evidence-based policies to address the impacts of climate change must demand that their representatives base their positions on facts supported by the scientific method.

While Smith states that “climate alarmists just won’t let the facts get in the way of their science fiction,” an analysis of his own claims demonstrates that the fiction is being propagated by Mr. Smith.


Andrew Gunther is executive director of the Center for Ecosystem Management and Restoration, and a board member at the Union of Concerned Scientists. He has published research in the field of ecotoxicology and has extensive experience in applying science to the development of air, water, and endangered species policy. Dr. Gunther served as the assistant chief scientist for the Exxon Valdez Oil Spill Restoration Program from 1991 to 2002, and is currently the executive coordinator of the Bay Area Ecosystems Climate Change Consortium.

Black Lung, Abandoned Mines, Struggling Communities—And No Leadership

A miner waiting for a black lung screening. Photo: National Institute for Occupational Safety and Health

My grandfather was the son of Italian immigrants—many of whom settled in north central West Virginia to work in the coal mines. He worked hard his whole life and built a better life for himself and our family. According to family legend, he famously told my grandmother early in their courtship, “Stick with me, and you’ll wear diamonds.” She did.

My grandfather died of black lung disease in 1988.

Thirty years later, no other families should have to go through what mine and so many others have. And yet today the disease is making a strong and frightening resurgence. How is black lung related to economic development and mine reclamation? It turns out Congress has an opportunity to address all three by passing the RECLAIM Act—but only if leaders don’t take their eyes off the ball.

What is black lung disease?

The effect of black lung disease. Photo: LeRoy Woodson, Wikimedia

Coal worker’s pneumoconiosis—known as black lung disease or simply black lung—results from long-term exposure to coal dust. The small particles build up in the lungs over time, since the body can’t expel them, leading to inflammation, fibrosis (the buildup of excess connective tissue), and in the worst-case scenario, necrosis (cellular death). Black lung is similar to other forms of lung disease caused by exposure to silica dust. The early stage of the disease is called simple black lung, while the later and more debilitating stage is called complicated black lung.

As the disease progresses, it quite literally becomes harder and harder to breathe. The disease has no cure (short of a lung transplant, only available for miners healthy enough to qualify).

Black lung is, however, entirely preventable—simply by avoiding inhalation of coal dust.

The Obama administration developed stricter rules to lower the level of respirable dust, and to protect coal miners from the health dangers of exposure. Coal companies fought the rule tooth and nail, and lost in court.

But now the current administration is “reevaluating” the rule meant to protect miners’ health as part of its anti-regulatory agenda, although the current head of the Mine Safety and Health Administration claims that the agency has “no immediate plans” to weaken the rule.

“Mining disasters get monuments. Black lung deaths get tombstones.”

In my grandfather’s time underground, the causes of black lung weren’t well understood. But now we know how to prevent the disease—by reducing exposure to coal dust.

And yet, now, near the end of the second decade of the 21st century, the disease is actually on the rise. The National Institute for Occupational Safety and Health (NIOSH) has uncovered the largest cluster of complicated black lung cases ever reported: 416 cases reported in three central Appalachian clinics from 2013 to 2017. The study followed an investigation by NPR last year that found 963 cases from 11 clinics since the beginning of the decade. In NPR’s ongoing coverage, some 2,000 cases have been documented, far more than government statistics indicate.

Even more alarming, the disease seems to be affecting younger miners, in their 50s, 40s, and even 30s. Why? Epidemiologists have linked the new wave of black lung cases to breathing in more silica dust, likely the result of a long-term shift to mining thinner seams of coal. Getting to these thinner seams requires cutting into surrounding rock—creating silica dust that is also breathed by miners.

The human cost of this disease is almost immeasurable. Have a listen to NPR’s audio report on the NIOSH study, which features several miners suffering with the disease. Local officials call this cluster a public health emergency. As one clinic director notes (also in the NPR story):

Mining disasters get monuments. Black lung deaths get tombstones. And I’ve seen many a tombstone in 28 years from black lung. And I’m seeing more now. A lot more now.

Federal support

According to the Department of Labor, 76,000 miners have died of black lung since 1968. To support miners and families when the coal company can’t be identified or is no longer in business, in 1977 Congress set up the Black Lung Disability Trust Fund, which has so far shelled out $45 billion in compensation to miners and their families. (When the coal company responsible for a miner’s disability can be identified, it often takes many years for miners to receive compensation, because the company can hire expensive lawyers and its own doctors to dispute the diagnosis, creating an endless backlog of red tape and bureaucracy.)

Payments from the trust fund go to present and former coal miners in part for medical payments arising from disability from working in the mines. The fund also provides monthly payments to disabled miners and their surviving dependents. My grandmother received those payments after my grandfather passed away.

The money for the fund comes from a per ton excise tax on coal, paid by coal companies.

That tax, though, is set to revert to its low 1977 level at the end of 2018. The Government Accountability Office is currently studying how this reduction would affect the solvency of the Trust Fund. With so many coal companies going bankrupt in the last few years, there may well be a funding crisis at hand for the Black Lung Disability Trust Fund.

Congressional action?

What does all this have to do with abandoned coal mines and economic development?

As I’ve written, the RECLAIM Act, H.R.1731, would release $1 billion over 5 years to clean up and repurpose long-abandoned coal mines. The House version would prioritize projects that spur local economic development. This would represent a win-win for coal communities suffering from the downturn in the industry. Rep. Hal Rogers (R-KY) has championed this legislation in Congress.

Even though RECLAIM would release money from the Abandoned Mine Land Fund, it still represents a payment out of the Treasury; because of budgeting rules, the legislation requires a “pay-for,” meaning new revenue or spending cuts to offset the payments. Rep. Rogers worked with his colleagues and proposed extending the coal excise tax that funds the Black Lung Disability Trust Fund at current levels for an additional 10 years.

With that change, the bill now represents a win-win-win—ensuring the continuation of much needed medical payments and compensation for miners while also cleaning up abandoned mine sites by funding projects that simultaneously spur local economic and community development.

Opposition from the usual suspects

Rogers and his supporters are working to attach the bill to the Omnibus spending bill for Fiscal Year 2018, which must be completed by March 23 to keep the government open.

Coal mining companies and their national trade association, the National Mining Association (NMA), hate the RECLAIM Act and are working hard to kill it. All of their lobbying is in the name of reducing the taxes that would pay for the mess they’ve made, in terms of both environmental destruction and human suffering.

Where’s the leadership?

US House leadership is currently putting together its omnibus spending bill for FY 2018. Is RECLAIM on their minds?

Senate Majority Leader Mitch McConnell, who hails from Kentucky, says that he supports RECLAIM, but actions speak louder than words. Will he acknowledge the broad public support for RECLAIM by his constituents?

As a first generation American, my grandfather was proud to pay his taxes. Coal companies should be too. It’s unconscionable—and sadly, unsurprising—that coal companies continue to put profits over people.
