Combined UCS Blogs

New Tax on Graduate Students Would Harm the US

UCS Blog - The Equation (text only) -

A graduate student demonstrates how her tax burden would increase by nearly $10,000 if the House version of the Tax Cuts and Jobs Act became law. Photo: Amanda Rose

On November 16, the House of Representatives and the Senate Finance committee voted to advance tax reform legislation. These bills, both of which are named the “Tax Cut and Jobs Act,” propose to disproportionately and negatively impact the middle class, threaten to leave millions of Americans without health coverage, would add as much as $1.5 trillion to the deficit, and could burden graduate students with a giant tax hike.

Many graduate students have taken to social media to demonstrate how their tax burden would change if the House version of the Tax Cuts and Jobs Act became law. This picture and calculation were made publicly available via the Facebook page of Amanda Rose, a graduate student at Columbia University in New York City, NY.

The version passed by the House of Representatives includes a new tax provision that would require students to pay tax on the value of the tuition that is waived for graduate student research and teaching assistants. Given the low pay of such positions, this would make it nearly impossible to pay cost of living expenses while attending graduate school. As a former graduate student myself, and having been a teaching and research assistant, I understand how critical every dollar of a stipend is to purchase groceries, pay rent, and maybe even take care of your own health (if you can afford it).

The Tax Cut and Jobs Act is an attack on higher education in more ways than one. It also proposes to repeal the student loan interest reduction, graduate student tuition waivers, the Hope Scholarship Credit, the Lifetime Learning Credit, and other educational assistance programs. But it isn’t just graduate students who will feel the consequences; such moves stand to affect us all.

Science is linked to economic prosperity

Investment in science is investment in our nation. Many international comparisons still place the US as a leader in applying research and innovation to improve the country’s economic performance. A prior review by the Organization for Economic Co-operation and Development (OECD) concluded that since World War II, United States leadership in science and engineering has driven its dominant strategic position, economic advantages, and quality of life. Indeed, researchers have long understood that there is a link between economic prosperity and investment in science and technology.

The leadership of the United States in science explains, in part, why the country is ranked as one of the most economically competitive nations in the world. Across a number of metrics, the United States is still the undisputed leader in basic and applied research.

Researchers in the United States lead the world in the volume of research articles published, as well as the number of times these articles are cited by others. The United States is not just producing a lot of raw science, it also is applying this research and innovation, as other metrics show.

The United States has a substantial and sustainable research program, as evidenced by the number of Ph.D. students trained; it invests heavily in research, as shown by the country’s gross domestic expenditure on research and development; and it is a leader at turning science into technology, as evidenced by the high number of patents issued.

Graduate students are critical to US science and innovation

If the production of science has helped the United States economy remain competitive, graduate students are largely to thank. They are pivotal to the production of novel science and innovation in the US, and they are also the professors, inventors, and innovators of the future that our economy depends on.

The Tax Cut and Jobs Act would make it difficult, if not impossible, for many of the brightest minds in America to enter into science, technology, engineering, and mathematics (STEM) fields, ultimately decreasing America’s international competitiveness in science and technology.

A provision in the Tax Cut and Jobs Act passed by the House of Representatives would tax graduate students on their tuition costs. This would reform the Internal Revenue Service tax code, section 115(d), which allows universities to waive the cost and taxation of tuition for graduate students who conduct research or teach undergraduate classes at approved universities.

An estimated 145,000 graduate students benefit from this reduction with 60 percent of these students in STEM fields. Thank goodness such provisions exist for tuition waivers and scholarships as even some of our senators likely wouldn’t be where they are today without this benefit in our tax code.

If graduate students were taxed on waived tuition, many who serve as research or teaching assistants would find it more difficult to cover basic living expenses with the stipend they receive. For example, a graduate student at Columbia University might receive $38,000 for a stipend and a tuition waiver for $51,000. Currently, they pay $3,726 in taxes, but that could go up to $13,413 under the House’s proposed legislation reducing their monthly take home pay for food, rent, and health from $2885 to $2078.

Some students have reported that they would see their stipends cut from $27,000 to $19,000, or from $13,000 to $8,000 for the year if the House’s tax reform bill became law. While some students may be able to depend on their families to defray the costs of these taxes, many graduate students who come from poor and middle class backgrounds could not. As the majority of Americans who come from poorer backgrounds are also minorities, this would deter diversity in higher education, where we already know it is sorely needed.

Some universities could cover tuition and the tax on that tuition for some students, but they wouldn’t be able to do it for all. Taxation of tuition waivers also would likely make the US less attractive to international students, many of whom are graduate students in STEM. Ultimately, this regressive tax legislation means fewer graduate students at universities and, therefore, decreased research in the United States.

An anti-science message is in the air

If you are surprised that graduate students are being targeted, you are not alone. Many organizations who support the higher education community have signed on to letters and published statements expressing concerns for graduate students, including the American Council on Education, the Association of Public and Land Grant Universities, the Association of American Universities, the American Association for the Advancement of Science, and the Union of Concerned Scientists.

It is unclear if a final version of a tax reform bill will include provisions that burden graduate students with enormous tax hikes. While the Senate’s version of a tax reform bill would retain many of the tax benefits for undergraduate and graduate students (including a non-taxable tuition waiver), it still includes many provisions opposed by organizations supporting higher education.

Regardless of what tax reform bill is pushed through, there is still the question of why the House targeted graduate students in the first place? Is it because they are an easy target having little representation on the hill? Is it because this would be one way to dismantle the pipeline of those pesky academics?

These are Americans who work hard to teach and produce transformative research that greatly benefits the United States economy–and they already do this for very little pay. Furthermore, the amount of money that the government would gain from these taxes has been said to be “miniscule” compared to trillions of dollars in national debt. It is absurd that graduate students are being targeted.

Speak up for all scientists now and in the future!

I’m a former graduate student and I would not have been able to afford graduate school if I had to pay tax on my graduate student tuition and certainly wouldn’t be where I am today without this benefit in our tax code. That’s why I’m speaking up for all the early career scientists now and in the future–everybody deserves the same opportunities that I had, and the United States deserves the continued prosperity that science affords it.

Call your senators today at 1-833-216-1727 and urge them to vote ‘no’ to the Tax Cut and Jobs Act.

The full Senate will vote on this bill after Thanksgiving. Learn more about the current tax reform legislation and how you can push back.

Always in “Hot Water”

UCS Blog - The Equation (text only) -

My wife likes to joke that I am always in “hot water.” It’s a play on words that reflects my career from college, at two National Laboratories and now in retirement.

America’s National Laboratories are hotbeds of scientific research directed at meeting national needs. In my case, working at two national labs helped me contribute to resolving growing issues of environmental impacts of energy technologies—thermal electric generating stations, in particular on aquatic life of rivers, lakes and coastal waters.

Getting a PhD in 1965, I was recruited by the Atomic Energy Commission’s (AEC’s) Hanford Laboratory (now the Pacific Northwest National Laboratory of the US Department of Energy) to conduct research on thermal discharges to the Columbia River from nine Hanford, Washington, plutonium-producing nuclear reactors. They were part of cold-war nuclear weapons production, but their thermal discharges were not unlike those from a power plant, just larger.

With pretty good understanding of potential water-temperature effects on aquatic organisms, our team of researchers sought effects of elevated temperatures on various salmon populations and the river’s other aquatic life. We had two main objectives: (1) to identify effects of the Hanford reactors on the river’s life, and (2) to translate our findings into criteria for safely managing thermal discharges (like the 90-degree limit for damages I found for Delaware River invertebrates).

Our Hanford research caught the attention of AEC headquarters and its Oak Ridge National Laboratory in Tennessee. There was interest in countering the public thermal pollution fears by doing research that could be applied to minimizing ecological impacts everywhere. Thus, in the fall of 1969, I was asked to leave Hanford, which I greatly enjoyed (as a Northeasterner, the Pacific Northwest was like a paid vacation!) and moved to Oak Ridge in spring of 1970.

At Oak Ridge, I put together a team to develop criteria for minimizing ecological effects of thermal effluents nation-wide.  Oak Ridge had no power plants of its own. Tennessee Valley Authority (TVA) power stations nearby were research sites, but our focus was on developing general criteria. We built a new Aquatic Ecology Laboratory with computer-controlled tank temperatures, a set of outside ponds to rear fish for experiments, hired biologists and engineers, and assembled a “navy” of boats for field work. We set to work at a fever pitch.

But then…. The Congress passed the National Environmental Policy Act (NEPA), and the AEC was handed the Calvert Cliffs decision that mandated the AEC conduct complete reviews of the environmental impacts of the nuclear power stations it licensed. In 1972, our research staff was “reprogrammed” to prepare Environmental Impact Statements on operating and planned nuclear power plants. This turned out to be a tremendous opportunity to carefully evaluate not only thermal discharges but other impacts of using cooling water. By evaluating facilities across the country, we gained the nationwide perspective we needed for our research. With the National Lab having staff from many scientific and engineering fields to assign to the assessments, we gained a hugely valuable multi-disciplinary perspective that has helped us advance beyond just biology, fish and bugs.

Many years of productive thermal-effects work followed, with satisfaction that our contributions were often followed and our data used. We saw many of our efforts resolve issues for power plant thermal discharge permitting. The National Academies used our framework for water quality criteria for temperature; EPA used them as criteria for “Balanced Indigenous Communities” in thermally affected waters and setting temperature limits. As “thermal pollution” became more resolved, the Department of Energy and our National Laboratory provided our scientists the mission and capacity to work on other issues, most notably aquatic ecological effects of hydropower, that is helping with future innovation as technologies shift.

Throughout our research and analysis, we fostered “technology transfer” to the public through educational seminars and information aid to electricity generators. ORNL sanctioned some outside, site-specific consulting. I have been fortunate in retirement (since 2005) to continue to do this, and have assisted more than 50 companies and regulatory agencies (both domestic and foreign) with thermal effects issues. I feel good that the problem-solving research and analysis and application of this knowledge outside the labs (my “hot water”) have benefited society.

Through my time at the Hanford/Pacific Northwest and Oak Ridge national labs, I’ve worked with world-class researchers and scientists in many disciplines and have worked on projects that have advanced our understanding of ecological impacts from various energy sources. We need to continue to invest in our scientists at federal laboratories of the Department of Energy. I would like to thank my fellow scientists at government labs this Thanksgiving for the work they’ve done problem solving and finding innovative solutions for the public as well as private sector.

Dr. Charles Coutant retired as distinguished research ecologist in the Environmental Sciences Division of Oak Ridge National Laboratory in 2005. Dr. Coutant received his B.A., M.S., and Ph.D. in biology (ecology) from Lehigh University.  Since retirement he has served part time as an ecological consultant to regulatory agencies and industry.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Thanksgiving Dinner Is Cheapest in Years, But Are Family Farms Paying the Price?

UCS Blog - The Equation (text only) -

Last week, the Farm Bureau released the results of its annual price survey on the cost of a typical Thanksgiving dinner. The grand total for a “feast” for 10 people, according to this year’s shoppers? About 50 dollars. ($49.87, if you want to be exact.) That includes a 16-pound turkey at $1.40 per pound, and a good number of your favorite sides: stuffing, sweet potatoes, rolls with butter, peas, cranberries, a veggie tray, pumpkin pie with whipped cream, and coffee and milk.

After adjusting for inflation, the Farm Bureau concluded that the cost of Thanksgiving dinner was at its lowest level since 2013. Let’s talk about what that means for farmers, and for all of us.

We can debate whether the Farm Bureau’s survey captures the true cost of a holiday meal for most Americans. This isn’t the world’s most technical survey—it was based on 141 volunteer shoppers at 39 grocery stores across the country purchasing these items at the best prices they could find.

But according to the USDA’s Economic Research Service, Americans do spend less than 10 percent of their disposable personal incomes on food. ERS data also shows that farmers receive just 16 cents for every dollar of food consumers purchase. (Speaking of historic lows, that’s the lowest farmer share of the food dollar in over a decade.) The rest of it is distributed throughout the food supply chain, which includes the companies that process, package, transport, and sell these foods at any number of retail outlets.

For our hypothetical holiday dinner for 10 (including leftovers), this means that in total, the farms that produced the raw foods, from potatoes to pumpkins, made about eight dollars. That’s eight dollars total across all farms, which then must pay workers’ wages and cover operating costs. These margins can work for large-scale industrial farming operations, due in part to heavy reliance on and exploitation of undocumented agricultural workers, but the math doesn’t add up for most family farms and farm workers.

And despite the savings we enjoy as consumers, the reality is that the prevailing model of food production isn’t good for any of us—least of all rural farming communities.

Midsize farms and missed opportunities

Midsize family farms, generally defined by the USDA as those with a gross cash farm income between $350,000 and $1 million, have long been key drivers of rural economies. But since 2007, more than 56,000 midsize farms have disappeared from the American landscape—a trend that has had serious consequences for rural communities across the country.

These farms employ more people per acre than large industrial farms, and when they disappear, they take both farming and community jobs with them. Midsize farms are also more likely to purchase their inputs locally, keeping more money in the local economy. Research has shown that areas containing more midsize farms have lower unemployment rates, higher average household incomes, and greater socioeconomic stability than areas having larger farms.

Beyond their impact on local economies, midsize family-owned farms are more likely than large industrial farms to use more environmentally sustainable practices such as crop rotation and integrated livestock management, resulting in greater crop diversity. This, too, may have health implications: in a country in which about half of all available vegetables and legumes are either tomatoes or potatoes, with lettuce bringing home the bronze, it stands to reason that greater diversity in our food supply can only be a good thing.

So if midsize farms are so great… why are they disappearing, and what can we do to reverse the trend and revitalize rural farming communities?

 US Department of Agriculture/public domain (BY CC0)

The Local Food and Regional Market Supply (Local FARMS) Act

Representatives Chellie Pingree (D-ME), Jeff Fortenberry (R-NE), and Sean Maloney (D-NY) and Senator Sherrod Brown (D-OH) recently offered their answer with a set of proposed policies and programs they want included in the 2018 farm bill. The Local Food and Regional Market Supply (Local FARMS) Act of 2017 would make new investments in local and regional food systems, helping small and midsize farmers connect with more consumers. It would ease the way for institutions like schools to purchase locally produced food, and would make fresh, healthy foods more accessible and affordable for low-income families.

In short, the Local FARMS Act is a win-win for farmers and eaters.

Leveraging consumer demand for local and regional foods and the substantial economic opportunity provided to midsize farmers by institutional food purchasers, this bill shortens the distance between producer and consumer. That ensures that a greater share of the food dollar ends up in farmers’ pockets—and that more fresh, healthy foods get to the people that need them.

Some of the key programs and provisions include:

  • The new Agricultural Market Development Program, which streamlines and consolidates local food programs to provide a coordinated approach to strengthen regional food supply chains. This program includes:
  • A Food Safety Certification Cost-share Program that allows farmers to share the cost of obtaining food safety certifications, which are required by many institutional purchasers but often prove cost-prohibitive for small and midsize producers—many of whom already have good food safety practices in place.
  • An amendment to the Richard B. Russell National School Lunch Act that allows schools to use locale as a product specification when soliciting bids, making it easier to procure local foods.
  • A Harvesting Health Pilot authorizing a pilot produce prescription program that would enable healthcare providers to offer nutrition education and fresh fruit and vegetable coupons to low-income patients.

By providing the infrastructure and support needed to bridge critical gaps between local producers and consumers, the proposed policies and programs contained in the Local FARMS Act lay the groundwork for stronger regional food systems, more vibrant local economies, and a healthier food supply.

Let’s give thanks and get to it

Whatever table you might gather around this Thursday, in whosever company you might enjoy, save some gratitude for the folks who put the food on your plate. And when you’re done enjoying your meal, let’s get to work take a nap. And when you’re done taking a nap, let’s get to work. If we want a financially viable alternative to industrial food production systems, it’s up to all of us to use our voices, our votes, and our dollars to start investing in one.

Stay tuned for action alerts from UCS on how you can help strengthen our regional food systems and support our local farmers through the 2018 farm bill. For updates, urgent actions, and breaking news at your fingertips, use your cell phone to text “food justice” to 662266.

Climate change is here. Can California’s infrastructure handle it?

UCS Blog - The Equation (text only) -

Wildfires across the West threaten critical infrastructure. Photo: Tim Williams. CC-BY-2.0 (Wikimedia)

This has been a year of extremes in California. We’ve experienced all-time temperature highs (statewide and regionally), a deadly heat wave, the most destructive and lethal wildfires in the state’s history, and the second wettest winter on record following a historic five year drought. The impacts have been staggering: many lives lost, thousands of properties destroyed, and costly infrastructure damage.

We know that extreme weather events will become more common and intense as a result of climate change. Such events multiply threats to infrastructure across the state, endangering community well-being, public health and safety, and the economy. A new white paper released by UCS today – Built to Last: Challenges and Opportunities for Climate-Smart Infrastructure in California –makes the case for investing limited public resources in infrastructure that can withstand climate change impacts and keep Californians safe.

A better path forward

Extreme weather-related infrastructure disruptions in recent years – from power losses and train derailments to bridge and spillway failures, road closures, and low water supplies – provide us with a sobering preview of the future challenges facing California’s infrastructure systems. (See this map for other recent examples.) The type, frequency, and severity of these climate-related hazards will vary by location, but no region of California or infrastructure type will be left untouched.

While the state of our dams, pipes, levees, bridges, and roads is mediocre at best (they received a combined C- on ASCE’s 2012 report card), the need to upgrade or replace our water, power, and transportation systems is a golden opportunity to plan, design, and build these systems with climate resilience in mind. The UCS white paper describes a set of principles for ‘climate-smart’ infrastructure and then highlights barriers and opportunities for improving and accelerating their integration into public infrastructure decisions.

What is climate-smart infrastructure?

Climate-smart infrastructure is designed and built with future climate projections in mind, rather than relying on historic data that are no longer a good predictor of our climate future. It bolsters the resilience of the Golden State’s communities and economy to the impacts of extreme weather and climate change instead of leaving communities high and dry, overheated, or underwater.

A microgrid is providing efficient, reliable, cleaner power for Blue Lake Rancheria government offices, buildings, and other critical infrastructure, such as an American Red Cross disaster shelter. It will also create local jobs and bring energy cost savings. Photo: Blue Lake Rancheria

Climate-smart also can reduce heat-trapping emissions, spend limited public funds wisely, and prioritize equitable infrastructure decisions. This last point is important because some communities in California are more vulnerable to both climate impacts and infrastructure failure due in part to decades of underinvestment and disinvestment, especially in many low-income communities, communities of color, and tribal communities.

When done right, the results can be innovative infrastructure solutions, like the Blue Lake Rancheria microgrid, that bring social, economic, health, and environmental benefits to Californians AND protect us from the weather extremes we are inevitably facing.  More examples of climate-smart principles in action are described in the white paper, and some are shown in the accompanying StoryMap.

We’re just getting started

The Golden State is beginning to integrate climate change into its plans and investments and recently released high-level guidance for state agencies. These and other efforts underway at the state level must be accelerated and implemented in a consistent and analytically rigorous, climate-smart manner.

This is especially important in light of the billions of taxpayer dollars the state is planning on spending on new long-lived infrastructure projects. Many more billions will be spent on maintenance and retrofitting of existing infrastructure over the next few years. These projects must be able to function reliably and safely despite worsening climate impacts over the coming decades. Otherwise, we risk building costly systems that will fail well before their intended lifespans.

Barriers can be overcome

There are still many reasons why public infrastructure is not being upgraded or built today in a more consistently climate-smart way. They generally fall into three categories: (1) inadequate data, tools, and standards; (2) insufficient financial and economic assessments and investments; and (3) institutional capacity and good governance are lacking.

For example, many engineers, planners, and other practitioners still don’t have enough readily usable information to easily insert climate impacts into their existing decision-making processes and economic analyses. In addition, there has not been enough attention focused on the unique risks and infrastructure vulnerabilities faced by low-income communities, communities of color, and other underserved communities.

The UCS white paper includes several recommendations on how to overcome the barriers we identified. They focus on ways to improve and accelerate the integration of our climate-smart principles into public sector infrastructure decisions. For instance, they range from increasing state and local government staff’s technical capacity and updating standards and codes to better incorporating climate-related costs and criteria, as well as climate resilience benefits, into project evaluations and funding decisions. Others include better planning in advance for more climate-smart disaster recovery efforts, ensuring better interjurisdictional coordination at the local and state government levels, and addressing the funding gap. Additional recommendations and specifics can be found in the paper. All infrastructure solutions should help advance more equitable outcomes, so equity is integrated throughout these recommendations

Building to last? There’s reason for optimism

Progress is being made, as evidenced by the recent state actions mentioned above and a growing number of climate-smart projects and local solutions. For example, Los Angeles has begun a process to update its building codes, policies, and procedures, called Building Forward L.A. San Francisco is incorporating sea level rise into its capital planning. Plus, there’s an ever-expanding list of novel funding mechanisms for these types of infrastructure investment. But we need more, and soon, to help inform the tough decisions ahead as we adapt to climate change and invest in long-lived infrastructure projects. Thoughtful implementation of our recommendations can help clear the way.

California governments should grab hold of the opportunities before them to spend limited resources in climate-smart ways that increase our infrastructure’s ability to provide California’s communities and businesses with the needed services to thrive now and in a changing climate future.

Coal-burning Dynergy Wants a Handout. Will Illinois Give It to Them?

UCS Blog - The Equation (text only) -

Photo: justice.gov

Last week marked the end of the Illinois General Assembly’s 2017 veto session. Fortunately, Dynegy failed in its latest attempt to have the legislature bail out several of its coal plants in central and southern Illinois at the expense of local ratepayers.

But the fight isn’t over. Dynegy has been relentless in their efforts to force the public to pay for keeping their aging, polluting, and uneconomic coal power facilities open. Here are some pathways they are pursuing and why it’s important to stop them.

The legislature

Dynegy, a Texas-based company that owns eight coal plants in central and southern Illinois, introduced legislation (SB 2250/HB 4141) that would grant them a bailout for their uneconomic Illinois plants, while ratepayers foot the bill. These plants were built several decades ago: the bill would allow Dynegy to continue to emit harmful pollutants for years to come.

Last year alone, Dynegy’s Illinois plants emitted more than 32 million tons of heat-trapping carbon dioxide.

Dynegy claims that their Illinois coal plants are not being fairly treated in the current wholesale power market and if forced to close they would take hundreds of jobs with them. The proposed legislation would create a capacity-pricing system for Central and Southern Illinois, run by the Illinois Power Agency. Such a system would expectantly produce higher capacity prices, like those in Northern Illinois, and put more money into Dynegy’s coffers. Meanwhile, the higher capacity prices would be passed onto ratepayers.

Yet, Dynegy’s argument that immediate action is needed is unjustified. Ameren Illinois—the local power provider that purchases and delivers generation from Dynegy’s coal plants to customers—does not believe this is a resource adequacy issue in the short-term. And we agree. In 2016 the Illinois Clean Jobs Coalition (of which UCS is a member) worked tirelessly to pass a long-term vision for the state’s energy future with the passage of the Future Energy Jobs Act, which increases energy efficiency and renewable energy development in the state.

Prolonging the life of uneconomic and dirty coal plants would derail this clean energy future.

This bill got lots of push back at last week’s hearing. The opposition’s testimony noted that an immediate threat to grid reliability does not exist and passing the legislation would put a financial burden on Ameren Illinois ratepayers. It’s estimated that the proposal could raise Ameren Illinois customer’s electric bills upwards of $115 a year.

Avenue 2: the Pollution Control Board

In addition to its legislative efforts, Dynegy has been working with the Illinois EPA to rewrite the Illinois Multi-Pollutant Standard, which is a 2006 clean air standard for coal plants. The proposed changes to the rule would create annual caps on tons of sulfur dioxide and nitrogen oxide emitted by the entire coal fleet rather than on individual power plants. If approved, the new limit on sulfur dioxide would be nearly double what Dynegy emitted last year and the cap on nitrogen oxide emissions would be 79 percent higher than in 2016.

This proposal would allow Dynegy to close newer plants and run older and dirtier plants harder. Meanwhile, Illinois communities will get increased air pollution, and some will still be faced with job losses.

Not just an Illinois issue

While some blame environmental regulations for the ailing coal industry, a recent report from the Trump administration’s Department of Energy confirms the major primary reasons coal plants nationwide have been faced with economic woes are  low natural gas prices and flat electricity demand. Struggling coal plants aren’t just an Illinois issue. The role of coal in the electricity sector is on the decline nationwide, while the increase of wind and solar presents opportunities for communities, businesses, and policymakers.

Our recent report A Dwindling Role for Coal: Tracking the Electricity Sector Transition and What It Means for the Nation examines the historic transition of the US electricity sector away from coal and towards cheaper, cleaner sources of energy. Since 2008, more than one-fifth of US coal generation has either retired or converted to different fuels, with significant benefits to public health and the climate. This transition has reshaped the power sector and will continue to do so.

What’s next

It’s expected Dynegy will be back in 2018 with similar legislation. And the Illinois Pollution Control Board hearings will be held on January 17 in Peoria and March 6 in Edwardsville.

Recently, a third pathway for Dynegy has surfaced, a stakeholder process that will kick off at the end of the  month to discuss the potential policy opportunities that are laid out in a report requested by Governor Rauner and written by the Illinois Commerce Commission. The white paper addressed current questions about resource adequacy in central and southern Illinois.

Speak up!

Tell Governor Rauner, and your state legislators, to oppose a Dynegy bailout that would prolong the life of uneconomic coal plants in the state, and would have negative public health impacts for Illinois residents. Illinois needs to transition away from old, dirty, and costly fossil fuels, and continue to increase development of renewable energy and energy efficiency in the state.

Photo: justice.gov

Hey Congress! Here’s Why You Can’t Scrap The Electric Vehicle Tax Credit

UCS Blog - The Equation (text only) -

The fate of the federal tax credit for electric vehicles hangs in the balance. The House version of the GOP-led tax plan removes it entirely while the Senate version (as of Friday, November 17th) keeps it on the books. As lawmakers work to combine the House-passed bill with the Senate version, let’s examine why the EV tax credit shouldn’t be eliminated.

What is the federal tax credit for electric vehicles?

Section 30d of the tax code gives electric vehicle buyers up to $7,500 off their tax bill – or allows leasing companies to receive the credit and lease EVs for lower rates.

The credit is scheduled to phase out for each automaker that surpasses 200,000 EV sales. Some of the early entrants to the EV scene, like Tesla, General Motors and Nissan, are forecast to hit the 200,000 limit in 2018, while others, like BMW, Volkswagen, and Ford, are relying on the federal tax credit to offset the price of EVs that are set to hit dealerships in the next couple years.

What has America gotten for investing in EVs?

The EV tax credit has stimulated a market for vehicles that are cheaper to drive, pollute half as much, and offer a simply better driving experience compared to gas-powered vehicles. If you think that automakers would have produced EVs without the prompting of state and federal policy, may I remind you that automakers fought tooth and nail against seatbelts and air bags, improving fuel efficiency, and pretty much every other vehicle-related regulation that has ultimately benefitted public health and safety. Consumers deserve the opportunity to choose clean vehicles, and the federal tax credit has made that choice easier to make by offsetting the upfront cost of EVs that is often higher than comparable gas vehicles.

The tax credit has also spurred domestic automakers to get in on the EV game. American companies like General Motors and Tesla sell EVs in all 50 states, and are competing with foreign auto giants to become the global leader in EV sales. At a time when EV demand is poised to skyrocket in other countries, eliminating the federal credit will hamper domestic automaker efforts to both sell EVs on their own turf and maintain their global competitiveness.

Federal support for EVs won’t be needed forever

As I’ve previously discussed, the federal tax credit is the most important federal policy supporting the EV market, but won’t be needed forever. Battery costs are forecast to continue their decline, with some projections showing EVs becoming price competitive with gasoline-fueled vehicles in the mid-2020’s. By making EVs cost competitive today, the federal tax credit has helped EVs gain a fingerhold in a market monopolized by gasoline-powered vehicles that have had over a century to mature. Removing the credit now is premature, and will cause EV sales to suffer at a time when the market is just beginning to gain traction.

What will happen to the EV market without the credit?

Even if the federal tax credit is eliminated, the California Zero Emission Vehicle Program will still require automakers to sell EVs in California and the 9 other states that adopted the ZEV program. This program will require EV sales in states that comprised about a quarter of the U.S. vehicle market, so EVs will certainly remain available for sale. Other state support for EVs, like a $5,000 tax rebate in Colorado, will survive too. For state-level EV incentives in your area, check out this handy guide. EVs will also remain cheaper to drive, and a smart choice for millions of Americans who have a strong demand for the technology. That’s the good news.

The bad news is that one of the primary hurdles to more EV adoption is their price (along with access to charging in multi-unit dwellings and the lack of a cheap EV SUV (see Tesla Model Y). So taking away a policy that directly addresses this barrier will make it harder to own an EV, and it will hurt sales. Georgia removed a state tax credit for electric vehicles, and sales dropped an estimated 90% in the following months. I’m not expecting as dramatic as a drop if the federal credit is removed, but EV sales will drop because they will become more expensive and automakers will have less incentive to making them available in the U.S.

So, join UCS in telling Congress that you deserve more clean vehicle options, and that the EV tax credit is a key federal policy that makes it easier to own an EV. Also keep an eye on the UCS website for additional ways you can get involved, and if you are considering an EV, getting one now might be a good option if you are looking to save at least $7,500 of its sticker price.

Giving Thanks to Climate Researchers of the Federal Agencies

UCS Blog - The Equation (text only) -

Most of my science career I worked for the Department of Energy as a climate modeler and numerical expert at the Oak Ridge National Laboratory. Since my retirement in 2010 I have written a text on computational climate modeling and taught graduate level engineering classes on climate science at the University of Tennessee. I had the privilege of working with many talented and dedicated scientists and hate to see their work go unappreciated because climate has become such a politicized issue. In particular, the recently released Fourth National Climate Assessment (NCA) Special Science report is the culmination of many years, even decades of scientific focus that the Congress and the nation should study with an open mind and use to reset the climate discussion in the United States.

In the early 1990’s I was one of the principals organizing an “Inter-agency agreement’’ between the Department of Energy (DOE) and the National Science Foundation (NSF). Our researchers were called the CHAMMPions (a long acronym worth remembering as Computer Hardware, Advanced Mathematics, Model Physics, Inter-agency Organization for Numerical Simulation). Most of us were new to climate research with my own background in applied mathematics. The congressionally mandated National Climate Assessment of 1990 had not found any U.S. based modeling groups producing a high-quality climate model. They borrowed the Canadian and Hadley Center models to complete the first US NCA in 2000. A little bit of national pride and the opportunity to one up the rest of the international community by using U.S. developed high performance computers was a timely motivation for our group. The models we developed and continued to improve through the 1990’s ad 2000’s contributed to many national and international studies, in particular the CMIP (Climate Model Inter-comparison Project) study series sponsored by the DOE. We faithfully followed through on giving policy makers better tools for making informed decisions. Focusing on the science and not the politics supported our DOE sponsors through a variety of administrations.

As a DOE funded climate researcher for 20 years, I had a privileged view of the motivations behind DOE climate research. It all started with the first Secretary of Energy, James R. Schlesinger. He read a report from the Russian scientist, Mikhail Budyko, suggesting the link between earth’s climate and CO2 levels in the atmosphere, a physical theory of climatology. Knowing that the department could not ignore this connection, he asked his department heads what they were going to do about it. This was the start of DOE’s exemplary Carbon Dioxide Effects and Assessment Program in 1977.

The model that the inter-agency agreement developed is now one of the worlds most respected models. It is open source meaning that anyone can see what is in it and even new groups are welcome to contribute new physics or chemistry or ecology to the earth system modeling effort. The Climate Science Special Report, Fourth National Climate Assessment, Volume I is the first to provide regionally specific results. The global temperature is not the only climate parameter that can now be discussed with confidence. For example, one of the findings pertains to extreme events from heavy rainfall to heatwaves that can impact human safety, infrastructure and agriculture.

This kind of detail would not have been possible without the new capabilities that the U.S. modeling effort provided. Indeed, the report draws from the results of many modeling groups by measuring the skill of different models compared to the observational record.

The scientists I have worked with through the years in these inter-agency projects have performed a service to the nation with their dedicated focus on staying true to the science and providing usable information for policy makers. I for one am grateful for their effort and support continuing to invest in our federal scientists to help move forward on research for solutions to tackle the world’s most pressing problems. This Thanksgiving, I give thanks to the research capabilities and resources of the National Lab system and my colleagues who always put science first.

 

Dr. John. B. Drake was a researcher and group leader at the Oak Ridge National Laboratory for 30 years and lead the climate modeling efforts at ORNL from 1990 to 2010.  Since his retirement from ORNL, he has taught graduate courses on climate modeling in the Civil and Environmental Engineering Department at the University of Tennessee and conducted research into the impacts of climate change. 

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Tesla, Electric Semi-Trucks and Equity

UCS Blog - The Equation (text only) -

A truck believed to be Tesla's was spotted last month.

Today is the unveiling of Tesla’s electric semi-truck. There’s been a lot of interest in this truck since it was referenced last summer in Tesla’s master plan. As sales indicate, Tesla makes sought-after electric cars and the potential for the company to replicate this success in the heavy-duty sector is an exciting prospect for clean air and climate change.

While Tesla isn’t the first company to unveil an electric big rig, its likely the first one many people have heard about. Electric truck technology – spanning delivery trucks, garbage trucks, transit buses, school buses, and semi-trucks – exists and is ready to be deployed. The more people that know about these vehicles, the better for our climate and air quality.

As the unveiling nears, excitement about Tesla’s truck has been tempered by news about the company’s labor conditions and accusations of discrimination at the company. While zero-emission trucks are critically important, so are safe and equitable workplaces. Fair work conditions go hand-in-hand with the long-term success of any business.

My hope is that Tesla becomes recognized for the quality of its workplace as much as the quality of its vehicles. I personally believe this is possible.

Job agreements, like the one made between Jobs to Move America and the electric vehicle maker BYD, are one example of how companies can do good for their employees and communities. The Greenlining Institute found that with the right job-training and hiring efforts, truck and bus electrification can be a catalyst for economic opportunity in underserved communities and help overcome racial inequities in wealth and employment.

So why are heavy-duty electric vehicles important in the first place?

Trucks and buses make up a small fraction of vehicle population, but a large fraction of vehicle emissions. In California, for example, heavy-duty vehicles, make up 7 percent of vehicles, but 33 percent of NOx emissions from all sources, 20 percent of global warming emissions from the transportation sector, and emit more particulate matter than all of the state’s power plants, see here.

Note, heavy-duty vehicles are defined here as having gross vehicle weight ratings greater than 8,500 lbs, e.g., a small moving truck.

Electric trucks, whether manufactured by Tesla or anyone else, are essential to solving climate change and reducing air pollution. On California’s grid today, a heavy-duty electric vehicle with middle-of-the-road efficiency has 70 percent lower life cycle global warming emissions than a comparable diesel and natural gas vehicle. Electric vehicles also don’t have any tailpipe emissions of NOx, particulate matter, or other pollutants. What this means for communities, especially those near freight corridors, is lower risks from the harmful consequences of dirty air.

What about the performance of electric trucks?

We’ve already seen how Toyota’s fuel cell electric truck stacks up against a diesel truck in terms of acceleration. High torque (i.e., ability to move from a standstill) of electric motors compared to combustion engines is something all electric vehicles excel at.

Given the class leading acceleration and battery range of Tesla’s cars, we can expect similar high performance from its electric truck. Other manufacturers are operating or have unveiled battery and fuel cell semi-trucks with ranges of 100 miles (BYD, Cummins) to 200 miles (Fuso, Toyota, US Hybrid). If reports are true, Tesla’s semi-truck could travel 200 to 300 miles on a single charge.

Zero-emission trucks offered or in development from BYD, Cummins, Toyota, and Daimler (Mitsubishi Fuso).

Despite the image of long-haul, “over-the-road” trucking, 100 to 200 miles of range can meet the needs of many heavy-duty trucks with local and regional operations. A range of 300 miles would be the longest by 80 miles and put to rest any hesitations about range for many local/regional (“day cab”) applications.

In California, there are 20,000 semi-trucks serving ports in the state. So-called “drayage” trucks deliver cargo to and from ports and warehouses in the region and are excellent candidates for electric trucks with today’s range. Conversion of these trucks alone to zero-emission vehicles would have significant air quality benefits for communities near ports and warehouses.

Cross-country trucking is a bigger challenge for electric trucks, but success in local operations is the first step to proving the functionality and economics of moving freight over longer distances with zero-emission vehicles. Tom Randall at Bloomberg shows scenarios under which Tesla’s truck could make economic sense for day cab or over-the-road applications.

In all, the momentum we’re seeing across the industry for zero-emission trucks is incredibly exciting. And just as we hold manufacturers and policy makers accountable for clean air, we must do the same for good jobs.

Reddit Clockwise: BYD, Cummins, Toyota, Daimler

4 Ways to Discuss Congressional Budget Riders at the Dinner Table this Thanksgiving

UCS Blog - The Equation (text only) -

Holiday gatherings with the family can be awkward, especially if you aren’t prepared for the inevitable table talk. Feeling like you don’t have enough fodder to sustain a conversation at the Thanksgiving dinner table this month?

Fret not! Every year around this time, my colleagues write about the budget process as the clock ticks for Congress to pass a clean budget – that is, a budget free from “poison pill” policy provisions and seemingly innocuous regulatory process riders that would hamper agencies from utilizing the best available science in rulemaking. These anti-science riders are extraneous special interest policies tacked onto a must-pass spending bill, a sort of parasitic mutualism, if you will.

This year, I have a gift for our readers ahead of the holidays: a brief list of harmful anti-science riders that would weaken science-based safeguards, potentially putting the health and safety of families at risk, repurposed as a guide to navigating uncomfortable silence and forced interactions with your family at Thanksgiving.

1. Start with an icebreaker

If it’s been a while since you’ve seen your least favorite Uncle Stewart, or your cousin Meg has brought a new date, you might consider starting with an icebreaker to relieve tension. Try this one:  A rider to “legislate” that the burning of trees for energy is positive for climate change has been proposed. This language encourages burning trees to generate electricity and ignores scientific evidence on impacts of carbon emissions. Who needs an icebreaker if the sea ice continues melting at record levels?

2. Share a story from your past

Take a stroll down memory lane and regale your guests with tales from the days of yore. Here’s a crowd favorite: In 1996, following the release of a study funded by the Centers for Disease Control and Prevention (CDC) that found keeping guns in the home increases the risk of homicides in the home, the National Rifle Association successfully lobbied former Congressman Jay Dickey to target the CDC’s funding. Congressman Dickey introduced the provision that he would later come to regret, sneaking it into a must-pass spending bill. Now, over 20 years later, the CDC is still unable to research gun violence as a public health issue, though current events (including the recent tragedies in Las Vegas,  Sutherland, Texas, and Rancho Tehama, California) and statistics show the need is there.

3. Talk about the weather

A tried and true small-talk starter, who can resist commiserating about the sweltering heat we endured this year, even as the temperatures have finally dropped? Now is the time to casually mention the proposal that would delay implementation of science-based standards, like the EPA’s most recent update to ground-level ozone, which is solely based on public health. If this passes, companies would be allowed to pollute at levels currently deemed unsafe, which would contribute to an increase in days with unhealthy ozone levels and increase risk of respiratory illnesses – risks that are exacerbated by an increase in heat waves caused by climate change (see: icebreaker).

4. Give thanks

There are many things to be thankful for, but often the most important ones go unnoticed. This year, remember to lift your glass in thanks – to clean water. Give a toast to the Clean Water Rule, which extends protections of waters under the Clean Water Act to include the streams and wetlands that feed drinking water sources for over 117 million people nationwide. Don’t forget to mention the rider that would permit the administration to ignore scientific and public input as Scott Pruitt’s EPA attempts to withdraw the Clean Water Rule. The rule was borne out of extensive public engagement and rigorous scientific analysis that the EPA administrator has chosen to set aside.

And as your mother stands poised to carve the golden turkey, remember to give thanks to the Endangered Species Act of 1973 for offering protections to the fowl’s cousin, the greater sage grouse. A rider would allow policymakers to overrule biologists and wildlife managers when it comes to protecting threatened and endangered wildlife, such as the gray wolf and the oft-fought-over sage grouse.

While this list of “poison pill” riders is by no means exhaustive, there are some great dinner-table conversation starters that are sure to keep the family engaged in a riveting discussion they’ll be talking about for years to come. The anti-science riders above have all been introduced this year and negotiations over which ones to include in a final spending deal are happening right now (and remember, none of them should be included, because we want a clean budget free from “poison pill” riders).

If you didn’t manage to invite your representatives to dinner this Thanksgiving, be sure to take the time to tell them to pass a clean budget with no anti-science “poison pill” riders this holiday.

Seven States Take Big Next Step on Climate: Here’s the What, Why, and How

UCS Blog - The Equation (text only) -

On Monday, November 13,  a bi-partisan group of seven states  (NY, MD, MA, CT,  RI, DE and VT), and the District of Columbia announced that they will seek public input on how to craft a regional solution to greenhouse gas emissions from the transportation sector, now the largest source of CO2 emissions in the region. An announcement to conduct listening sessions may not sound like a big deal, but it is. Here’s why:

First, this region has been successful at reducing emissions from the electric sector, but transportation is lagging behind, as this graph shows:

Source: Energy Information Administration Data

All of these states have committed to economy-wide goals that will be impossible to reach without ambitious policies to reduce pollution from transportation. Monday’s statement demonstrates that policy leaders understand that transportation is the next major frontier in the fight against global warming in the Northeast.

Second, a public conversation is necessary. For several years, these states have talked internally through their departments of energy, environment, and transportation, about how to cut transportation emissions. When I served as a commissioner of the Massachusetts Department of Environmental Protection, I was part of those conversations, and they have yielded a number of promising ideas.

But policies that are truly worthy and lasting can’t be hatched in isolation from the public. Public engagement is needed to get the best ideas out on the table, test assumptions, gauge political support, and persuade the skeptical. The states’ announcement shows that the states are serious, and that they are going about this in the right way.

Third, once the states announce a goal (as they have done here), and encourage the public to provide input to it, they create the expectation that action will follow: doing nothing becomes a much harder option. Once these listening sessions begin region wide, as they already have in Massachusetts, state leaders will see that their constituents want clean, affordable transportation, and that they are prepared to invest in that. Thus, the conversation will change from “whether” to implement a regional solution to “how” to do so.

In this regard, it is intriguing that on the day of the announcement, the states also released a white paper on one particularly promising approach—a regional “cap and invest” program.    A cap and invest program would build upon this region’s success with the Regional Greenhouse Gas Initiative (RGGI), which has helped to dramatically lower emissions from the electric sector while creating jobs and reducing consumer costs.

The program would set an overall cap on regional transportation emissions, require fuel distributors to purchase “allowances” for the right to sell polluting fossil fuels such as gasoline and diesel, and re-invest the proceeds in improved mass transit, electric cars and buses, affordable housing located near transportation centers, and other proven ways to make clean transportation available to all. The white paper does an excellent job of identifying how such a scheme would work under our existing fuel distribution network. (For more information on this approach, read my op-ed and the blog by my colleague Dan Gatti.)

I encourage UCS members and the public to attend these listening sessions and publicly support a bold regional solution. And I applaud the leaders of these states for taking a critical next step. State leadership, particularly when it is bipartisan, is the way that the United States can best stay on track to meet its climate goals and assure an anxious world that we are still in the fight, notwithstanding the Trump administration’s abdication of leadership.

UCS to Nuclear Regulatory Commission: Big THANKS!

UCS Blog - All Things Nuclear (text only) -

This spring, I ran into Mike Weber, Director of the Office of Nuclear Regulatory Research for the Nuclear Regulatory Commission (NRC), at a break during a Commission briefing. The Office of Research hosts a series of seminars which sometimes include presentations by external stakeholders. I asked Mike if it would be possible for me to make a presentation as part of that series.

I explained that I’d made presentations during annual inspector conferences in NRC’s Regions I, II, and III in recent years and would appreciate the opportunity to reach out to the seminars’ audience. Mike commented that he’d heard positive feedback from my regional presentations and would welcome my presentation as part of their seminars. Mike tasked Mark Henry Salley and Felix Gonzalez from the Research staff to work out arrangements with me. The seminar was scheduled for September 19, 2017, in the auditorium of the Two White Flint North offices at NRC headquarters. I appreciate Mike, Mark, and Felix providing me the opportunity I sought to convey a message I truly wanted to deliver.

Fig. 1 (Source: Union of Concerned Scientists)

The title of my presentation at the seminar was “The Other Sides of the Coins.” The NRC subsequently made my presentation slides publicly available in ADAMS, their online digital library.

As I pointed out during my opening remarks, the NRC staff most often hears or reads my statements critical of how the agency did this or didn’t do that. My presentation that day focused on representative positive outcomes achieved by the NRC. For that presentation, my whine list was blank by design. Instead, I talked about the other sides of my usual two cents’ worth.

Fig. 2 (Source: Union of Concerned Scientists)

I summarized eight positive outcomes achieved by the NRC and listed five others. I emphasized that these were representative positive outcomes and far from an unabridged accounting. I told the audience members that I fully expected they would be reminded of other positive outcomes they had been involved in as I covered a few during my presentation. Rather than feeling slighted, I hoped they would feel acknowledged and appreciated by extension.

One of the eight positive outcomes I summarized was the inadequate flooding protection identified by NRC inspectors at the Fort Calhoun nuclear plant in Nebraska. The NRC issued a preliminary Yellow finding—the second highest severity in its Green, White, Yellow, and Red classification system—in July 2010 for the flood protection deficiencies. To put that Yellow finding in context: of the 827 findings the NRC issued during 2010, 816 were Green, 9 were White, and only 2 were Yellow. It was hardly a routine, run-of-the-mill issuance.

The plant’s owner formally contested the preliminary Yellow finding, contending among other things that Fort Calhoun had operated for nearly 30 years with its flood protective measures, so they must be sufficient. The owner admitted that some upgrades might be appropriate, but contended that the finding should be Green, not Yellow.

The NRC seriously considered the owner’s appeal and revisited its finding and its severity determination. The NRC reached the same conclusion and issued the final Yellow finding in October 2010. The NRC then monitored the owner’s efforts to remedy the flood protection deficiencies.

The NRC’s findings and, more importantly, the owner’s fixes certainly came in handy when Fort Calhoun (the sandbagged dry spot in the lower right corner of Figure 3) literally became an island in the Missouri River in June 2011.

Recall that the NRC inspectors identified flood protection deficiencies nearly 8 months before the Fukushima nuclear plant in Japan experienced three reactor meltdowns due to flooding. Rather than waiting for the horses to trot away before closing the barn door, the NRC acted to close an open door to protect the horses before they faced harm. Kudos!

Fig. 3 (Source: Union of Concerned Scientists)

The real reason for my presentation in September and my commentary now is to acknowledge the efforts of the NRC staff. My concluding slide pointed out that tens of millions of Americans live within 50 miles of operating nuclear power plants and tens of thousands of Americans work at these operating plants. The efforts of the NRC staff make these Americans safer and more secure. I observed that the NRC staff deserved big thanks for their efforts and my final slide attempted to symbolically convey our appreciation. (The thanks were way bigger on the large projection screen in the auditorium. To replicate that experience, lean forward until your face is mere inches away from your screen.)

Fig. 4 (Source: Union of Concerned Scientists)

Water in an Uncertain Future: Planning the New Normal

UCS Blog - The Equation (text only) -

Northern California breathed a sigh of relief this weekend as rain and cooler temperatures finally arrived in force after the devastating fires in October. Now the question is, what kind of a winter will we have, and in particular, how much snow and rain will we or will we not get?

After a four-year drought from 2013 to 2016 and an unprecedented rainy winter in 2017, I’m hoping for a normal winter and not another year of water rationing, land subsidence, dead or dying forests, flooding, infrastructure failures, or transportation disruptions.

The Great Water Supply Shift

But with climate change upon us, nothing is normal anymore. (UCS has discussed the issue of the changing paradigm for water management with climate change here, here, and here.) One thing we have learned in the last few years of “new normal” conditions is that we can no longer rely on past precipitation patterns to predict reliable water supplies for our future.

One way in which California’s water management is changing is our increased reliance on groundwater, in part because groundwater has traditionally been the state’s fallback when surface water has been in short supply. But during the drought, the lack of precipitation and the high temperatures were so extreme that wells in several groundwater basins were literally pumped dry, and in some areas so much water was pumped out of the ground that the land above the basins subsided (or sank) several feet, causing damage to roads, bridges, and canals on the surface.

In 2014, at the height of the drought, California lawmakers were forced to grapple with the fact that extreme drought was putting unsustainable pressure on the state’s groundwater basins, and passed the Sustainable Groundwater Management Act (SGMA).

SGMA requires active governance of groundwater basins in the state and says water managers must set “measurable objectives” in their plans to achieve “the sustainability goal for the basin.” One of the great challenges now is that water managers must create new groundwater basin plans at a time when they can no longer rely on yesterday’s climate to manage future water conditions. Instead, they must rely much more on scientific and quantitative tools, like climate models, to understand the kinds of conditions we could be facing over the next decades.

Report Shines a Light on Much-Needed Changes

Researchers at UCS and the Stanford Law and Policy Lab released a report today that says much more needs to be done to ensure adequate groundwater management, and, by extension, overall water management in an era of rapid climate change.

The report found that nearly half of the 24 groundwater plans analyzed did not include the kind of quantitative analysis of climate change required by the state.

The researchers also found that state and federal water delivery projections that local agencies rely upon to make water management decisions are inconsistent and therefore confusing to use. They found that too often models were used inappropriately or with unreliable assumptions. For example, many agencies were not using a range of climate data but relying on “moderate” scenarios to plan—a bit like planning for a “moderate” earthquake rather than the maximum force that can result in damage to life and property.
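To see why a single “moderate” scenario is a poor planning basis, consider a minimal sketch in Python. All of the scenario names and figures below are invented for illustration; the point is only that the stress case, not the median case, should drive the plan.

    # Toy illustration of planning across a range of climate scenarios
    # instead of a single "moderate" one. All figures are invented.

    scenario_supply_af = {        # projected deliveries, acre-feet per year
        "wet": 120_000,
        "moderate": 95_000,
        "hot-dry": 60_000,
    }
    projected_demand_af = 90_000

    moderate_gap = projected_demand_af - scenario_supply_af["moderate"]
    worst_gap = projected_demand_af - min(scenario_supply_af.values())

    print(f"Shortfall under the moderate scenario: {max(moderate_gap, 0):,} AF")
    print(f"Shortfall under the worst scenario:    {max(worst_gap, 0):,} AF")

In this toy example, the moderate scenario shows no shortfall at all, while the hot-dry scenario leaves a 30,000 acre-foot gap, exactly the kind of risk a plan built on the median would never surface.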

The problem left unstated in water circles is that many water managers are well into their careers and are unlikely to have had formal training in climate science or in how climate change is affecting precipitation and water supplies. The consultants they rely on may not have the training or incentives to do climate science well.

Many water managers are doing their best to cope with often imprecise state guidelines and conflicting information on climate science, especially when they may not have enough information to even know what kinds of questions they should be asking. Right now local water managers have no requirements or real regulatory guidance to understand or engage with climate science.

But that should not be an excuse to do nothing. A hallmark of our era is change that requires people to master new skills and information: a car mechanic today, for example, needs to understand complex electronics in a way that wasn’t true in the past. Learning new information and skills should not be a barrier to good management. The report also provides guidance for how and under what circumstances water managers should use particular climate models—a necessary and important start, but the challenge we face requires much more effort and resources to be met effectively.

Wanted: Support For Science

To ensure we have planned for the uncertainties of a changing climate, state water managers should be provided with, trained in, and encouraged to use the kinds of science and tools that will ensure a state with a world-class economy can secure adequate water supplies under changing climate conditions. Anyone who lived through the last five years in California, when ultra-dry and ultra-wet conditions had widespread impacts on our lives, understands that living with extremes is not easy. But we have no choice. We must learn to cope more effectively with much more difficult conditions if we are to adapt successfully.

One Simple Trick to Reduce Your Carbon Footprint

UCS Blog - The Equation (text only) -

Photo: Rennett Stowe/CC BY 2.0 (Flickr)

Want to save the planet? Are you, like me, a young professional struggling to reduce your carbon footprint? Then join me in taking the train to your next professional conference.

Take the train and reduce carbon pollution while looking at this. Photo credit: Anna Scott.

Most of my low-carbon lifestyle is admittedly enforced on me by my student budget. I have no kids, bicycle to work, and share a house with roommates. What dominates my carbon footprint is the flights I take—I’ll be hitting frequent flyer status this year thanks to traveling for conferences, talks, and workshops (not to mention those flights to see my family during the holidays—even being unmarried doesn’t get me out of visiting in-laws overseas). This is a bittersweet moment for a climate scientist: my professional success gives me an opportunity to impact the world with my science, but it is also hurting the planet and leaving future generations with a mess that will outlive me.

There’s no silver bullet to fixing climate change, but I think scientists and science enthusiasts can start with ourselves.

Every year, together with 25,000 of my closest climate and Earth science buddies, I attend the American Geophysical Union meeting. (You may have heard about it last year on NPR).

Prof. Lawrence Plug calculated that the 2003 meeting generated over 12,000 tons of CO2. Since then, the meeting has more than doubled in size, suggesting that the carbon footprint is upwards of 25,000 tons of CO2 from flights alone.
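For a rough sense of that scaling, here is a minimal back-of-the-envelope sketch in Python. The 12,000-ton figure comes from Prof. Plug’s estimate; the 2003 attendance is an assumed round number for illustration only.

    # Back-of-envelope scaling of the AGU meeting's flight footprint.
    # The 12,000-ton 2003 estimate is from the text; the 2003 attendance
    # figure is an assumed round number.

    footprint_2003_tons = 12_000
    attendees_2003 = 11_000      # assumption, for illustration only
    attendees_now = 25_000       # today's approximate attendance

    per_person_tons = footprint_2003_tons / attendees_2003
    footprint_now_tons = per_person_tons * attendees_now

    print(f"~{per_person_tons:.1f} t CO2 per attendee, "
          f"~{footprint_now_tons:,.0f} t CO2 for today's meeting")

That works out to roughly a ton of CO2 per attendee, and on the order of 27,000 tons for a 25,000-person meeting, consistent with the “upwards of 25,000 tons” estimate above.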

Prominent scientists like Katharine Hayhoe have suggested that we shift to teleconferencing instead. I think this is great for small meetings of folks who already know each other, or for prominent scientists like Dr. Hayhoe, who have an established publication record and name recognition.

For the little folks like myself, though, meetings offer tremendous opportunities to connect with colleagues at other institutions, meet potential collaborators, and scout new job opportunities. The ‘serendipitous interaction’ that meetings allow is similar to the design principles that tech firms like Google apply when designing their public spaces. This fall alone, I’ve filled a shoebox with business cards from colleagues working on similar problems, potential collaborators working in similar fields, and, most lucratively, established scientists who have news of post-doctoral fellowships and job opportunities.

This last point may be especially critical for minority scientists, who may lack the social networks needed to get jobs.

In short, I’m not switching to virtual anytime soon, mostly because I can’t see it paying off (yet—Katharine Hayhoe et al., if you’re reading this, hire me!). But I still need to reduce my carbon footprint.

My solution? Replace one conference travel flight with a train ride. Repeat every year. Last year, I took Amtrak’s California Zephyr from San Francisco to Chicago back from AGU’s fall meeting and crossed the Rockies next to a geophysicist explaining plate tectonics and identifying rocks.

The year before, I returned from New Orleans and wrote my thesis proposal while rolling through bayous, swamps, and pine forests of the Southeast.

(Don’t think you have time for this? I spent the trip writing a paper, now published in PLOS ONE. Amtrak seats all come with electrical outlets and seatback trays that function terrifically as desks.)

Is this a practical solution for everybody? Nope, and I won’t pretend that it is. Your time might be better spent with your kids, or volunteering in your community, or maybe you want to drive instead; I don’t know your life. Train infrastructure is lacking in the US, and delays are common because Amtrak doesn’t own the tracks and must give way to commercial freight. But I maintain my hope that increased demand for train travel can spur future investment, sending a market signal that young people want to travel this way.

This year, I’ll be taking the train to AGU’s fall meeting in New Orleans from Washington DC.

I estimate that I’ll be saving about one ton of CO2 equivalent (the calculation includes radiative forcing). If you’re headed that way, I invite you to join me, tell your friends, or even just reflect on the possibility that low-carbon alternatives to flying exist. We can’t fix everything. But if we all do our little part, we can accomplish something. And something is always better than nothing.
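For readers who want to see how such an estimate comes together, here is a minimal sketch in Python. The distance is approximate, and the emission factors are generic assumptions, not the exact inputs behind the one-ton figure above.

    # Rough flight-vs-train comparison for a DC <-> New Orleans round trip.
    # All factors are generic, assumed values.

    round_trip_miles = 2 * 970             # approximate air distance each way

    FLIGHT_KG_PER_MILE = 0.22              # economy seat, CO2 only (assumed)
    RF_MULTIPLIER = 1.9                    # radiative forcing uplift (assumed)
    TRAIN_KG_PER_MILE = 0.10               # diesel train, per passenger (assumed)

    flight_kg = round_trip_miles * FLIGHT_KG_PER_MILE * RF_MULTIPLIER
    train_kg = round_trip_miles * TRAIN_KG_PER_MILE
    saved_tons = (flight_kg - train_kg) / 1000.0

    print(f"Flight: {flight_kg:.0f} kg CO2e, train: {train_kg:.0f} kg, "
          f"saved: ~{saved_tons:.1f} t CO2e")

Depending on the factors chosen, the savings land somewhere between roughly half a ton and a ton of CO2e, which brackets the estimate above.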

Anna Scott is a PhD student in the Earth and Planetary Science Department at the Krieger School of Arts and Sciences at Johns Hopkins. She holds a Bachelor’s degree in mathematics from the University of Chicago, a Master’s degree in Applied Mathematics from the King Abdullah University of Science and Technology (KAUST), and a Master of Arts and Sciences in Earth Science from Johns Hopkins University. She has installed sensor networks and led field campaigns in Birmingham, Alabama; Nairobi, Kenya; and Baltimore, Maryland, as part of her thesis research on quantifying urban temperature variability and heat waves. She has been known to dabble in projects on regional hydrology, the climate impacts of aerosols, and North African precipitation. She recently started Baltimore Open Air, an air quality monitoring project that has designed, built, and deployed 50 air quality monitors in the Greater Baltimore region. Anna will be taking Amtrak’s Crescent line to the 2017 American Geophysical Union’s fall meeting in December. She’ll be sharing the journey on social media using the hashtag #TrainToAGU.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

 

In Australia, Too, Shareholders Demand Climate Transparency from Fossil Fuel Companies

UCS Blog - The Equation (text only) -

This week, an Anglo-Australian company’s annual meeting could send a strong signal for companies’ climate risk disclosure around the world. BHP Billiton Limited, a multinational mining and petroleum conglomerate, will hold its Annual General Meeting (AGM) in Melbourne, Australia. Shareholders are calling for more complete disclosure of the company’s direct and indirect lobbying spending on climate and energy—and for BHP to end its membership in industry groups whose positions are inconsistent with its own. Such disclosure would be a big deal.

We know companies often outsource their lobbying to industry groups, but there is frequently no transparency about which groups they support or how these funds are spent, allowing companies to maintain a climate-friendly public image while blocking climate action behind the scenes. Although the Union of Concerned Scientists (UCS) has not engaged directly with BHP on climate issues, we have made similar asks of other major fossil energy companies in our Climate Accountability Scorecard. Here are five things I’ll be watching as BHP meets with its shareholders this week.

1. Advances in climate attribution science

Cumulative emissions from 1854 to 2010 traced to historic fossil fuel production by the largest investor-owned and state-owned oil, gas, and coal producers, in percent of global industrial CO2 and methane emissions since 1751.

Decisions by major fossil energy companies to maintain their carbon-intensive business models despite the known harmful effects of their products have had quantifiable impacts on our climate in recent decades. As a leading producer of coal, oil, and natural gas (among other commodities), BHP could face questions from investors about advances in the field of climate attribution science.

According to research by Richard Heede of the Climate Accountability Institute, BHP ranked #19 in industrial carbon pollution over the period 1854–2010.

A recent peer-reviewed study led by my UCS colleague Brenda Ekwurzel ranked BHP in the top 20 in terms of its contributions to the increase in global average temperature and sea level rise from 1980 to 2010—a time when fossil fuel companies were aware that their products were causing global warming.

Recent 1980–2010 emissions traced to the top 20 investor-owned and majority state-owned industrial carbon producers, and their contributions to the rise in global mean surface temperature (GMST).

BHP is among the 50 investor-owned carbon producers responsible for approximately 10% of the global average temperature increase and about 4% of sea level rise during that time period.

Today, BHP accepts climate science, stating on its website that “We accept the Intergovernmental Panel on Climate Change (IPCC) assessment of climate change science, which has found that warming of the climate is unequivocal, the human influence is clear and physical impacts are unavoidable.” The company welcomed the Paris Climate Agreement, and has conducted scenario analysis about the potential impacts on its portfolio of keeping the increase in global temperatures well below 2° Celsius.

However, like other major fossil energy companies, BHP presents a false choice between addressing climate change and advancing economic development and energy access, and assumes that no significant changes in its business strategies are needed in a carbon-constrained world.

2. Direct and indirect lobbying on climate change

Despite its stated positions on climate change, BHP maintains membership in trade associations and other industry groups that spread disinformation on climate science and/or seek to block climate action. (My colleague Genna Reed recently highlighted this industry tactic, among others, as part of The Disinformation Playbook).

The UK-based charity InfluenceMap gives BHP a score of D-minus in terms of its direct influence on climate policy and its relationships with trade associations like the World Coal Association and the Minerals Council of Australia (MCA)—the Australian counterpart of the National Mining Association in the US. A September 2017 InfluenceMap report, “Corporate Carbon Policy Footprint,” included BHP among the 50 companies that have the most influence on climate policy globally (among the world’s 250 largest listed industrial companies)—and found BHP to be one of 35 companies actively opposing climate policy.

In 2015, BHP CEO Andrew Mackenzie spoke to the US Chamber of Commerce, an umbrella business association known for taking controversial positions on climate change. Few of the US Chamber’s member companies publicly agree with its positions—which include refusing as recently as 2015 to acknowledge that global warming is human-caused. With such strong connections to industry groups that obstruct climate action, BHP is likely to have plenty to disclose in terms of its indirect lobbying, if shareholders demand it.

3. A shareholder resolution

Some of BHP’s shareholders are concerned about inconsistencies between BHP’s stated positions on climate change and those taken by trade associations of which it is a member. On behalf of 120 BHP shareholders, the Australasian Centre for Corporate Responsibility submitted a resolution calling for the company to:

  • disclose payments for direct and indirect lobbying on climate and energy policy;
  • assess whether advocacy positions taken by industry groups on specific Australian climate and energy policies are consistent with the company’s stated positions and its economic interests;
  • terminate its membership in industry bodies “where a pattern of manifest inconsistency is demonstrated” over the past five years.

Despite the recommendation of BHP’s Board that shareholders vote against the resolution, representatives of CalPERS, the Church of England, and HSBC voiced support for the resolution at BHP’s London AGM last month. The outcome of the vote won’t be known until after this week’s AGM in Melbourne.

There is precedent for the action requested by BHP shareholders. In announcing its decision to leave the American Legislative Exchange Council (ALEC) in 2015, Royal Dutch Shell said that the group’s stance on climate change “is clearly inconsistent with our own.”

Alignment between company positions and those of affiliated industry groups is increasingly considered a matter of good corporate governance. UCS’s Climate Accountability Scorecard called on major fossil energy companies to use their leverage to end the spread of climate disinformation by industry groups, publicly distance themselves from groups’ positions on climate science and policy with which they disagree, and publicly sever ties with groups if unable to influence their climate-denying positions. Shareholder resolutions requesting annual reporting on direct and indirect lobbying activities by Chevron, ConocoPhillips, and ExxonMobil won the support of around one-quarter of each company’s shareholders earlier this year. With growing support for such resolutions, I imagine BHP is getting nervous.

4. A preemptive pledge

In an effort to defuse support for the shareholder resolution, BHP has stated that by the end of the year, it will publish “a list of the material differences between the positions we hold on climate and energy policy, and the advocacy positions on climate and energy policy taken by industry associations to which BHP belongs.”

BHP also apparently played a role in forcing out the head of the MCA, signaling the company’s unhappiness with the trade association’s promotion of coal and opposition to clean energy policies.

Public acknowledgment of inconsistencies between BHP’s positions and those taken by industry groups it supports would be a step in the right direction—but it would raise the question of why the company is using shareholder money to fund groups that lobby against its own agenda.

Moreover, BHP’s current disclosures regarding trade associations should raise eyebrows (if not red flags). In its climate change questionnaire, CDP (formerly the Carbon Disclosure Project) asks companies to provide details about trade associations that are likely to take positions on climate change legislation—including whether the group’s position is consistent with the company’s position, and how the company attempts to influence the group’s position. In its CDP submission, BHP inexplicably reports that the climate position of each of its trade associations—including MCA, the World Coal Association, and the American Petroleum Institute—is “consistent” with its own.

In light of BHP’s CDP report, proponents of the shareholder resolution must be wondering whether BHP’s promised list of material differences will be a blank webpage.

5. Shareholder rights and social license

It is not easy for shareholders in Australian companies to put forth resolutions for consideration at AGMs. The votes of BHP shareholders on the lobbying resolution will only be valid if they first approve a resolution amending the company’s constitution to allow non-binding shareholder resolutions. BHP’s Board also opposes this resolution.

BlackRock has indicated that it favors strengthening shareholder rights in Australia—under certain conditions. The support of BlackRock, the world’s largest asset manager, is enormously significant. This past May, BlackRock, along with Vanguard and Fidelity, helped deliver a decisive 62% majority of ExxonMobil shareholders in favor of a proposal calling for the company to report annually on how it will ensure that its business remains resilient in the face of climate change policies and technological advances designed to limit global temperature increase to well below 2°C.

The ExxonMobil shareholder vote demonstrates why the owners of Australian companies should have greater say on environmental, social, and governance issues. But not surprisingly, this exercise of shareholder power has sparked a backlash. The Financial CHOICE Act aims to make it much more difficult for shareholders in US companies to file resolutions, and the US Chamber of Commerce (remember them?) has issued a dangerous set of recommendations to “reform” the shareholder proposal process.

At last month’s London AGM, BHP Chair Ken MacKenzie pointed to social license to operate as one of five focus areas for the company: “Leadership with our social license can create a strategic advantage for BHP, and by extension, value for shareholders. Public acceptance and trust are an imperative for BHP. Without it, we have nothing.”

BHP has an opportunity to demonstrate leadership on climate change and earn public trust by setting a high standard of transparency and ensuring that its direct and indirect lobbying are aligned with its acceptance of climate science and its stated support for the Paris Climate Agreement goals. I will be watching to see whether the company moves in this direction at its AGM this week—and continuing to engage with major fossil energy companies over steps they should take to be more transparent and consistent in their advocacy on climate issues.

Fig. 2 from Frumhoff, Heede, Oreskes (2015) based on data from Heede (2014)

Whose Finger Is on the Button? Nuclear Launch Authority in the United States and Other Nations

UCS Blog - All Things Nuclear (text only) -

Throughout the 2016 presidential campaign, and perhaps even more since Trump’s election, the media have shown a newfound interest in the minutiae of US nuclear policy. One question in particular has been asked over and over—can the president, with no one else to concur or even advise, order the use of US nuclear weapons? Most people have been shocked and somewhat horrified to find that there is a simple answer—yes.

Starting a nuclear war shouldn’t be easy

The president has the sole authority to order a nuclear strike—either a first strike or one in response to an attack. Although there are people involved in the process of transmitting and executing this order who could physically delay or refuse to carry it out, they have no legal basis for doing so, and it is far from clear what would happen if they tried.

This belated realization (the system has been in place since the early Cold War) has prompted some ideas for ways to change things, including legislation restricting the president’s ability to order a nuclear first strike without a declaration of war by Congress. But more often it has prompted concern—and sometimes outrage—without a clear idea of how to fix the problem.

It may be useful to ask how other nuclear-armed states approach the problem of making a decision about the use of their nuclear weapons. How does the US compare to Russia, China, and other nuclear-armed states? Are there existing systems that rely on multiple people to order the use of nuclear weapons that the US might learn from?

To try to answer these questions, our new issue brief compiles information on the systems that other nuclear-armed states have in place to order the use of their weapons. While information is necessarily limited, and some of these systems may not completely correspond to what would happen in a true crisis, they still provide useful information about what these countries think is important when making a decision about the use of nuclear weapons. And, in most cases, that includes some form of check on the power of any single individual to order the use of these weapons by him or herself.

The current US process for deciding to use nuclear weapons is unnecessarily risky in its reliance on the judgment of a single individual. There are viable alternatives to sole presidential authority, and it is past time for the US to establish a new process that requires the involvement of multiple decision-makers to authorize the use of nuclear weapons. An investigation of how this decision works in other nuclear-armed states provides a good place to start.

 

Trump Nominee Kathleen Hartnett White Ignores Climate Change In Her Own Backyard

UCS Blog - The Equation (text only) -

Kathleen Hartnett White, President Trump’s pick to chair the White House’s Council on Environmental Quality (CEQ), testified at her Senate confirmation hearing on Wednesday and, like many Trump nominees to date, showed herself to be an unqualified, polluter-friendly ideologue who rejects mainstream climate science.

“Your positions are so far out of the mainstream, they are not just outliers, they are outrageous,” Massachusetts Sen. Ed Markey exclaimed at one point in clear exasperation. “You have a fringe voice that denies science, economics, and reality.”

What Markey failed to note, however, is that White has personally experienced climate change-related extreme weather events in her home state of Texas, and scientists say they are only going to get worse.

Unqualified from the start

White, whom Trump previously considered for Environmental Protection Agency (EPA) administrator, is a cattle rancher and dog breeder who chaired the Texas Commission on Environmental Quality (TCEQ) — the Lone Star State’s version of the EPA — from 2001 to 2007 and was a member of the Environmental Flows Study Commission, the Texas Water Development Board, and the Texas Wildlife Association board.

Her qualifications for those positions? None.

White earned her bachelor’s and master’s degrees in Humanities and Religion at Stanford, attended Princeton’s comparative religion doctoral program, and completed a year of law school at Texas Tech. It’s not quite the background one would expect for someone serving on environmentally related boards, let alone running the TCEQ. But in Texas, as in Florida and Wisconsin, ideology trumps science credentials, and White holds a politically correct pro-fossil-fuels viewpoint.

That bias serves her well in her current job with the Texas Public Policy Foundation, a libertarian think tank funded by what Texans for Public Justice characterized as a “Who’s Who of Texas polluters, giant utilities and big insurance companies.” Among TPPF’s benefactors are Chevron, Devon Energy, and ExxonMobil; Koch Industries and its family foundations; and Luminant, the largest electric utility in Texas. White, who joined TPPF in January 2008, runs the nonprofit’s energy and environment program and co-heads its Fueling Freedom Project, whose mission is to “push back against the EPA’s onerous regulatory agenda that threatens America’s economy, prosperity, and well-being.”

Climate paranoia strikes deep

Recent media coverage of White’s nomination for the CEQ post has shined a light on her lack of scientific understanding — and her paranoia about the rationale for addressing climate change. She falsely claims that climate science is “highly uncertain,” characterizes it as the “dark side of a kind of paganism, the secular elite’s religion,” and argues that the “climate crusade,” if unchecked, would essentially destroy democracy.

That’s right. White believes the United Nations and climate scientists are bent on establishing a “one-world state ruled by planetary managers.” Further, she routinely trumpets the benefits of carbon emissions, insisting that carbon dioxide “has none of the characteristics of a pollutant that could harm human health.” Carbon is a good thing, she says, because “the increased atmospheric concentration of man-made CO2 has enhanced plant growth and thus the world’s food supply.” Never mind that farmers and ranchers in her own state have been whipsawed in recent years by devastating heat waves, drought, and floods, all linked to climate change.

At her confirmation hearing on Wednesday, White cited reducing ground-level ozone in Houston and Galveston when she chaired the TCEQ as her greatest accomplishment. But according to a recent editorial in the Dallas Morning News, she pushed for weaker ozone standards while she was at the helm of the agency.

“Her record is abominable,” the October 17 editorial stated. “White consistently sided with business interests at the expense of public health as chair of the Texas Commission on Environmental Quality. She lobbied for lax ozone standards and, at a time when all but the most ardent fossil fuel apologists understood that coal isn’t the nation’s future, White signed a permit for a lignite-fired power plant, ignoring evidence that emissions from the lignite plant could thwart North Texas’ efforts to meet air quality standards.”

Predictably, White also disparages renewable energy. “In spite of the billions of dollars in subsidies, retail prices for renewables are still far higher than prices for fossil fuels,” she wrote in her 2014 tract Fossil Fuels: The Moral Case. “At any cost, renewable energy from wind, solar, and biomass remains diffuse, unreliable, and parasitic….”

In fact, fossil fuels have received significantly more in federal tax breaks and subsidies for a much longer time than renewables; new wind power is now cheaper than coal, nuclear, and natural gas; and the Department of Energy projects that renewable technologies available today have the potential to meet 80 percent of US electricity demand by 2050.

Ignoring the evidence

Most of Trump’s nominees for other key science-based positions — notably EPA Administrator Scott Pruitt — agree with White’s twisted take on climate science and renewables. What sets her apart, besides her penchant for calling advocates for combating climate change “pagans,” “Marxists” and “communists,” is her up-close-and-personal experience with climate change-related extreme weather events.

White and her husband, Beau Brite White, live in Bastrop County, an outlying Austin bedroom community, and own a vast cattle ranch of 118,567 acres — more than 185 square miles — in Presidio County, which sits on the state’s southwest border with Mexico.

Bastrop and Presidio counties are both struggling with drought due to low precipitation and high temperatures and, like the rest of Texas, suffered from an especially extreme drought in 2011. Part of a prolonged period of drought stretching from 2010 to 2015, the 2011 drought was the hottest and driest on record, and climate change likely played a significant role. A 2012 study published in the Bulletin of the American Meteorological Society found that the high temperatures that contribute to droughts such as the one that struck Texas in 2011 are 20 times more probable now than they were 40 to 50 years ago due to human-caused climate change.

The Fourth National Climate Assessment report, released on November 3, agreed. “The absence of moisture during the 2011 Texas/Oklahoma drought and heat wave was found to be an event whose likelihood was enhanced by the La Niña state of the ocean,” the report, authored by scientists at 13 federal agencies, concluded, “but the human interference in the climate system still doubled the chances of reaching such high temperatures [emphasis added].”

The 2011 heat wave was particularly intense in Presidio County. According to Texas State Climatologist John Nielsen-Gammon, a meteorology professor at Texas A&M University, the county “achieved the triple-triple: at least 100 days reaching at least 100 degrees.”

Bastrop County, meanwhile, has become a tinderbox. Wildfires are happening there with greater frequency and intensity for a variety of reasons, including rising temperatures and worsening drought as well as population growth and development. In 2011, the county experienced the worst wildfire in Texas history, which destroyed more than 1,600 homes and caused $325 million in damage. Two years ago, in October 2015, the Hidden Pines Fire torched 7 square miles in the county and burned down 64 buildings.

White’s neighbors know better

White may refuse to acknowledge what is happening in her own back yard, but most of her neighbors realize that human-caused climate change is indeed a problem, according to polling data released last March by the Yale Program on Climate Change Communication. The survey, conducted in 2016 in every county nationwide, found that a majority of residents in Bastrop and Presidio counties — 67 percent and 78 percent respectively — understand that global warming is happening, while more than half of the respondents in both counties (52 percent in Bastrop and 62 percent in Presidio) know it is mainly caused by human activity.

Majorities in both counties also want something done about it. More than 70 percent want carbon dioxide regulated as a pollutant and at least 65 percent in both counties want states to require utilities to produce 20 percent of their electricity from renewables.

Given their responses, White’s neighbors in Bastrop and Presidio counties make it clear that if they were polled on whether she should become the next chair of a little-known but powerful White House office that oversees federal environmental and energy policies, a majority would likely say no — and with good reason: Unlike White, for them, seeing is believing.

Scientists, Please Don’t Listen to Scott Pruitt

UCS Blog - The Equation (text only) -

Everything about EPA Administrator Scott Pruitt’s directive to change the agency’s science advisory boards was damaging to the way that science informs policy at our nation’s premier public health agency. Mr. Pruitt based his action on a set of false premises. The logic of the action is fundamentally flawed and turns the idea of conflict of interest on its head. The specific appointees are people with deep conflicts of interest who have long espoused views about threats to public health that diverge from the weight of scientific evidence on many issues. In fact, in a slip of the tongue at the start of the press conference, Mr. Pruitt said, “We are here to change the facts [FACs]…I mean the FACA (Federal Advisory Committee Act committees).” He had it right the first time.

But in some sense the most disturbing statement Mr. Pruitt made was that scientists had to make a choice—either to pursue research grants or to engage in public service by serving on an advisory committee. This is a false choice of the first order. I hope scientists everywhere categorically reject the idea of a choice between doing research and serving as advisors to public agencies. In fact, I believe that it is scientists who have been and perhaps still are active researchers—on the cutting edge of knowledge—who should be providing scientific advice to government. Obtaining a government research grant never buys one’s loyalty to any particular policy position. That may be a convenient political talking point for Mr. Pruitt and his supporters like Cong. Smith (R-TX) or Sen. Inhofe (R-OK) who joined him for the announcement of his new directive, but it is still nonsense.

I believe that serving on a government advisory committee is public service and something that every scientist who has the opportunity and inclination should seriously consider. Many universities have public service as part of their core mission, right alongside teaching and research. Serving on an advisory committee is one way that broader service to the public grows out of the day-to-day work of science. And it is exactly because you do outstanding research that your voice is so important as an independent source of scientific information in the process of making public policy.

So please don’t choose between public service and grant-funded research. I for one hope that more scientists will try to do both. Just because Scott Pruitt is hostile to scientists in public life doesn’t mean you should stay out—beyond serving in advisory committees, here are other ways you can put science to work for people. Don’t make the false choice Scott Pruitt called for.

 

Would Chemical Safety Measures Under Dourson Protect Military Families? Probably Not.

UCS Blog - The Equation (text only) -

Dr. Michael Dourson, a toxicologist with a history of providing consultation to the chemical industry, could become the head of the Office of Chemical Safety and Pollution Prevention (OCSPP) at the Environmental Protection Agency (EPA). Dourson has consistently defended the use of several chemicals found to pose major adverse health effects, manipulating his research in favor of industry interests. This could spell trouble for public health and safety, particularly in low-income communities and communities where residents are predominantly people of color—which often includes military bases.

Over this past summer, ProPublica released a series of articles on the excessive toxic pollution problems at military bases. This immediately caught my attention: I work on chemical safety issues at UCS and spent my formative years living on army bases around the country. Although I had passing knowledge of the dangerous chemical agents at storage sites on a base in Aberdeen, Maryland (including nerve and blistering agents like mustard gas, sarin, tabun, and lewisite), I never once considered the impact exposure to toxics might pose to military personnel and their families, let alone the potential for exposure from burning of munitions, toxic releases, and proximity to Superfund sites. I naively assumed we were safe from harm, and didn’t give a second thought to the acrid odors wafting in the air. Who would knowingly put the people who fight for our country at risk in their own homes?

Can we trust Dourson to keep military families safe?

Judging from his track record of downplaying the health risks posed by several EPA-regulated compounds, including 1,4-dioxane, 1-bromopropane, trichloroethylene (TCE), and chlorpyrifos (which are currently under review), I don’t believe Dourson has the best interests of military families in mind. I worry that exposure to toxics on military bases may only worsen under his industry-partial leadership. I am not alone in my sentiments: retired U.S. Army Lieutenant Colonel and current U.S. Senator Tammy Duckworth (D-Ill.) has been critical of Dourson, calling his work on toxic chemicals “reckless.” She is acutely aware of the contamination and associated health effects at military bases like Camp Lejeune in North Carolina, where drinking water is highly contaminated by perfluorooctanoic acid (PFOA). Interestingly, Dourson researched PFOA—a chemical linked to prostate, kidney, and testicular cancer—and came to the convenient conclusion that a weaker safety standard than what EPA recommends would be just fine.

That is why this potential appointment is personal. If past administrations have done a substandard job of handling chemical concerns, putting an industry shill in charge of limiting and preventing exposure to toxic chemicals may result in even less protection for the public.

“[The sentiment is that] what we don’t know can’t hurt us. We don’t know [what is going on], we’re on a mission! But when you get out, you’re on your own. How do you know what’s going on in your system after 20 years [of service]?” – my dad, a former Army drill sergeant, Airborne Ranger, and Air Assault instructor, on the lack of information given to military personnel and their families. He hates taking photos.

Conflict of interest is an understatement

It’s obvious Pruitt and his team intend to dismantle regulatory protections in favor of industry based on their actions to date, as well as the nominations and appointments of chemical industry advocates, including Dr. Nancy Beck (former representative of the American Chemistry Council) and Michael Dourson.

Dourson’s past work includes giving the green light to several chemicals that have been shown to have serious adverse health effects. He has even weighed in on TCE, a toxic chemical prominent on military bases, arguing for weaker safety standards. See a list of locations where chemicals he has “blessed” have been found at alarming levels here. Of the states, towns, counties, and cities listed, I have lived in four at various stages of my life. Nearly two decades later, I’m just now uncovering this. I’ll let that sink in.

We must defend the defenders

Veterans Day is approaching, which means food, retail, and recreation discounts for military veterans and active-duty personnel. This is a nice (if cursory) gesture of gratitude, but it’s superficial at best considering the challenges our veterans and military families face. Our country’s leaders profess to have the utmost respect for our military, even tearing the nation into a frenzy over a peaceful protest by claiming that kneeling for the national anthem disrespects those who have fought for our freedom. Is this the brownfield we want to die on? Our military need more than lip service and deserve better than Dourson.

Charise Johnson

Grand Gulf: Three Nuclear Safety Miscues in Mississippi Warranting NRC’s Attention

UCS Blog - All Things Nuclear (text only) -

The Nuclear Regulatory Commission (NRC) reacted to a trio of miscues at the Grand Gulf nuclear plant in Mississippi by sending a special inspection team to investigate. While none of the events had adverse nuclear safety consequences, the NRC team identified significantly poor performance by the operators in all three. The recurring performance shortfalls instill little confidence that the operators would perform successfully in the event of a design-basis or beyond-design-basis accident.

The Events

Three events prompted the NRC to dispatch a special inspection team to Grand Gulf:

(1) failure to recognize that reactor power fluctuating up and down by more than 10% during troubleshooting of a control system malfunction in June 2016 exceeded a longstanding safety criterion calling for immediate shutdown,

(2) failure to recognize in September 2016 that the backup reactor cooling system relied upon when the primary cooling system broke was unable to function if needed, and

(3) failure to understand how a control system worked on September 27, 2016, resulting in the uncontrolled and undesired addition of nearly 24,000 gallons of water to the reactor vessel.

(1) June 2016 Reactor Power Oscillation Miscue

Figure 1 shows the main steam system for a typical boiling water reactor like Grand Gulf. The reactor vessel is not shown but is located off its left side. Heat produced by the reactor core boils water. Four pipes transport the steam from the reactor vessel to the turbine. The steam spins the turbine which is connected to a generator (off the right side of Figure 1) to make electricity.

Fig. 1 (Source: Nuclear Regulatory Commission)

Periodically, operators reduce the reactor power level to about 65% power and test the turbine stop valves (labeled SV in Figure 1). The stop valves are fully open when the turbine is in service, but are designed to rapidly close automatically if a turbine problem is detected. When the reactor is operating above about 30 percent power, closure of the stop valves triggers the automatic shutdown of the reactor. Below about 30 percent power, the main steam bypass valves (shown in the lower left of Figure 1) open to allow the steam flow to the main condenser should the stop valves close.

Downstream of the turbine stop valves are the turbine control valves (labeled CV in Figure 1.) The control valves are partially open when the turbine is in service. The control valves are automatically re-positioned by the electro-hydraulic control (labeled EHC) system as the operators increase or decrease the reactor power level. Additionally, the EHC system automatically opens the three control valves in the other steam pipes more fully when the stop valve in one steam pipe closes. The EHC system and the control valve response time is designed to minimize the pressure transient experienced in the reactor vessel when the steam flow pathways change.

The test involves the operators closing each stop valve to verify these safety features function properly. During testing on June 17, 2016, however, unexpected outcomes were encountered. The EHC system failed to properly reposition the control valves in the other lines when a stop valve was closed, and later when it was re-opened. The control system glitch caused the reactor power level to swing between 63% and 76%.

Water flowing through the core of a boiling water reactor is heated to the boiling point. By design, the formation of steam bubbles during boiling acts like a brake on the reactor’s power level. Atoms splitting within the reactor core release heat. The splitting atoms also release neutrons, subcomponents of the atoms. The neutrons can interact with other atoms to cause them to split in what is termed a nuclear chain reaction. The neutrons emitted by splitting atoms have high energy and high speed. The neutrons get slowed down by colliding with water molecules. While fast neutrons can cause atoms to split, slower neutrons perform this role significantly better.

The EHC system problems caused the turbine control valves to open wider and close further than was necessary to handle the steam flow. When the control valves opened wider than necessary, the pressure inside the reactor vessel dropped, allowing more steam bubbles to form. With fewer water molecules around to slow down the fast neutrons, fewer neutrons were slowed enough to cause atoms to split. The reactor power level dropped as the neutron chain reaction rate slowed.

When turbine control valves closed more than necessary, the pressure inside the reactor vessel increased. The higher pressure collapsed steam bubbles and made it harder for new bubbles to form. With more water molecules around, more neutrons interacted with atoms to cause more fissions. The reactor power level increased as the neutron chain reaction rate quickened.
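The causal chain in the two preceding paragraphs can be condensed into a few lines of Python. This is a purely qualitative sketch that encodes only the sign of each effect; it is not a reactor model.

    # Qualitative sketch of the BWR void feedback described above.
    # Encodes only the direction of each effect; not a physics model.

    def power_response(valve_movement):
        if valve_movement == "opened too far":
            # Wider valves -> pressure drops -> more steam bubbles ->
            # fewer neutrons slowed by water -> fewer fissions.
            return "power decreases"
        if valve_movement == "closed too far":
            # Narrower valves -> pressure rises -> bubbles collapse ->
            # more neutrons slowed -> more fissions.
            return "power increases"
        return "power steady"

    # The EHC glitch alternated between the two cases, so power cycled
    # between roughly 63% and 76% until the automatic scram.
    for move in ("closed too far", "opened too far"):
        print(move, "->", power_response(move))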

Workers performed troubleshooting of the EHC system problems for 40 minutes. The reactor power level fluctuated between 63% and 76% as the turbine control valves closed too much and then opened too much. Finally, a monitoring system detected the undesired power fluctuations and automatically tripped the reactor, causing all the control rods to rapidly insert into the reactor core and stop the nuclear chain reaction.

The NRC’s special inspection team reported that the control room operators failed to realize that the 10% power swings exceeded a safety criterion that called for the immediate shutdown of the reactor. Following a reactor power level instability event at the LaSalle nuclear plant in Illinois in March 1988, Grand Gulf and other boiling water reactors revised their operating procedures in response to an NRC mandate to require reactors to be promptly shut down when the reactor power level oscillated by 10% or more.

EHC system problems causing unwanted and uncontrolled turbine control valve movements had been experienced eight times in the prior three years. Operators wrote condition reports about the problems, but no steps had been taken to identify the cause and correct it.

Consequences

Due to the intervention of the monitoring system that triggered the automatic reactor scram, this event did not result in fuel damage or a release of radioactive materials exceeding normal, routine releases. But that outcome was achieved despite the operators’ efforts, not because of them. The operators’ training and procedures should have led them to manually shut down the reactor when its power level swung up and down by more than 10%. Fortunately, the plant’s protective features intervened to remedy their poor judgment.

(2) September 2016 Backup Reactor Cooling System Miscue

On September 4, 2016, the operators declared residual heat removal (RHR) pump A (circled in red in the lower middle portion of Figure 2) to be inoperable after it failed a periodic test. The pump was one of three RHR pumps that can provide makeup cooling water to the reactor vessel in case of an accident. RHR pumps A and B can also be used to cool the water within the reactor vessel during non-accident conditions. Grand Gulf’s operating license only permitted the unit to continue running for a handful of days with RHR pump A inoperable. So, the operators shut down the reactor on September 8 to repair the pump.

Fig. 2 (Source: Nuclear Regulatory Commission)

The operating license required two methods of cooling the water within the reactor vessel during shut down conditions. RHR pump B functioned as one of the methods. The operators took credit for the alternate decay heat removal (ADHR) system as the second method. The ADHR system is shown towards the upper right of Figure 2. It features two pumps that can take water from the reactor vessel, route it through heat exchangers, and return the cooled water to the reactor vessel. The ADHR system’s heat exchangers are supplied with cooling water from the plant service water (PSW) system. Warmed water from the reactor vessel flows through hundreds of metal tubes within the ADHR heat exchangers. Heat conducted through the tube walls gets carried away by the PSW system.

By September 22, workers had replaced RHR pump A and successfully tested the replacement. The following day, operators attempted to place the ADHR system in service prior to removing RHR pump B from service. They discovered that all the PSW valves (circled in red in the upper right portion of Figure 2) to the ADHR heat exchangers were closed. With these valves closed, the ADHR pumps would only take warm water from the reactor vessel, route it through the ADHR heat exchangers, and return it to the reactor vessel without any cooling.

The operating license required workers to check each day that both reactor water cooling systems were available during shutdown. Each day between September 9 and 22, workers performed this check as a paperwork exercise. No one ever walked out into the plant to verify that the ADHR pumps were still there and that the PSW valves were still open.

The NRC team determined that workers closed the PSW valves to the ADHR heat exchangers on August 10 to perform maintenance on the ADHR system. The maintenance work was completed on August 15, but the valves were mistakenly not re-opened until September 23 after being belatedly discovered to be mis-positioned.

Consequences

Improperly relying on the ADHR system in this event had no adverse nuclear safety consequences. The ADHR system was relied upon as a backup to the primary reactor cooling system, which successfully performed that safety function. Had the primary system failed, the ADHR system would not have been able to take over that function as quickly as intended. Fortunately, the ADHR system’s vulnerability was never exploited.

(3) September 2016 Reactor Vessel Overfilling Miscue

On September 24, Grand Gulf was in what is called long cycle cleanup mode. Water within the condenser hotwell (upper right portion of Figure 3) was being sent by the condensate pumps through filter demineralizers and downstream feedwater heaters before recycling back to the condenser via the startup recirculation line. A closed valve prevented this water from flowing into the reactor vessel. Long cycle cleanup mode allows the filter demineralizers to remove particles and dissolved ions from the water. Water purity is important in boiling water reactors because any impurities tend to collect within the reactor vessel rather than being carried away with the steam leaving the vessel. The water in the condenser hotwell is the water used over and over again in boiling water reactors to make the steam that spins the turbine-generator.

Fig. 3 (Source: Nuclear Regulatory Commission)

Workers were restoring RHR pump B to its standby alignment following testing. The procedure they used directed them to open the closed feedwater valve. This valve was controlled by three pushbuttons in the control room: OPEN, CLOSE, and STOP. As soon as this valve began opening, water started flowing into the reactor vessel rather than being returned to the condenser.

The operator twice depressed the CLOSE pushbutton, wanting very much for the valve to re-close. But this valve was designed to travel to the fully opened position after the OPEN pushbutton was depressed and to the fully closed position after the CLOSE pushbutton was depressed. By design, the valve would not change direction until after it had completed its full travel.

Unless the STOP pushbutton was depressed. The STOP pushbutton, as implied by its label, caused the valve’s movement to stop. Once stopped, depressing the CLOSE pushbutton would close the valve and depressing the OPEN pushbutton would open it.
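The described behavior amounts to a small state machine. Here is a minimal toy model of it; the class name, step size, and structure are invented for illustration and have nothing to do with the plant’s actual control circuitry.

    class ThreeButtonValve:
        # Toy model of the three-pushbutton valve logic described above.
        # Invented for illustration; not the plant's actual control system.
        def __init__(self):
            self.position = 0    # percent open: 0 = fully closed, 100 = fully open
            self.target = None   # "OPEN" or "CLOSE" while traveling

        def press(self, button):
            if button == "STOP":
                self.target = None       # halt travel mid-stroke
            elif self.target is None:
                self.target = button     # accepted only when not traveling
            # OPEN/CLOSE presses during travel are ignored: the valve
            # completes its full stroke before accepting a new command.

        def step(self):
            if self.target == "OPEN":
                self.position = min(100, self.position + 10)
                if self.position == 100:
                    self.target = None
            elif self.target == "CLOSE":
                self.position = max(0, self.position - 10)
                if self.position == 0:
                    self.target = None

    valve = ThreeButtonValve()
    valve.press("OPEN")
    for _ in range(3):
        valve.step()          # valve is now 30 percent open
    valve.press("CLOSE")      # ignored -- still traveling open
    valve.press("STOP")       # halts the stroke
    valve.press("CLOSE")      # now accepted -- valve reverses toward closed

In this toy model, as at Grand Gulf, a CLOSE press during opening travel is simply ignored; only a STOP press halts the stroke and lets the next OPEN or CLOSE press take effect.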

According to the NRC’s special inspection team, “operations personnel did not understand the full function of the operating modes of [the] valve.” No operating procedure directed the operators to use the STOP button. Training in the control room simulator never covered the role of the STOP button because it was not mentioned in any operating procedures.

Unable to use the installed control system to advantage, the operator waited until the valve traveled fully open before getting it to fully re-close. But the valve is among the largest and slowest valves in the plant, more like an elephant than a cheetah in its speed.

During the time the valve was open, an estimated 24,000 gallons of water overfilled the reactor vessel. As shown in Figure 4, the vessel’s normal level is about 33 inches above instrument zero, or about 201 inches above the top of the reactor core. The 24,000 gallons filled the reactor vessel to 151 inches above instrument zero.

Fig. 4 (Source: Nuclear Regulatory Commission)
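Those numbers can be cross-checked with back-of-the-envelope arithmetic. The sketch below assumes the vessel presents a roughly uniform cylindrical cross-section over that level range, which the actual internals complicate, so treat it as a rough consistency check rather than a plant calculation.

    # Back-of-the-envelope check on the overfill numbers in the text.
    # Assumes a uniform cylindrical cross-section over the level range,
    # which ignores vessel internals -- a rough sanity check only.
    import math

    GALLON_IN3 = 231.0                 # cubic inches per US gallon

    overfill_gal = 24_000.0            # gallons added (from the event report)
    level_rise_in = 151.0 - 33.0       # inches, final level minus normal level

    gal_per_inch = overfill_gal / level_rise_in        # ~203 gal per inch
    area_in2 = gal_per_inch * GALLON_IN3               # effective flow area
    diameter_ft = math.sqrt(4.0 * area_in2 / math.pi) / 12.0

    print(f"{gal_per_inch:.0f} gallons per inch of level rise")
    print(f"implied vessel diameter: {diameter_ft:.1f} feet")  # ~20 ft

An implied diameter of roughly 20 feet is plausible for a large boiling water reactor vessel, so the reported gallons and level figures hang together.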

Consequences

The overfilling event had no adverse nuclear safety consequences (unless revealing procedure inadequacies, insufficient training, and performance shortcomings counts).

NRC Sanctions

The NRC’s special inspection team identified three violations of regulatory requirements. One violation involved inadequate procedures for the condensate and feedwater systems that resulted in the reactor vessel overfilling event on September 24.

Another violation involved crediting the ADHR system for complying with an operating license requirement between September 9 and 22 despite its being unable to perform the necessary reactor water cooling role due to closed valves in the plant service water supply to the ADHR heat exchangers.

The third violation involved inadequate verification of the ADHR system availability between September 9 and 22. Workers failed to properly verify the system’s availability and had merely assumed it was a ready backup.

UCS Perspective

The trilogy of miscues, goofs, and mistakes that prompted the NRC to dispatch a special inspection team has a common thread. Okay, two common threads, since all three happened at Grand Gulf. All three miscues reflected very badly on the operations department.

During the June power fluctuations miscue, the operators should have manually scrammed the reactor, but failed to do so. In addition, operators had experienced turbine control system problems eight times in the prior three years and initiated reports intended to identify the causes of the problems and remedy them. The maintenance department could have, and should have, reacted to these reports earlier. But the operations department could have, and should have, insisted on the recurring problems getting fixed rather than meekly adding to the list of unresolved problem reports.

During the September backup cooling system miscue, many operators over nearly two weeks had many opportunities to notice that the ADHR system would not perform as needed due to mispositioned valves. The maintenance department could have, and should have, re-opened the valves when the maintenance work was completed instead of leaving a trap for the operators. But the operators are the only workers at the plant licensed by the NRC to ensure that regulatory requirements intended to protect the public are met. They failed that legal obligation again and again between September 9 and 22.

During the September reactor vessel overfilling event, the operators failed to recognize that opening the feedwater valve while in long cycle cleanup mode would send water into the reactor vessel. That’s a fundamental mistake that’s nearly impossible to justify. The operators then compounded that mistake by failing to properly use the installed control system to mitigate the event. They simply did not understand how the three pushbutton controls worked and thus were unable to use them properly.

The poor operator performance that is the common thread among the trio of problems examined by the NRC’s special inspection team inspires little to no confidence that the operators would perform any better during a design basis or beyond design basis event.

While We Aren’t Paying Attention, the Trump Administration is Making Products Less Safe

UCS Blog - The Equation (text only) -

Have you ever checked to see if a product has been recalled because of a safety concern? As the parent of a young child, I am deeply familiar with this task. Babies are expensive and buying used products cuts costs, but it’s crucial to check whether those products have been recalled, because baby products are often recalled over safety concerns. When you have a little one, you want to protect them as best you can. But now, the Trump administration is putting my family and yours at risk.

The Consumer Product Safety Commission: Keeping our families safe

To our nation’s benefit, there’s the Consumer Product Safety Commission (CPSC). This little-known federal agency plays a crucial role in making sure that the products we bring into our homes and trust with our families’ lives are safe. I depend on this every day when I put my child down for a nap, put him in a car seat, or give him a toy. Because of the CPSC, I trust that the crib won’t injure him, that the car seat is built properly, and that his toys don’t have parts he can choke on.

You might only hear about these kinds of recalls when they’re high-profile, like those scooters that everyone got for Christmas one year that had a tendency to catch fire, or the exploding Android phone debacle. But the reason you don’t hear more about these issues is that the CPSC is doing its job. Scientists at the CPSC monitor product injuries and deaths, issue recalls, and work with companies to help prevent unsafe products from ever reaching the market.

Dana Baiocco: A dangerous pick for CPSC commissioner

Now, the Trump administration is threatening the CPSC’s ability to keep us safe. President Trump’s nominee for CPSC commissioner, Dana Baiocco, who will be voted out of committee tomorrow on the Hill, has spent her career defending companies whose products have harmed people (check out this reporting from Sharon Lerner at The Intercept). When people fought for justice because a negligent company’s asbestos had given their loved ones mesothelioma, Baiocco was making sure widows wouldn’t get their money. When Yamaha knowingly kept unsafe ATVs on the market that injured and killed several people, including children, Baiocco worked to make sure the families didn’t get compensation. When Volkswagen was caught cheating on its emissions testing, Baiocco was there to defend it. And when the tobacco conglomerate R.J. Reynolds needed help defending itself against harms caused by smoking, Baiocco was there too, shielding the tobacco giant from cancer victims.

Clearly, Baiocco is the wrong choice for the CPSC. Nothing about this past gives me confidence that she’ll use science to make decisions in the public interest if she is appointed a CPSC commissioner.

Would Baiocco keep us safe from harmful flame retardants?

This year the CPSC is slated to work on organohalogen flame retardants. As my colleague Genna Reed reported last month, the CPSC made the science-based decision to phase out this harmful class of flame retardants from products, despite chemical industry opposition. This was a huge victory for science and for public health. I celebrated this move. No longer would I have to spend hours reading labels, poring over scientific studies, and buying costly foreign baby products to avoid exposing my child to these unsafe flame retardants.

Now the CPSC will be implementing that rule. Baiocco’s nomination will have a huge impact on how that implementation happens. Commissioners have a lot of power when it comes to implementation, timing, and overall agency priorities. Will harmful flame retardants be phased out under a proper timeline and sufficiently eliminated from products? If Baiocco becomes commissioner, this flame-retardant rule could be delayed or weakened in its implementation, and that won’t be a victory for anyone other than the companies that produce them.

The dangers of a politicized CPSC: The case of the lead lunch boxes

We don’t have to look too far to see the devastating consequences of a CPSC where science is compromised. In 2005, under the George W. Bush administration, the agency tested children’s lunch boxes and found unsafe levels of lead. In one test on a Spiderman lunch box, the agency found 16 times the federal standard for lead. Rather than immediately announce this finding and recall a potentially unsafe product, the CPSC changed its lead testing technique and employed an averaging scheme that scientists said underestimated the level of lead in the lunch boxes. With the backing of the vinyl industry, the CPSC continued to defend this testing method while allowing the product to stay on the market, potentially exposing children to lead poisoning.

The Senate Commerce Committee should vote no on Dana Baiocco for CPSC commissioner

As a mom, I worry a lot about the safety of my son. There is nothing more important to me than making sure he can grow up in a safe environment. I know I can’t keep him safe from every danger in the world, but I can make sure he’s surrounded by safe cribs, strollers, car seats, and toys. In order to do that though, I depend on a CPSC that uses science and works in the public interest.

And so I ask the members of the Senate Commerce Committee, do you trust that Baiocco will keep your family safe? Do you have confidence that she will make sure that my child and yours are protected from unsafe baby products? If a recall would be inconvenient to a company’s bottom line, would she still prioritize public safety over corporate profits? How will Americans know if products are safe to use in our homes? This isn’t just a policy preference. This could cost American lives, and Baiocco is not on our side. As a parent and a scientist, I urge you to vote no tomorrow for the safety of all Americans.

 
