Combined UCS Blogs

Farmers to Trump: Don’t Walk Away from Climate Action

UCS Blog - The Equation (text only)

There’s a little good news from farm country. Last week, the National Farmers Union (NFU)—a grassroots organization representing 200,000 farmers, fishers, and ranchers with affiliates in 33 states—publicly urged President Trump to keep the United States’ commitment to global climate action.

I was thrilled, and a little surprised, though I shouldn’t have been. NFU has supported the Paris Agreement since its adoption in 2015, and the nation’s second-largest farm organization is progressive when it comes to environmental issues. Still, the NFU’s strongly worded statement was a good reminder that farmers aren’t a monolith, and that while some farm groups have their heads stuck firmly in the sand, there’s hope for a future in which farmers help avert the worst impacts of climate change on the land and our food supply.

NFU farmers are climate leaders

In his statement last week, NFU president Roger Johnson put it simply: “The Paris Agreement is vital to enhancing the climate resiliency of family farm operations and rural communities, and it allows family farmers and ranchers to join carbon sequestration efforts that stimulate economic growth in rural America.”

At its annual convention in March 2016, NFU members voted to “lead the way” on climate change. The policy resolution they adopted notes that farmers and rural residents are “a large part” of the climate solution because of their role in generating renewable energy and sequestering carbon in soils. It commits NFU to educating its own members about ways they can “adapt to the effects of climate change on their respective operations, as well as the enormous economic benefits that homegrown renewable energy brings to our rural areas.”

And it endorses policy solutions, including a transition from fossil fuels to renewable energy and voluntary conservation practices that focus on water quality and quantity concerns. Not least, the NFU resolution urges Congressional funding of land-grant universities and the USDA “to do the necessary research to help farmers and ranchers better increase the water holding capacity and resiliency of our nation’s soils through changing cropping patterns, production and conservation practices, and carbon sequestration.” (Otherwise known as agroecology. Nearly 500 scientists agree.)

A year later, NFU is making good on its commitment. In addition to the statement about Paris, the organization has launched a change.org petition calling on Congress to include “opportunities to enhance climate resiliency and mitigate climate change” in the 2018 farm bill. That petition has more than 30,000 signatures. And through its Climate Leaders program and Facebook group, NFU has created a forum to spread awareness and spur action by farmers.

Climate action is good for farmers

NFU says it’s taking this stand because its members are on the front lines of climate change and are already feeling the volatility of our changing climate. Indeed, it’s becoming increasingly clear that the nation’s farmers not only can be part of the climate solution, but must be in order to survive.

This hasn’t gone unnoticed by the media, as evidenced by a spate of recent coverage. A fifth-generation Iowa farmer has described the climate challenges and opportunities he sees for himself and his fellow farmers. The New York Times earlier this year identified a subset of farmers in conservative states who are practicing climate-friendly agriculture without ever talking about the climate. And the Huffington Post has the stories of six farmers who are taking a whole range of actions on their land—employing water-conserving practices, or diversifying crops—in order to increase their climate resilience. (One of them even has a 35-page climate adaptation plan!)

Denial and delay put farmers at risk

At the same time, too many farmers will not publicly acknowledge climate change. An annual survey of Iowa farmers asked respondents about their views on climate change in 2011, and again two years later. The 2013 results moved slightly in the direction of agreement that climate change is happening and that humans are mostly to blame. Still, only 16 percent of farmers surveyed took that view, while a much larger fraction of respondents—nearly a quarter!—agreed with the statement that there is “not enough evidence to know with certainty whether climate change is occurring or not.”

This misperception is aided and abetted by the nation’s largest and most powerful farm organization, the American Farm Bureau Federation (Farm Bureau, for short). A lumbering dinosaur, the Farm Bureau continues to pretend climate change isn’t really happening, or if it is, no one can really know why. A cynical policy statement on its website sows doubt: “Some scientists,” the statement says slyly, have connected human activities to increased average global temperatures, and “some scientists” have predicted more extreme weather. Then it cuts to the chase: “Imposing regulations based on unproven technologies or science causes increased costs to produce food, feed, fuel and fiber without measurably addressing the issue of climate.” (emphasis added)

Such rhetoric inflames the worry of many farmers that accepting the reality of climate change will make them vulnerable to new costs, a very serious concern right now with rock-bottom prices for farm products, farm incomes plummeting, and debt escalating. That’s why it’s important to point out how climate-smart farm practices can help farmers save money on input costs, improve soil health, and perform better in drought and flood conditions. And how, instead of imposing new costs, this kind of farming could create new revenue streams for environmental services.

Farmers need information and technical support

While some farmers are plowing ahead with climate action and others are following the Farm Bureau’s non-lead, a third subset is uneasy about what climate change will bring but unsure of what to do. And this group hasn’t received enough help to date. Yes, the Obama USDA boosted climate-related research—spending more than $650 million since 2009, according to then-Secretary Vilsack last year—and in 2014 established a network of regional “climate hubs” to translate science into practical advice and assistance to farmers. But the nation’s farmers need even more information, education, and support, and they need to be hearing about the need for climate action from people they trust.

Unfortunately, President Trump and his agriculture secretary nominee Sonny Perdue (who may finally be confirmed by a Senate vote scheduled for next week) aren’t exactly inspiring confidence on that front. Like his would-be boss, Perdue has a history of public climate skepticism, and it’s an open question whether he’ll move to reverse progress made by his predecessor to help farmers cope.

All this is why NFU’s vocal support for real action to combat climate change and adapt to the reality of our climate future is so important.

So today I’d like to say thank you to the 200,000 farmers of NFU. We need you, and we’re glad to stand with you.

Why I March for Science: The Frightening Risks We Aren’t Talking About


This post has also been published at ScienceNode.org.

“Thank you, Dr. Goldman. That was frightening,” moderator Keesha Gaskins-Nathan said to me after I spoke last week as the only scientist at the Stetson University Law Review Symposium. My talk covered the ways that the role of science in federal decisionmaking is being degraded by the Trump administration, by Congress, and by corporate and ideological forces. Together these alarming moves are poised to damage the crucial role that science plays in keeping us all safe and healthy. This is why I will march for science this Saturday.

Science-based policy as we know it could change forever. Indeed, some of its core tenets are being chipped away, and a lot is at stake if we fail to stop it. We are currently witnessing efforts by this administration to freeze and roll back the federal government’s work to protect public health and safety, attempts by Congress to pollute the science advice that decisionmakers depend on, and the appointment of decisionmakers who are openly hostile to the very missions of the science agencies they now lead.

A democracy rooted in science

We cannot afford to make policy decisions without science. This is why I will march. Photo: UCS/Audrey Eyring

America has a strong tradition of using evidence to inform policy. Past leaders of this country understood the value of making sure independent science—without the interference of politics—could inform government decisions. There are, of course, factors beyond science that go into policy decisions, but the scientific information feeding into a policy process should remain unaltered. This system works. While imperfect, it has by and large allowed the nation to ensure scientific integrity in policy decisions and to prosper.

For example, under the Clean Air Act, air pollution standards are developed to protect public health. Let’s take ground-level ozone. Every five years, the EPA conducts exhaustive research on the relationship between ozone and health. A team of ozone experts from universities and other institutions across the country convenes to discuss the science and make an official recommendation to the agency. The EPA then uses the scientific recommendation to set a new ozone standard.

This process allows science to be collected and debated separately from the policy discussion in a transparent way. This means the public can scrutinize the process, minimizing the potential for political interference in the science. The process also means the public will know if the policy doesn’t follow the scientific evidence and can hold decisionmakers to account (as they have in the past). But largely, this process has worked. Even in the face of tremendous political and corporate pressures, the EPA sets science-based air pollution standards year after year.

Threats to science-based America

We cannot afford to make decisions any other way. But now, this very process by which we make science-based policies in this country is under threat.

  • Our decisionmakers have deep conflicts of interest, disrespect for science, and aren’t being transparent. This is a recipe for disaster. How can our leaders use science effectively to inform policy decisions if they can’t even make independent decisions and don’t recognize the value of science? EPA Administrator Scott Pruitt, for example, said this month that carbon dioxide “is not a primary contributor to global warming.” (It is.) This blatant misinformation about climate science comes on top of his extensive record of suing the agency over the science-based ozone rule I just described (among other rules). This type of disrespect for science-based policies from cabinet members is an alarming signal of the kind of scientific integrity losses we can expect under this administration.
  • Congress is trying to degrade science advice. A cornerstone of science-based policy is the role of independent science advice feeding into policy decisions. But Congress wants to change who sits on science advisory committees and redefine what counts as science. The Regulatory Accountability Act, for example, would threaten how federal agencies can use science to make policy decisions. Past versions of the bill (which has already passed the House this year and is expected to be introduced soon in the Senate) have included troubling provisions. One mandated that government agencies could only use science if all of the underlying data and methods were publicly available, including health data, proprietary data, trade secrets, and intellectual property. In another case, the bill added more than 70 new regulatory procedures that would effectively shut down the government’s ability to protect us from new threats to our health, safety, and the environment. It is a dangerous precedent when politicians—not scientists—are deciding how the scientific process that informs policy decisions should work.
  • Scientists face intimidation, muzzling, and political attacks. No one becomes a scientist because they want a political target on their back. But this is unfortunately what many scientists are now facing. While it won’t be enacted in its current form, the president’s budget proposal reveals his frightening priorities, which apparently include major cuts to science agencies like the EPA, the Department of Energy, and NOAA. Communication gag orders, disappearing data, and review of scientific documents by political appointees in the first month of the administration have created a chilling effect for scientists within the government. Congress has even revived the Holman Rule, which allows it to reduce the salary of a federal employee down to $1. It is easy to see how such powers could be used to target government scientists producing politically controversial science.
Hurting science hurts real people

Importantly, we must be clear about who will be affected most if science-based policymaking is dismantled. In many cases, these burdens will disproportionately fall to low-income communities and communities of color. If we cannot protect people from ozone pollution, those in urban areas, those without air conditioning, and those with lung diseases will be hurt most. If we cannot address climate change, frontline communities in low-lying areas will bear the brunt of it. If we cannot keep harmful chemicals out of children’s toys, families who buy cheaper products at dollar stores will be hurt most. And if we cannot protect people from unsafe drugs (FDA), contaminated food (USDA, FDA), occupational hazards (OSHA), chemical disasters (EPA, OSHA, DHS), dangerous vehicles (DOT) and unsafe consumer products (CPSC), we are all in trouble. This is about more than science. It is about protecting people using the power of science. We have everything to lose.

But we can take action. We can articulate the benefits of science to decisionmakers, the media, and the public. We can hold our leaders accountable for moves they make to dismantle science-based policy process. And we can support our fellow scientists both in and outside of the federal government. It starts with marching, but it cannot end here.

Science Just Saved My Daughter—The Most Important Reason Why I #StandUpForScience


The morning of April 4, 2017 began with excitement. My family and I were ready to fly to Boston, where we were to meet up with friends and their children at a geography conference. Our five-month-old baby Amaia had lost some weight and been all kinds of fussy over the previous two weeks, so we stopped to see her doctor in the morning, thinking she would get some antibiotics for a stomach bug and we would be on our way.

Instead, the doctor called the National Children’s Hospital in Washington, DC, and told them we were coming. Amaia had not nursed or soiled a diaper for too many hours, and her constant grunting told the doctor something was wrong. At the emergency room, things got jarring quickly. The blood, stool, and urine work did not reveal any infections or much else.

But when doctors ordered fluids and these kicked in, her grunting and difficulty breathing got much worse. At that moment, a lot of medical professionals started coming in and out of our room. We got scared when we saw what looked like paramedics standing back, gloves on, ready for action.

It was obvious to us they were waiting for our Amaia to crash so they could jump and resuscitate her.

Cardiologists explained to us that Amaia had a rare congenital malformation called cor triatriatum, which means “three atria in the heart” (instead of the normal two!). An extra layer of heart tissue was making blood flow difficult and, worse, was pushing fluid into her lungs. The surgeon told us very flatly that either we allowed them to operate on her to remove the tissue, give her a blood transfusion, and rebuild the septum, or her heart would collapse at any moment and she would not live.

There was no decision to make, no real choice in front of us. I know the hospital makes one sign consent forms because there are legal issues and people who have religious or other objections to blood transfusions or surgery. My wife believes in God and science; I believe in science and trust the medical professionals to do what they do best.

I looked each of them in the eye before they took her and saw confidence and professionalism. I pleaded to them silently to bring my girl safely back to me.

They did. Amaia spent 6 hours in the operating room. She came out around midnight and we saw her little, fragile but unfathomably resilient body fight for her life. She is now at home recovering quite nicely from her ordeal.

In retrospect, I’ve asked myself how doctors and nurses were able to diagnose and correct her certainly fatal heart malformation. The answer is science. Science built up over the centuries, with increasing medical knowledge, along with technology.

First, doctors conducted blood, urine, and stool analyses on Amaia to rule out viral or bacterial infections. Then X-rays of her chest revealed fluid in the lungs. An electrocardiogram (EKG) showed that the electrical signals of her little heart were off. The final piece of evidence, and the “aha!” moment for the doctors, came with an echocardiogram (a Doppler image of the heart’s structure), which showed clearly that there was a third chamber in the left side of her heart.

A team of the best pediatric cardiac surgeons, nurses, nurse practitioners, and anesthesiologists worked to install a cardiopulmonary bypass—essentially a pump—to reroute her heart’s blood, lower its temperature, stop it for a few hours, and operate to restore our daughter’s heart. She recovered in state-of-the-art cardiac intensive care and heart and kidney recovery units at the hospital, all made possible by scientific discoveries, technological developments, and the care and compassion of the medical personnel.

Being a social scientist, I’ve also asked myself why there was not a clear line of evidence and medical inquiry that led doctors straight to her condition. After talking to the medical personnel, I’ve come to the conclusion that in spite of the advanced state of today’s medical sciences, there is much more to research and understand so we can cure and manage more diseases.

You see, Amaia’s condition occurs in only 0.1 to 0.4 percent of all cases of heart disease. EKGs and echocardiograms done in utero during my wife’s pregnancy did not find anything wrong with her heart. These facts suggest to me that there are knowledge as well as technological limitations to our medical sciences.

These gaps in understanding can only be filled with more research, more funding, and more scientifically sound investigations. But the current administration has proposed to slash the budget for the National Institutes of Health (NIH)—the main federal medical research institute—by nearly 20 percent! Why is this important?

If the top research papers in an internet search are any indication, a lot of the research that made a surgical cure for Amaia’s heart disease possible was funded by the NIH.

There is more to government-funded science, of course, than the NIH. Worrisome proposed budget cuts have combined with political interference in science to create a toxic environment at other federal agencies that work to protect our health. President Trump has taken a “wrecking ball” approach to demolishing climate protections; his head of the EPA consistently denies the reality of climate change; and Trump’s racist and misogynist attacks on immigrants weaken both science and the social fabric of the United States that contributes to a fact- and evidence-based scientific culture.

I am not willing to stand by as science-based protections for air, water, soil, and tiny hearts like Amaia’s are compromised in the name of the special interests of polluting industries. That’s why I will march for science this Saturday, April 22.

And I am not alone in this. Early reports are coming in that across the country and the world, scientists, teachers, parents, workers, and more are getting ready to highlight the value of science to public health and the environment, and to stress that political interference and the wholesale disregard for protections to our health and environment are unacceptable.

Scientific understanding of the world at all scales—the microscopic, the human body, the planet—is needed to face the challenges that threaten us. Our baby Amaia survived her first trial thanks to the power of the women and men of science. We must all #StandUpForScience together.

No President Should Be Able to Start a Nuclear War Single-Handedly

UCS Blog - All Things Nuclear (text only)

Among the general craziness of the 2016 presidential campaign, you can be forgiven if you missed one particular crazy piece of information: the president of the United States currently has the authority to order the launch of nuclear weapons without input from anyone. This has actually been the case for decades, but the campaign brought it to the attention of the general public, many of whom were hearing it for the first time and were understandably surprised, and even somewhat alarmed, at the idea.

Moreover, the president’s sole authority holds even if he is ordering a first strike—an attack that is not in response to a nuclear attack on the United States or its allies, but rather the first use of nuclear weapons, either in an ongoing conflict or to initiate a new one. This last case may be extremely unlikely, but there is currently no law that rules it out.

As former Vice President Dick Cheney said in a 2008 interview, the president “could launch the kind of devastating attack the world has never seen. He doesn’t have to check with anybody, he doesn’t have to call Congress, he doesn’t have to check with the courts.”

The usual policy for working with nuclear weapons requires that two people be present and in agreement before undertaking any procedure. The president should not be exempt from this rule when deciding to launch a first strike. Photo: mako

If, like me, you think this situation makes no sense, then there is some good news—Senator Ed Markey (D-MA) and Representative Ted Lieu (D-CA) are trying to change it. They recently introduced the Restricting First Use of Nuclear Weapons Act of 2017, which would prohibit the president from launching a first nuclear strike without a declaration of war by Congress. This common-sense step would prevent the president, the sole individual with the authority to launch U.S. nuclear weapons, from single-handedly starting a nuclear war.

While the 2016 campaign may have been the catalyst, the underlying problem is independent of who is in office. No single individual should have the authority to launch a nuclear war without extensive discussion, debate, and consideration of all the possible implications. As former Secretary of Defense William Perry has said, “a decision that momentous for all of civilization should have the kinds of checks and balances on Executive powers called for by our Constitution.”

Check out our fact sheet on the Markey-Lieu bill, or read on for more information on why it is so important.

Current Situation

As it stands, the president could wake up tomorrow and simply notify the military that he had decided to order a nuclear strike. There is no requirement that he consult with anyone. He could choose to talk to advisers first, but whether or not he did, no one could stop him if he decided to go forward.

Once he made his decision, the president would use a card, often called the “biscuit,” that he or an aide carries at all times, to read a code to authenticate his identity to the senior officer on duty in the Pentagon’s “war room.” The war room would then prepare the order to send to launch crews on submarines and at command centers for land-based missiles. The time from when the president gives the launch order to when the crews receive it would be only minutes. Land-based missiles would be launched within about five minutes of the president’s order, while it might take about fifteen minutes for submarine-based missiles to launch. And once launched, these missiles cannot be recalled.

In 1974, President Richard Nixon—the last president to draw attention to the significant downsides of this system—noted, “I can go back into my office and pick up the telephone and in 25 minutes 70 million people will be dead.” Later that year, in the thick of the Watergate scandal, Nixon was emotionally unstable and drinking heavily, leading Secretary of Defense James Schlesinger to instruct the Joint Chiefs of Staff that “any emergency order coming from the president”—such as a nuclear launch order—should go through him or Secretary of State Henry Kissinger first. Schlesinger had no real authority to do so, however, and it is not clear what might have happened if such an order had actually come. The same would be true today—there is still no military or civilian official or group with the authority to countermand a presidential order to launch nuclear weapons.

What Would the Bill Do?

The Markey-Lieu bill states that “Notwithstanding any other provision of law, the President may not use the Armed Forces of the United States to conduct a first-use nuclear strike unless such strike is conducted pursuant to a declaration of war by Congress that expressly authorizes such strike.” The bill defines a first-use nuclear strike as “an attack using nuclear weapons against an enemy that is conducted without the President determining that the enemy has first launched a nuclear strike against the United States or an ally of the United States.”

Why Does the President Have Sole Authority?

The main reason the president has had sole authority to launch a nuclear strike is the perceived need to ensure a swift response to an incoming nuclear attack. During the Cold War, U.S. leaders feared a “bolt from the blue” attack by the Soviet Union, which could destroy U.S. land-based missiles if they were not launched quickly. A decision about whether to launch a retaliatory strike would need to be made in the ten minutes or so between the moment an incoming attack was detected, analyzed, and conveyed to the president and the moment the missiles landed.

The nuclear command system was therefore designed for speed rather than deliberation. Its main purpose was to allow the president to launch U.S. nuclear weapons quickly, before they could be destroyed on the ground. This was always dangerous, and has long been unnecessary as well, since the U.S. has submarine-launched missiles that are invulnerable to such an attack and ensure that the U.S. can maintain a deterrent. UCS believes that the United States should end this risky practice by removing its land-based missiles from hair-trigger alert and eliminating rapid response options from its war plans.

However, until the president and the military agree to end these prompt launch options, the Markey-Lieu bill does not affect them. It also intentionally and specifically does not restrict the president’s ability to immediately order the use of U.S. nuclear weapons in response to a nuclear attack.

The situation the bill addresses is the decision to launch a first strike—when the United States is the first to use nuclear weapons against an adversary. In this case time constraints on decision making do not apply, so the streamlined decision process that might be needed in a retaliatory strike is not required. Regardless of whether the decision is to unleash a nuclear first strike as the first move in a conflict, or to use nuclear weapons first in escalating an ongoing conflict, the president would have time to consult with advisers and Congress before making such a potentially world-altering decision.

A Nuclear First Strike Is an Act of War

Make no mistake about it, as the bill states, “By any definition of war, a first-use nuclear strike from the United States would constitute a major act of war.” Nuclear weapons have unparalleled destructive power; their use would break a taboo of more than seventy years. The Constitution clearly establishes that the power to declare war belongs to the Congress alone. Therefore, this bill simply makes explicit an existing Constitutional requirement on the president.

Moreover, bringing Congress into the process would lessen the chance that such a decision could be made irrationally or impulsively. The decision to use nuclear weapons is potentially the most important decision this nation could make, with grave consequences for every citizen of the United States and the world. A decision to use them first should be undertaken only with the utmost caution and—especially in a democracy—should not be left up to any single individual.

Behind the Carbon Curtain: How the Energy Corporatocracy Censors Science


In my forthcoming book, Behind the Carbon Curtain: The Energy Industry, Political Censorship, and Free Speech (University of New Mexico Press), I tell the stories of scientists, artists, and teachers who have been silenced by the collusion of energy corporations and public officials. My purpose is to provide witness, to record events, to give voice—and in so doing to shift the balance of power ever so slightly, to bring us closer to a tipping point of outrage and change.

These stories and my analysis will not change society—at least not these alone. But maybe they will as part of a national narrative that includes the families in Pennsylvania driven from their homes by leaking methane, and whom energy companies compensate only in exchange for their silence. The nation’s story includes the citizens in West Virginia who were sued for libel by a coal company for criticizing the industry in a newsletter. And our country’s narrative involves the professor in the University of Oklahoma’s ConocoPhillips School of Geology and Geophysics who was intimidated into silence when an oil tycoon and major donor demanded the dismissal of scientists studying the link between fracking and earthquakes. Free speech is under attack by the energy industry across the nation.

I’d like to share a few vignettes from the varied and disturbing tales of censorship to provide a sense of what is happening in Wyoming and elsewhere.

A typical fracking operation requires 2 to 8 million gallons of water (along with 40,000 gallons of various, often toxic, chemicals, including acids, alcohols, salts, and heavy metals). The outpouring of tainted wastewater is dumped into lined evaporation pits. Behind the pit can be seen the drill rig and tanks that provide fracturing fluid for the drilling (photo by Ted Wood).

In 2001, Dr. Geoff Thyne was a research scientist in the University of Wyoming’s School of Energy Resources when he was contacted by a reporter from the Wyoming Tribune-Eagle who was investigating the development of an enormous gas field in southeastern Wyoming. When she asked Thyne how much water would be needed for fracking, he offered a range of figures based on the available scientific literature.

After the story came out, a university vice president notified School of Energy administrators that Noble Energy and the Petroleum Association of Wyoming were on the warpath. Thyne explained to the frenzied administrators that he’d “made the comments based on my experience as a member of the scientific advisory board for the current EPA hydraulic fracturing study.”

At a meeting with university and corporate bigwigs, Thyne was ordered to write a full retraction. Mark Northam, the director of the School of Energy Resources, told Thyne: “I will edit your letter and you will sign it. You shouldn’t have said anything and don’t say anything ever again.” Thyne acceded to the director’s revisions, but the scientist refused to retract his estimates of water usage. Soon after, Thyne was fired and told: “Mark Northam gets a lot of money from these oil companies and you are screwing with that.”

The Sinclair Oil Refinery in the eponymous town of 450 stalwart souls. The Wyoming plant processes crude oil at a rate equivalent to the output of about ten fire hoses running 24 hours a day. In 2013, the Wyoming Occupational Safety and Health Administration levied a $707,000 fine for workplace safety violations—the largest such penalty in the state’s history (photo by Scott Kane).

In 2008, the University of Wyoming’s Office of Water Programs was headed by a committed climate change denier who dismissed the findings of the world’s leading experts by saying, “All these climate change models look like a bunch of spaghetti.” Director Gregg Kerr defended the fossil fuel industry by asking, as if this were a serious question, “Are we going to stop energy production and starve to death?”

He convinced the university that any mention of climate change was politically untenable. So Dr. Steve Gray, the state climatologist, met with fierce administrative resistance when he fulfilled his obligations to the people of Wyoming and spoke about climate change.

Eventually, Gray realized that “there was no chance to expand the program to better meet the State’s needs.” He left Wyoming for the US Geological Survey’s Climate Science Center in Alaska, where, Gray says, “It’s not hard for people to see the relevance of climate change when your village is falling into a river as the permafrost melts.” So it is that Steve Gray was the last state climatologist of Wyoming.

In 2014, nobody would’ve foreseen a problem with updating the Next Generation Science Standards, unless they were privy to emails from the chairman of the State Board of Education. Ron Micheli objected to the inclusion of climate change as “fact” rather than “theory” in the standards, and he insisted that “The ice pack is expanding [and] the climate is cooling.”

In the waning minutes of the spring legislative session, Wyoming’s politicians passed a budget footnote prohibiting the use of state funds to implement the science standards. The bill’s author explained that the standards treat “man-made climate change as settled fact… We are the largest energy producing state in the country, so are we going to concede that?” At issue was not the veracity of the science but the vitality of the energy companies. The governor defended this ideological indoctrination with a rhetorical question: “Are the Next Generation Science Standards…going to fit what we want in Wyoming?”

We live in a time in which people take it to be normal that most everything is treated as a commodity—including speech. And in this frenzied marketplace, the energy industry has purchased academic positions, scientific questions, and classroom curricula.

But perhaps there’s hope. Prompted by years of legislative and corporate meddling, the editorial board of the Wyoming Tribune-Eagle [subscription required] put the situation into stark terms:

What is the value of academic freedom? That’s the question all Wyomingites should be asking themselves. To state lawmakers, it is a commodity that can be bought and sold, like coal or oil… What was once non-negotiable at UW now has a price tag on it. Lawmakers have sold the school to the highest bidder—the energy industry…

The journalists also incisively portrayed the nature of self-censorship, which may be the most insidious manifestation of oppression in the scientific community. There is no doubt that researchers simply decide not to pursue certain lines of inquiry, fearing retribution by legislators, CEOs and administrators. But my colleagues at the University of Wyoming have been adamant that they will take what comes, rather than asking me to be quiet. Living behind a carbon curtain of silence is too high a price to pay.

 

Bio: Jeffrey Lockwood earned a Ph.D. in entomology from Louisiana State University and worked for 15 years as an insect ecologist at the University of Wyoming. In 2003, he metamorphosed into a Professor of Natural Sciences & Humanities in the department of philosophy, where he teaches environmental ethics and philosophy of ecology, and in the program in creative writing, where he is the director and teaches workshops in non-fiction. His writing has been honored with a Pushcart Prize, the John Burroughs award, and inclusion in the Best American Science and Nature Writing. You can follow his work through his website, Facebook, and Twitter.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

 

Sustainable Agriculture on the Chopping Block in Iowa

UCS Blog - The Equation (text only) -

There has been unsettling news out of my former home over the last week, as the Iowa legislature plays politics with critical scientific research in the state. In the closing days of the legislative session, two budget bills moved swiftly that could force the closure of the Leopold Center for Sustainable Agriculture, a nationally recognized center for sustainable agriculture research. There were also threats to a research center dedicated to mitigating flood impacts (which I wrote about last year for its excellent forecasting that literally helped save lives), but that now appears to be safe.

A little bit of background: the Leopold Center was established in 1987 by Iowa’s Groundwater Protection Act. This law passed as the farm crisis of the 1980s was raging (it is estimated that nearly one-third of the state’s farms went out of business) and there was growing recognition of the problems associated with soil degradation and water pollution. Forward-thinking Iowa legislators came up with a funding stream – a small fertilizer and pesticide tax that generates several million dollars a year – dedicated to research on alternatives that offset the economic and environmental impacts of agriculture.

The resulting funding stream launched several important research enterprises—for example, a center studying health effects of environmental contaminants at the University of Iowa, long-term agricultural research sites across the state, as well as the Leopold Center, which is based at Iowa State University. Since that time, the Leopold Center’s competitive grants program has funded research that benefits both rural and urban constituents, with projects that range from local food infrastructure to crop diversification to beginner farmer programs. Many of the innovative topics the Center has investigated are now widely accepted largely thanks to its efforts, so it’s important to recognize how critical this type of rare funding support is for encouraging and spreading transformative ideas.

Research far and wide has benefited from the Leopold Center


The Leopold Center’s research not only supports progress at the state level, but also has direct application to progress on a national level.

Our own research here at the Union of Concerned Scientists has benefited from the Leopold Center’s novel work. In our 2016 report, Growing Economies, we evaluated the economic impact of more local food purchasing in the state of Iowa. We were able to do that using survey data generated by the Leopold Center, in which institutional and intermediate food purchasers were asked about their ability to support local food. And in Subsidizing Waste, we calculated the economic impact of scaling up the integration of perennial vegetation into corn and soybean fields, to save money on water clean-up costs. The STRIPs project has long been supported by funding from the Leopold Center. Finally, a report we’re preparing to release next month will detail how a crop rotation system developed at Iowa State and supported by the Center could be expanded, spreading economic and environmental benefits across the state and the Corn Belt.

Also, earlier in my career while I was a Ph.D. student at Iowa State University, I received two Leopold Center research grants to study the long-term impacts and farmer adoption of cover crops. That was an invaluable professional development opportunity for me as an early career scientist: from developing the proposal to helping administer the project and to making decisions on dollars spent.

If a research center like this disappears, it will be yet another significant blow in the broader conversation over how much funding goes toward sustainable agriculture. In a recent analysis, we looked at competitive grants programs within the USDA, concluding that agroecological research (similar to projects supported by the Leopold Center) is woefully underfunded, with less than 15 percent of funding going to projects that included any element of this type of work. We need more of this type of research, not less, and nearly 500 Ph.D.-level scientists agree.

Lawmaker claims “mission accomplished” in sustainable agriculture (LOL!)

In an interview this week, an Iowa state representative claimed: “A lot of people felt that the mission for sustainable agriculture that [the Leopold Center] undertook, that they have completed that mission.” The same lawmaker also claimed that sustainable agriculture research at Iowa State can continue through other channels. These comments either suggest an utter lack of understanding of the realities of sustainable agriculture, or reveal the politics fueling these budget bills.

The agriculture and natural resources committee budget bill directs the Leopold Center to shut its doors this summer, and directs its funds to another center at Iowa State University. That center does not currently have a track record of transparently administering research dollars, and it has a far narrower scope than the current vision of the Leopold Center.

Comments to the tune of “someone else will do the research” always give me pause. The common thread I’ve noticed is that research deemed duplicative or unnecessary often simply doesn’t jibe with financial interests. It is easy to see that research describing less use of pesticides, for example, might be viewed as controversial to powerful business interests. (Many examples of this already exist!)

Further, to claim “mission accomplished” on sustainable agriculture is laughable, and hints at willful ignorance about the current economic and environmental realities in Iowa. Those realities bear similarities to the 1980s: soil erosion and water pollution remain persistent and costly challenges, and farm incomes have been declining steeply for several years.

Research should be free of interference even when the politics are thorny

Even though it might not be popular with those who have a financial stake in the status quo, the research made possible by the Leopold Center plays a critical role in the future of the state, if not the nation, and has broad public support. So it’s hard not to see this incident as part of the larger political attacks on science, with parallels to the Trump administration’s numerous attacks on climate action.

In addition to research funds, the Leopold Center supports a diverse dialogue by bringing in valuable speakers and lectures to Iowa State’s campus; I shudder to think how that important dialogue will change if the state legislature votes to close its doors. The Center has a successful and important track record benefitting local and national public interests, and I hope it stays that way.

Chevron Denies Climate Risk to Shareholders While Supporting the Spread of Climate Disinformation

UCS Blog - The Equation (text only) -

In preparation for its annual shareholders’ meeting next month, Chevron Corporation has issued its 2017 Proxy Statement. Unfortunately for investors concerned about climate change, this major oil and gas company continues to downplay the profound risks its product poses to Earth’s climate.

In a new report cited in the proxy statement, Chevron insists that its risk exposure in a carbon-constrained world is minimal, although it acknowledged in annual financial filings the increased possibility of climate-related investigations and litigation. In the proxy statement, the company’s Board attempts to convince shareholders that Chevron’s political activities—which include support for groups that spread climate disinformation—are in shareholders’ long-term interests.

Pressure continues to mount on major fossil fuel companies like Chevron to renounce disinformation on climate science and policy and begin to plan for a world free from carbon pollution. Here’s a preview of some of the key climate-related issues on the agenda at Chevron’s annual meeting on May 31 in Midland, Texas.

Systemic economic risks

There is substantial backing in the business and investor communities for strengthening and harmonizing climate-related financial disclosures by companies in all sectors. The Financial Stability Board (FSB) is an international body that monitors and makes recommendations about the global financial system. Recognizing the potential systemic risks posed by climate change to the global economy and economic system, the FSB set up a Task Force on Climate-Related Financial Disclosures (TCFD) chaired by former New York City mayor Michael Bloomberg.

In December 2016, the TCFD released its Recommendations Report. The TCFD recommended disclosure of climate-related financial risks in mainstream (i.e., public) financial filings. Specifically, the TCFD recommended that companies disclose what a 2° Celsius scenario would mean for their businesses, strategies, and financial planning.

The Union of Concerned Scientists and other stakeholders have provided comments on the TCFD’s Recommendations Report, which is expected to be taken up by the leaders of G20 countries this year.

Now is a good time to consider how Chevron’s 2017 financial filings measure up to these mainstream recommendations and expectations.

2°C scenario planning

Increasingly, shareowners of major fossil energy companies are calling for annual reporting on how climate policies may affect their business in light of the globally agreed target to limit global warming to 2°C above pre-industrial levels. This year, Chevron shareholders have put forward a resolution calling for annual planning on 2°C scenarios, along with a proposal on transition to a low-carbon economy. In 2016, the 2°C scenario planning resolution won the support of more than 40% of Chevron shareholders.

In our inaugural Climate Accountability Scorecard released last October, UCS assessed how Chevron is planning for a world free from carbon pollution—and scored the company “poor.” We recommended that Chevron:

  • Publicly acknowledge the Paris climate agreement’s long-term goal and its implications for the swift transition to global net-zero emissions;
  • Disclose emissions resulting from the company’s operations and the use of its products;
  • Set and disclose initial near-term company-wide targets to reduce emissions from its operations and the use of its products;
  • Develop and publicly communicate a clear plan and timeline to deepen emissions reductions consistent with the Paris agreement’s long-term goal.

In the 2017 proxy statement, Chevron’s board recommends a “no” vote on both the 2°C scenario planning and transition to a low-carbon economy proposals. The Board asserts that Chevron’s report “Managing Climate Change Risks: A Perspective for Investors,” released last month, substantially addresses the issues raised by its shareowners in these resolutions, despite a lack of such disclosures in the company’s U.S. Securities and Exchange Commission (SEC) reporting.

This report comes to some extraordinary conclusions, including:

  • “…Chevron’s current risk management and business planning processes are sufficient to mitigate the risks associated with climate change.”
  • “…the current risk exposure to the Company even in a restricted GHG [greenhouse gas] scenario is minimal.”

Yet “Managing Climate Change Risks” is not a robust analysis of the potential business, strategic, and financial implications of climate-related risks and opportunities for Chevron. It fails to meet the expectations of shareholders or align with the TCFD recommendations. Among its shortcomings, the report:

  • Provides no description of how 2°C scenario analysis is integrated into Chevron’s investment decision making or strategic business planning;
  • Includes limited discussion of market and technological risks to Chevron’s business model and provides no detail on its current or projected low-carbon investments;
  • Does not acknowledge climate-related risks to Chevron’s reputation;
  • Provides some information about Chevron’s assessment of climate-related physical risks, but does not disclose how it determines their materiality or how it plans to manage these risks in the future.

Potential investigations and litigation

In contrast to the rosy outlook presented in “Managing Climate Change Risks,” Chevron’s annual 10-K report for 2016 (filed in February 2017) acknowledged that “Increasing attention to climate change risks has resulted in an increased possibility of governmental investigations and, potentially, private litigation against the company.” This is an extraordinary admission, for several reasons:

  • Companies only have to disclose to investors risks that could have a “material adverse effect” on them—that is, risks to their bottom lines. Most companies resist identifying a risk as material until they have no choice but to do so.
  • To date, only ExxonMobil is known to be under investigation by governmental authorities—specifically, the attorneys general of New York and Massachusetts and the SEC. Chevron is apparently concerned that it, too, could face scrutiny for misleading investors and consumers about climate change.
  • Until recently, Chevron’s climate-related risk disclosure to the SEC was limited. After pressure from UCS and investors, the company did expand its disclosure of physical risk at its refineries, but it didn’t explicitly mention climate change as this year’s disclosure does.

In last year’s Climate Accountability Scorecard, Chevron scored only “fair” on fully disclosing climate risks to its shareholders. It remains to be seen whether these additional disclosures will improve Chevron’s score in this area.

Direct and indirect lobbying

Growing numbers of investors are also seeking more information to assess whether lobbying by major oil and gas companies is consistent with the companies’ expressed goals and in the best interests of shareholders. Chevron again faces a proposal from shareholders requesting annual reporting on direct and indirect lobbying activities and expenditures.

In 2016, 27% of Chevron shareholders voted for such a proposal, and the company has not taken any steps that are likely to reduce shareholder concerns in this area.

This year’s resolution cites Chevron’s lack of transparency about its membership in and contributions to trade associations and industry groups such as the American Petroleum Institute (API), Western States Petroleum Association (WSPA), Business Roundtable, US Chamber of Commerce (US Chamber), and American Legislative Exchange Council (ALEC).

UCS’s Climate Accountability Scorecard rated Chevron “egregious” in the area of renouncing disinformation on climate science and policy, due largely to its affiliation with groups like API, WSPA, the US Chamber, and ALEC that spread disinformation on climate science and policy.

Shareholders should not tolerate Chevron’s efforts to dismiss and deny the very real risks posed by climate change to our planet, to the company’s business model, and to their investments. They can send a strong message to the company’s management and board by voting in favor of climate-related shareholder proposals at next month’s annual meeting.

Why One Midwestern Scientist Will March for Climate Justice in Washington

UCS Blog - The Equation (text only) -

With the Trump administration’s recent attacks on climate policy, the proposed cuts to the EPA’s budget, and numerous attacks on science, it’s no surprise that people are outraged and want to stand up for science and fight for climate justice.

That’s why the Union of Concerned Scientists joined the Steering Committee of the People’s Climate Movement, a project of dozens of organizations working together to solve the climate crisis. The People’s Climate March will be held in Washington, DC on Saturday, April 29, which marks Donald Trump’s 100th day in office. We must push back against the Trump administration’s agenda and at the same time push forward on our vision of a cleaner and safer world.

So why should scientists and science supporters attend the march? To answer that question, I recently interviewed UCS Science Network Member Tim Gerrity for his take on how scientists and others can get involved in the fight for climate action. The interview follows below.

Jessica: Thanks for talking with me today, Tim. Can you tell me a little about yourself and your work?

Tim: I have a long background in different areas of scientific research. I have a Ph.D. in Physics from the University of Illinois at Chicago and I’m currently a medical technology consultant. I was previously the Chief of the Clinical Research Branch/Health Effects Research Laboratory at the US Environmental Protection Agency (EPA). At the EPA, I researched the acute health effects of air pollutants. In addition to my research, I provided scientific input to the EPA on the air quality criteria documents required under the Clean Air Act. I have a deep-seated concern for the protection of the environment and the setting of standards to protect human health.

Jessica: Why are you attending the People’s Climate March on April 29?

Tim: First, climate change is a fact, it is caused by humans, and we must act to protect the environment and human health. Last month, the National Academies of Sciences, Engineering, and Medicine released a report showing that the increased intensity and frequency of extreme weather events like floods, heat waves, and droughts are influenced by human-induced climate change. I care as a citizen, a human being, and a scientist.

Second, I am attending the march to fight back against the current attacks on science and Donald Trump’s harmful statements questioning the validity of climate change. Climate change is happening, and since it is human-caused, we may be able to mitigate its worst impacts by taking action to reduce greenhouse gas emissions. President Trump and Congress must fight to limit carbon emissions. Trump’s recent Executive Order on “Promoting Energy Independence and Economic Growth” seeks to unravel critical public health and climate protections, including the Clean Power Plan. Additionally, with the proposed cuts to the US EPA’s budget, scientific research is threatened, as well as the health and safety of all Americans. I witnessed similar budget cuts to the US EPA during the Reagan administration, when research was targeted. Scientific research is crucial for our government to make informed and unbiased policy decisions.

Jessica: Why is it important for scientists, like you, to engage in public policy at the state and federal level?

Tim: Engaging in policy is extremely important because scientists must inform the political establishment and the broad public about the implications of science for policy, and not just in the area of climate. I am very concerned about rollbacks in various areas of research that have impacts on human health and that reflect a misunderstanding of science and its use to benefit society. Science is essential for government policymaking.

Jessica: We are actively working to increase clean energy in the Midwest. Do you think our current political climate threatens the development of clean energy in the region?

Tim: No, but we have a lot of work to do. Educating policymakers and the broader public is vital. President Trump claims he is going to bring back coal jobs, but the truth is, there are now twice as many solar jobs as coal jobs in the United States. We know that coal is becoming less and less competitive as a source of energy, and power companies want to get away from it for economic reasons. The notion that we are going to bring back those jobs is ill-informed, and it’s cruel. At the same time, Trump’s proposed budget is cutting job training programs for coal communities. Our nation’s power sector is already rapidly transitioning away from coal and toward cleaner energy sources such as wind and solar, which have experienced record growth in recent years.

Jessica: At UCS, we’re encouraging our members to get involved, take action, and demand climate justice. What would you say to encourage other folks to attend the People’s Climate March on Saturday, April 29 in DC?

Tim: I have never in my life seen such a dramatic reversal in thought and understanding on the part of government leadership on air quality and the health of the people and the planet. We are going to hit a point of no return. It’s easy to ignore if you aren’t going to be around in 2100, when we could see some of the most dramatic effects, but we can’t ignore it. Generations to come will look back at us and judge us by our actions today. Climate change is one of the most important societal issues nationally and globally. And the United States cannot pull out of the Paris Climate Agreement. We need to do more for the public and public health based on scientific fact. It is the job of the federal government to protect human health. We need to think of these not as regulations but as human health protections.

How to get involved

Join us on April 29th as we march for climate justice and march to protect our communities. UCS will be chartering a bus from Chicago to Washington DC, and there are still spots left—reserve your spot today!  You can register to attend the People’s Climate March here.

Photo: Michael O'Brien/CC BY-NC (Flickr)

Restoring California’s Coastal Ecosystems

UCS Blog - The Equation (text only) -

Over two-thirds of Californians live in coastal counties. Californians love their coastline for good reasons—the mild weather, the recreational opportunities, and of course its iconic beauty and natural diversity.

The California coastline hosts a variety of ecosystems, ranging from sand dunes to rolling grasslands to mixed evergreen forests. These ecosystems are not only beautiful and home to many species of plants and animals; they also provide important services to people. Coastal wetlands, for example, help to improve water quality, reduce shoreline erosion, and buffer against sea level rise.

Mission Bay Wetlands in San Diego. Photo by Joanna Gilkeson/USFWS.

But the millions of Californians who live near the coast have had significant impacts on these ecosystems. Less than 10 percent of original wetland habitat remains. Likewise, the forces of urbanization and agriculture have made California’s coastal grassland and scrub ecosystems among the most endangered in the nation. The challenge is finding the balance between meeting the needs of people and conserving these ecosystems and the many species that depend on them, including humans.

Valuing, conserving, and restoring our coastlines

Example of sand dune ecosystem. Photo: K. Holl.

Fortunately, California has visionary leaders and a general population that has recognized the need to protect the coast for future generations. In 1972, voters passed an initiative to establish the California Coastal Commission, which was tasked with balancing development with the protection of coastal resources. Californians continue to recognize the importance of coastal ecosystems, as we saw in the June 2016 election: 70 percent of voters in nine San Francisco Bay Area counties approved a $12 parcel tax that will provide an estimated $500 million to support wetland restoration efforts over the next 20 years.

Conserving remaining intact ecosystems must be the first priority. But ecological restoration is also an important component of conservation efforts, especially where there has been extensive habitat conversion and degradation, as in many areas of coastal California. The question is how to restore coastal ecosystems in an ecologically appropriate and cost-effective manner. This is where the work of my students, my collaborators, and me plays an important role.

Improving restoration success

Developing methods to restore ecosystems starts by documenting what is out there. How degraded are the hydrologic and soil conditions? Which species are missing entirely? If left alone for a few years, will the site recover on its own? If not, will changing the management regime favor native species?

For example, our coastal grasslands host approximately 250 native wildflower species, many of which are now threatened or endangered due to habitat loss and competition with tall-stature invasive grasses, primarily from Europe. My lab has studied how different management regimes, such as grazing and fire, can be used to help restore native wildflowers. Our results show that properly managed cattle grazing can help to increase the density of a number of wildflower species.

Much of my research aims to develop restoration methods that are practical and safe for humans. To do this, I work with land managers at government agencies like California State Parks, private land trusts, and other groups to understand their challenges and identify research questions they need answered. For example, herbicides are widely used in many coastal restoration projects to control invasive plant species prior to planting native species. But there is growing concern about the effects of herbicides on the health of those who apply them and on nearby communities. Hence, we have been testing various non-chemical methods of invasive plant control, measuring not only their ecological effectiveness but also their costs, to evaluate whether alternative methods would be practical at a larger scale.

Training the next generation of environmental leaders

Students learning at the UC Natural Reserve System. Photo: K. Holl

As a professor at the University of California, one of my most important roles is training the next generation of environmental leaders. Therefore, both undergraduate and graduate students are an integral part of my research. Each year, the University of California Natural Reserves staff and I work with 50-60 students doing hands-on restoration research and implementation. This gives students an opportunity to develop both critical thinking and practical job skills. We aim to ensure that the students involved in these projects reflect the diversity of the state. We know that low-income and minority communities are disproportionately affected by negative environmental impacts, but they are generally under-represented in ecology. We offer introductory field courses for students who have not had ample opportunities to study outdoors, and we are raising funds for paid internships so they can gain these important job skills and contribute to the growing restoration economy.

My goals are to do research that improves how we restore coastal ecosystems and to provide educational opportunities for learners of all ages. My hope is that together we can conserve California’s amazing coastal ecosystems for future generations.

 

Karen Holl (holl-lab.com) is a professor of environmental studies at the University of California, Santa Cruz. She is a leader in the field of restoration ecology and the faculty director of the Norris Center for Natural History. You can watch a short video on her grassland restoration research here.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Beef, Palm Oil and Taking Responsibility: A Comment That TheOilPalm Wouldn’t Publish

UCS Blog - The Equation (text only) -

Back in December, I wrote a blog post about the importance of beef as the largest driver of deforestation. The following month, the Malaysian Palm Oil Council wrote a blog on their site, TheOilPalm.org, arguing that my blog proved that palm oil had been unfairly blamed for deforestation, and demanding an apology. Here’s a comment explaining why they’re wrong:

“When I read the post by the Malaysian Palm Oil Council concerning my blog about the importance of beef as the leading driver of deforestation, I recalled a lesson that I learned many, many years ago. I’m now 67 years old, which means that it has been more than six decades since my parents taught it to me. It was simple: when I did something wrong, I couldn’t excuse it by saying that someone else had done something worse. I had to take responsibility for my own actions, no matter what anyone else did.

As I explained in my original blog, new data shows the large role of beef production, particularly in Latin America, as a cause of tropical deforestation. Does this mean that we no longer need to be concerned about deforestation for oil palm production in Malaysia? Does the climate impact of deforestation in the Amazon mean that the destruction of peat swamps in southeast Asia no longer causes any global warming pollution? Does the threat to jaguars and tapirs in South America somehow protect orangutans and rhinos on the other side of the planet?

Of course not. The threats to the environment, the climate and biodiversity from oil palm production in Malaysia are not diminished in the least by the parallel threats from beef production in the Americas. One does not excuse the other. On the contrary, they combine to make the global danger even worse.

This kind of argument is similar to something we’ve been seeing in recent weeks in Washington, which goes by the name “what-about-ism.” When the new government does something egregious on one issue, instead of defending its actions it responds by attacking its critics on some other issue. For example: the courts have found the current administration’s ban on immigrants from Muslim countries to be unconstitutional—well, what about the previous administration’s deportations of immigrants from Mexico?

Few of us have found this kind of blame-shifting persuasive, and I doubt the Malaysian Palm Oil Council’s arguments about beef will be any more convincing. Environmental destruction in one part of the world doesn’t justify it in any other part of the world, whether it’s larger, smaller, or simply different. The destruction of tropical forests by all the drivers of deforestation—beef, palm oil, soy and timber—is a threat to the climate that we all depend on, and thus to people everywhere.”

You may wonder why this comment is posted here rather than on the MPOC web site to which it’s replying. The answer: they wouldn’t post it. I submitted this comment on their blog site on Monday, March 13, fully anticipating that it would be published immediately, and when it wasn’t, I sent a follow-up message two days later asking what was causing the delay. But a month has now passed and nothing has changed. The comment hasn’t been posted, nor has there even been the courtesy of a reply. That’s why it’s here.

How Clean Are the Newest EVs?

UCS Blog - The Equation (text only) -

How clean are the newest EV models? As we’ve shown before, an EV is cleaner than the average gasoline car. But the global warming emissions savings from using an electric vehicle depend in part on where in the U.S. you live.

We have an online tool that lets you compare most of the EVs that have been sold over the last six years, and we’re continually updating our database with the latest models.

Here are just some of the latest additions that you can choose to analyze using the tool:

Chrysler Pacifica Hybrid

Chrysler is marketing the Pacifica as a ‘hybrid’, even though it’s actually better: it’s a plug-in hybrid. Photo: Dave Reichmuth

The new Chrysler Pacifica minivan is being advertised as merely a hybrid, but it’s actually better than that: it’s a plug-in hybrid.

This seven-passenger van is rated at over 30 miles of electric range and then gets over 30 MPG when the gasoline engine is being used. This combination makes the Pacifica the cleanest minivan option by a long shot, as the best gasoline models only get 22 MPG. Its maker, Fiat Chrysler Automobiles, was one of our worst-rated companies for commitment to EVs, so hopefully this means that it is starting to get on the right path.

Hyundai Ioniq BEV

The Ioniq is the first vehicle to be sold as a conventional hybrid, a plug-in hybrid, or a battery electric model. Photo: Ki Hoon. CC-BY-SA-4.0 (Wikimedia).

 Hyundai is actually releasing three versions of the Ioniq: an all-electric version, a plug-in hybrid, and a gasoline-only ‘conventional’ hybrid.

The first one to reach the U.S. is the all-electric Ioniq BEV. With a range of 124 miles, this car could meet many drivers’ daily needs. The Ioniq also boasts the highest efficiency of any electric car on the market (0.25 kWh per mile). That means in places with cleaner electricity, this car produces emissions equal to a gasoline car rated at 100 MPG or better.
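
The efficiency figure can be turned into a rough MPG-equivalent for any grid. Here is a minimal sketch of that arithmetic; the gasoline emissions factor (roughly 8,887 g CO2 per gallon, an EPA figure) and the example grid intensities are assumptions for illustration, not values from the UCS tool:

```python
# Rough global-warming-emissions MPG-equivalent for an EV.
# Assumed values (not from the UCS tool): ~8,887 g CO2 per gallon
# of gasoline, and illustrative grid intensities in g CO2 per kWh.
GASOLINE_G_CO2_PER_GALLON = 8887

def mpg_equivalent(kwh_per_mile, grid_g_co2_per_kwh):
    """MPG a gasoline car would need to match the EV's emissions per mile."""
    ev_g_co2_per_mile = kwh_per_mile * grid_g_co2_per_kwh
    return GASOLINE_G_CO2_PER_GALLON / ev_g_co2_per_mile

# Hyundai Ioniq BEV at 0.25 kWh/mile on a relatively clean grid (~350 g/kWh):
print(round(mpg_equivalent(0.25, 350)))  # ~102 MPG-equivalent
# The same car on a coal-heavy grid (~900 g/kWh):
print(round(mpg_equivalent(0.25, 900)))  # ~39 MPG-equivalent
```

The point of the sketch is simply that the same car scores very differently depending on local grid emissions, which is why the tool asks where you live.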

Prius Prime

The 2017 Prius Prime can be plugged into any regular outlet to charge its 8.8 kwh battery pack. Photo: Toyota News Room.

This is the second version of the Prius plug-in hybrid, but it’s vastly different from the previous one. While the first version could only operate in electric-only mode at low speeds, the Prime can go all-electric under most conditions.

The electric range has also more than doubled, to 25 miles. Because it’s based on an efficient hybrid, the Prime also gets exceptional gas mileage when running on gasoline: 54 MPG.

Chevy Bolt

The Chevrolet Bolt EV is the most efficient and affordable long-range EV currently available. Photo: Dave Reichmuth

The Chevy Bolt EV (easily confused with the Chevy Volt) is a new all-electric hatchback and the first EV not made by Tesla to offer more than 200 miles of range. While not as efficient as the Ioniq, it is the most efficient long-range battery electric vehicle available, at 0.28 kWh/mile. Because of its range, it could potentially replace more gasoline-powered trips than other EVs, leading to greater emissions reductions.

Check out these EVs, as well as all the other EV models available using our tool. We’ll continue to add EVs to our emissions tool, including the anticipated new long-range EV models from Tesla and Nissan later this year. We’ll also add the latest electricity emissions estimates, so watch this space for updates.

Columbia Generating Station: NRC’s Special Inspection of Self-Inflicted Safety Woes

UCS Blog - All Things Nuclear (text only) -

Energy Northwest’s Columbia Generating Station near Richland, Washington has one General Electric boiling water reactor (BWR/5) with a Mark II containment design that began operating in 1984. In the late morning hours of Sunday, December 18, 2016, the station stopped generating electricity and began generating problems.

The Nuclear Regulatory Commission (NRC) dispatched a special inspection team to investigate the event after determining it could have increased the risk of reactor core damage by a factor of ten. The NRC team sought to understand the problems occurring during this near-miss as well as assess the breadth and effectiveness of the solutions proposed by the company for them.

Trouble Begins Offsite

The plant was operating at full power when the main generator output breakers opened at 11:24 am due to an electrical transient within the Ashe substation. The Ashe substation is owned and maintained by the Bonneville Power Administration and serves as the connection between electricity produced at the plant and the offsite power grid. At least three electrical breakers at the Ashe substation were supposed to have opened to de-energize the faulted transmission line(s). Had they done so, the loss of the transmission lines could have triggered protective devices at the Columbia Generating Station to automatically trip the main generator. But cold weather kept the breakers from functioning properly. Instead of the protective systems at the Columbia Generating Station responding on a system level (i.e., the de-energized transmission line(s) triggering a main generator trip), they responded at the component level (i.e., the main generator output breaker sensed the electrical transient and opened).

The turbine control valves automatically closed because the main generator was no longer fully loaded with its output breakers opened. The closure of the turbine control valves automatically tripped the reactor. The control rods fully inserted within seconds to stop the nuclear chain reaction. The output breakers, turbine control valves, and control rods all functioned per the plant’s design (see Figure 1).

Fig. 1 (Source: Nuclear Regulatory Commission annotated by UCS)

Before the trip, the main generator was producing electricity at 25,000 volts. The main transformer increased the voltage up to 500,000 volts for transmission out to the offsite power grid. The auxiliary transformers reduced the voltage to 4,160 volts and 6,900 volts for supply to equipment in the plant. The output breakers that opened to start this event are represented by the square box in the upper left corner of Figure 2.

Fig. 2 (Source: Nuclear Regulatory Commission annotated by UCS)

Trouble Begins Onsite – Loss of Heat Sink and Normal Makeup

The main generator was disconnected from the offsite power grid but continued to supply electricity through the auxiliary transformers to plant equipment. Because steam was no longer flowing to the turbine, the voltage and frequency of the electricity dropped. The voltages flowing to in-plant equipment dropped low enough to cause electrical breakers to automatically open at 11:25 am to protect motors and other electrical equipment from damage caused by under-voltage. For example, an electric motor requires an electrical current of a certain voltage in order to operate. Electrical current of lower voltage may not be enough to enable the motor to run, but that current flowing through the motor may be enough to heat it up and damage it. One of the de-energized loads caused the Main Steam Isolation Valves (MSIVs) to close. Their closure meant that steam produced by the reactor’s decay heat no longer flowed to the condenser where it got cooled by water from the plant’s cooling towers. Instead, the steam bottled up in the reactor vessel and piping until it increased the pressure to the point where the safety/relief valves opened to discharge steam to the suppression pool (see Figure 3).

The closure of the MSIVs also stopped the normal flow of makeup cooling water to the reactor vessel. The feedwater system uses steam-driven turbines connected to pumps to supply makeup cooling water to the reactor vessel. But the steam supply for the feedwater pumps is downstream of the now-closed MSIVs. The condensate and condensate booster pumps upstream of the feedwater pumps have electric motors and remained available. But collectively they can only pump water at about two-thirds of the pressure inside the reactor vessel, meaning they could not supply makeup water unless the reactor vessel pressure dropped by roughly one-third from its normal value.

Fig. 3 (Source: Nuclear Regulatory Commission annotated by UCS)

Troubles Onsite Grow – Loss of Normal Power for Safety Buses

At 11:28 am, the safety buses SM7 and SM8 tripped on low voltage, causing their respective emergency diesel generators to start and provide power to these vital buses. This was not supposed to happen during this event. By procedure, the operators were directed to manually trip the turbine and generator following the automatic trip of the reactor. They tripped the turbine at 11:27 am, but never tripped the main generator. Tripping the main generator as specified in the procedures would have immediately caused some electrical breakers to close and others to open, swapping the supply of electricity to plant equipment from the auxiliary transformers to the startup transformers as shown in Figure 4. The startup transformers reduce 230,000-volt electricity from the offsite power grid to 4,160 volts and 6,900 volts for use by plant equipment when the main generator is unavailable. With electricity supplied from the startup transformers, the MSIVs would have remained open and the feedwater pumps would have continued to provide makeup cooling water as usual.

Fig. 4 (Source: Nuclear Regulatory Commission annotated by UCS)

Even More Trouble Onsite – Loss of Backup Makeup

The operators manually started the Reactor Core Isolation Cooling (RCIC) system (not shown in Figure 3; it is a smaller version of the High Pressure Core Spray system) at 11:32 am to provide makeup cooling water because the feedwater system was unavailable. The RCIC system’s primary function is to supply makeup cooling water when the feedwater system cannot do so. Like the feedwater pumps, the RCIC pump is connected to a steam-driven turbine. Unlike the feedwater pumps, the RCIC pump’s turbine is supplied with steam from the reactor vessel through a connection upstream of the closed MSIVs. The RCIC pump transfers water from a large storage tank to the reactor vessel.

The operators failed to follow the procedure when starting the RCIC system. The procedure called for them to close the steam admission valve (V-45) and then open the trip valve (V-1) as soon as V-45 was fully closed (see Figure 5). But they did not open V-1. The failure to open V-1 disabled the control system designed to bring the RCIC turbine up to desired speed in 12 seconds. Instead, the RCIC turbine tried to obtain the desired speed instantly. Too much steam too soon caused the RCIC turbine to automatically trip on high speed. This trip guards against the spinning turbine blades coming apart due to excessive forces.

It took about 13 minutes for workers to go down into the RCIC room in the reactor building’s basement and reset the mis-positioned valves to allow the system to be properly started. In that time, the water level inside the reactor vessel dropped about a foot as it boiled away. That still left 162 inches (13.5 feet) of water above the top of fuel in the reactor core. The operators had several hours to restore makeup cooling water flow before the reactor core started uncovering and overheating.
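
The “several hours” margin follows directly from the numbers above. A back-of-the-envelope check, assuming (conservatively) that the boil-off rate stayed at the observed one foot per 13 minutes even though decay heat actually falls off over time:

```python
# Back-of-the-envelope: time until the core would start to uncover,
# assuming a constant boil-off rate. This is conservative, since
# decay heat (and hence the boil-off rate) decreases after shutdown.
drop_inches = 12          # observed water level drop (about a foot)
drop_minutes = 13         # over about 13 minutes
inches_above_fuel = 162   # water remaining above the top of the fuel

rate = drop_inches / drop_minutes        # ~0.92 inches per minute
minutes_left = inches_above_fuel / rate  # ~175 minutes
print(round(minutes_left / 60, 1))       # ~2.9 hours, i.e. "several hours"
```

Since the actual boil-off rate only falls with time, the real margin was at least this long.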

Fig. 5 (Source: Nuclear Regulatory Commission annotated by UCS)

The operators manually started the High Pressure Core Spray (HPCS) system at 12:09 pm to provide makeup cooling water with the feedwater and RCIC systems both unavailable. The main HPCS pump (HPCS-P-1) has an electric motor. The pump transfers water from the large storage tank to the reactor vessel. While RCIC is designed to supply makeup water to compensate for inventory boiled off after the reactor shuts down, the HPCS system is also designed to compensate for water being lost through a small-diameter (about 2 inches) pipe that drains cooling water from the reactor vessel. Consequently, the HPCS system flow rate is about ten times greater than the RCIC system flow rate. And whereas the RCIC system flow rate can be throttled to match the makeup need, the HPCS system makeup flow is either full or zero.

The HPCS system refilled the reactor vessel soon after it was started. The operators closed the HPCS system injection valve (V-4) after about a minute. The minimum flow valve (V-12) automatically opened to direct the pump flow to the suppression pool instead of to the reactor vessel (see Figure 6). The HPCS system ran in “idle” mode for the next 3 hours and 42 minutes.

Fig. 6 (Source: Nuclear Regulatory Commission annotated by UCS)

Yet More Trouble Onsite – Water Leaking into Reactor Building

On December 18, workers discovered that the restricting orifice (RO) downstream of V-12 had leaked an estimated 4.7 gallons per minute into the reactor building while the HPCS system had operated. The NRC team learned that the gasket material used in this restricting orifice had been the subject of an industry operating experience report in 2007. A condition report was written at Columbia Generating Station in 2008 to have engineering assess the operating experience report and gasket materials used at the plant. In early 2010, the condition report was closed out based on engineering’s evaluation to use the gasket material recommended in the industry report. But the “bad” gaskets were not replaced.
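
Combining the estimated leak rate with the roughly 3 hours and 42 minutes the HPCS system ran in “idle” gives a sense of scale. A simple estimate from the figures in the report:

```python
# Rough total leakage into the reactor building through the eroded
# gasket, using the figures reported: ~4.7 gpm over 3 h 42 min.
leak_gpm = 4.7
run_minutes = 3 * 60 + 42        # 222 minutes of HPCS "idle" operation
total_gallons = leak_gpm * run_minutes
print(round(total_gallons))      # ~1043 gallons
```

In other words, on the order of a thousand gallons of water ended up inside the reactor building because a known-bad gasket was never replaced.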

Operating experience cited in the 2007 industry report revealed that the original gasket material was vulnerable to erosion. The report described two adverse consequences from the material’s erosion. First, pieces of the gasket could be carried by the water into the reactor vessel, where the material impacting the fuel rods could damage their cladding. Second, gasket erosion could allow leakage. The 2007 industry report thus forecast the problem experienced at Columbia Generating Station in December 2016. The solution recommended by the 2007 report was not implemented until after the forecast problem had occurred.

NRC Sanctions

The NRC’s special inspection team identified three safety violations at the Columbia Generating Station. Two violations involved the operators failing to follow written procedures: (1) the failure to trip the main generator which resulted in the unnecessary closure of the MSIVs, and (2) the failure to properly start the RCIC system which resulted in the unnecessary trip of its turbine. The third violation was associated with the continued use of gasket material determined nearly a decade earlier to be improper for this application.

UCS Perspective

Self-inflicted problems turned a fairly routine incident into a near-miss. Luck stopped it from progressing further.

The problem started offsite due to causes outside the control of the plant’s owner. Those uncontrollable causes resulted in the main generator output breakers opening as designed.

By procedure, the operators were supposed to trip the main generator. Failing to do so resulted in the unnecessary closure of the MSIVs and the loss of the normal makeup cooling flow to the reactor vessel.

By procedure, the operators were supposed to manually start the RCIC system to provide backup cooling water flow to the reactor vessel. But they failed to properly start the system and it immediately tripped.

Procedures are like recipes—positive outcomes are achieved only when they are followed.

The operators resorted to using the HPCS system. It took about a minute for the HPCS system to recover the reactor vessel water level—the operators left it running in “idle” for the next three hours and 42 minutes during which time about 5 gallons per minute leaked into the reactor building. The leak was through eroded gasket material that had been identified as improper for this application nearly a decade earlier, but never replaced.

Defense-in-depth is a nuclear safety hallmark. That hallmark works best when operators don’t bypass barriers and when workers patch known holes in barriers. Luckily, other barriers remained effective to thwart this near-miss from becoming a disaster. But luck is a fickle factor that needs to be minimized whenever possible.

Safer Blood Products: One Researcher’s Story on Why Federal Support Matters

UCS Blog - The Equation (text only) -

In 1982, a crisis was beginning to unfold. Gay men were dying of an unknown cause, which years later was shown to be the Human Immunodeficiency Virus (HIV).  At that time, I was not involved with the gay community, with acquired immunodeficiency syndrome (AIDS), or with HIV. But federal funding of my research on blood products helped us prevent the transmission of HIV and hepatitis to tens of thousands of Americans.

I led a small team of research scientists at the New York Blood Center (NYBC) interested in developing new therapeutic products from plasma, the fluid portion of blood. What was known in 1982 was that a plasma product called AHF used in the treatment of hemophilia occasionally transmitted hepatitis B virus and transmitted another virus eventually to be known as hepatitis C. The risk of hepatitis C in this patient group was accepted because the infection was believed to be mild and the benefit of treating the patient with the plasma product was great.

The challenge

If we were going to succeed in developing new plasma products useful to large numbers of patients, such as ones that accelerate wound healing, we had to eliminate viral risk. The only way of doing this with certainty was the use of viral killing methods. The challenge was to find methods that would kill large quantities of virus without damaging the therapeutic protein.

Finding a solution

Supported by the virology laboratories and others at NYBC and based on preliminary studies demonstrating virus kill, in 1983 we received an award from the National Institutes of Health (NIH) totaling just over $750,000 for the “Detection and inactivation of non-A, non-B hepatitis agents in blood”. This award enabled us to greatly accelerate our work which, by that time, included exploring the use of organic solvents and detergents such as had been used in the preparation of viral vaccines. The idea was to disrupt viral structures by stripping away essential fatty acids with the hope that the proteins of interest would be unaffected. Our hopes were fully realized.

We showed that the method we developed, commonly referred to as solvent/detergent or SD treatment, completely inactivated hepatitis B and C viruses in a chimpanzee model, and, in collaboration with Dr. Gallo at the NIH, we showed that HIV was rapidly and completely inactivated. As importantly, the valuable proteins such as AHF appeared to be unaffected.

Based on these results, the Food and Drug Administration (FDA) licensed the NYBC’s plasma product for the treatment of hemophilia in 1985. More complete clinical studies run cooperatively by NYBC and the FDA showed that the AHF protein was undamaged and HIV and hepatitis viruses were not transmitted.

Impact

For the next fifteen years, over 60 organizations worldwide adopted SD technology and applied it to a wide variety of products, including AHF, intravenous immune globulin used in the treatment of immune deficiency disorders, and even monoclonal antibodies and other proteins derived from recombinant technology. Hundreds of millions of doses of SD-treated products have been infused in people; countless transmissions of hepatitis B, hepatitis C, and HIV were eliminated; and the lives of tens of thousands of patients were saved or improved.

The importance of federal support

Success stories like these are not guaranteed. Without federal support, I am reasonably certain that our findings would have made for a nice publication or two and little else. Additional federal grant support that I received resulted in improving the consistency and viral safety of transfusion plasma, now available broadly, and spawned efforts leading to red cells and platelet products with enhanced viral and bacterial safety.

I am forever grateful for the grant support that I received, and the granting agencies and the nation should take pride in the initiatives they foster. My story, or really our story, demonstrates the impact of federal funding and the degree to which the scientific enterprise is a collaborative effort, bringing together many diligent minds from research institutes, private organizations, and multiple federal agencies. We should all hope that this continues unabated. Our population deserves it.

Dr. Bernard Horowitz is recognized internationally for his research on blood viral safety and the preparation and characterization of new therapeutics from blood. He has served on several company scientific advisory boards and as a director of Omrix Therapeutics, Biogentis, Inc., Dermacor, Inc., Protein Therapeutics, and V.I. Technologies, a company he co-founded. At the New York Blood Center, Dr. Horowitz was its Vice President for Commercial Development and a Laboratory Head in its Lindsley F. Kimball Research Institute. He has served as a scientific consultant to the National Institutes of Health, the Food and Drug Administration, the National Hemophilia Foundation, the International Association of Biological Standardization, and the World Health Organization. Dr. Horowitz is the recipient of several prestigious awards, including the Robert W. Reilly Leadership Award from the Plasma Protein Therapeutics Association, the Morton Grove Rasmussen Prize from the American Association of Blood Banks, and the 11th International Prix Henri Chaigneau from l’association francaise des hemophiles. Dr. Horowitz received his B.S. in biology from the University of Chicago and his Ph.D. in biochemistry from Cornell University Medical College.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

An Innovative Way to Encourage Disaster Preparedness: FEMA’s Public Assistance Deductible

UCS Blog - The Equation (text only) -

The Federal Emergency Management Agency (FEMA) recently outlined a new framework for encouraging states to invest in disaster resilience and thus limit the growing costs of disasters.

Today is the comment deadline for the ‘Public Assistance deductible,’ a concept that can help protect communities and ensure federal taxpayer dollars are spent wisely. The Union of Concerned Scientists is filing comments supportive of this idea, with some important recommendations for improvements.

What is the Public Assistance program?

FEMA’s Public Assistance (PA) program provides funding for local, state, and tribal governments to help communities recover from major disasters. The PA program provides funding for debris removal; life-saving emergency protective measures; and the repair, replacement, or restoration of disaster-damaged publicly owned facilities and the facilities of certain private non-profit organizations.

Federal taxpayers pay for at least 75 percent of the eligible costs, and state and local entities pay the rest. According to FEMA, on average, the Public Assistance program has provided approximately $4.6 billion in grants each year over the past decade.

Ideally, program funds would be invested in ways that ensure communities build back in a stronger, more resilient way. But in practice this is not always the case.

After disasters, there is often a strong impetus to build back the same way, right where things were before—which may not be the best long-term choice. For example, in many places sea level rise is worsening risks of coastal flooding. Storm surges, riding on higher sea levels, will be able to reach further and higher inland causing greater damage. Post-disaster recovery efforts need to take into account scientific data and projections of sea level rise, and not perpetuate risky rebuilding in harm’s way.

An innovative model: The Public Assistance Deductible

Under FEMA’s proposed design for this deductible, states would first have to meet a minimum threshold of expenditures on post-disaster recovery before FEMA would provide federal assistance through the PA program. The state-specific deductibles would be calculated using a formula that starts with a common annual national base deductible, equivalent to the median amount of Public Assistance received across all 50 states over the past 17 years. This would then be adjusted to take into account a state’s fiscal capacity and its disaster risk relative to other states.

The truly innovative part of the proposal would be that FEMA would allow states to buy down their deductible through credits earned for state-wide measures that would help build resilience and lower the costs of future disasters.

FEMA’s framework would include $3 in deductible credit for every $1 in state spending on qualifying mitigation activities. These could include protective building codes, measures to safeguard or enhance wetlands and other nature-based flood protections, flood risk management standards (requiring building at least 2-3 feet above base flood elevation), and state investments in mapping, tools, and enhanced hazard mitigation plans to help identify and reduce disaster risks.
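
The buy-down mechanism can be sketched in a few lines. Only the 3-to-1 credit ratio comes from FEMA’s framework; the dollar amounts below are made up purely for illustration:

```python
# Sketch of FEMA's proposed deductible buy-down: states earn $3 of
# deductible credit for every $1 spent on qualifying mitigation.
# The deductible and spending amounts used below are hypothetical.
CREDIT_RATIO = 3

def effective_deductible(state_deductible, mitigation_spending):
    """Deductible remaining after applying mitigation credits."""
    credits = CREDIT_RATIO * mitigation_spending
    return max(0, state_deductible - credits)

# A hypothetical state with a $20M deductible that spends $4M on mitigation:
print(effective_deductible(20_000_000, 4_000_000))  # 8000000
# Spending about $7M would buy the deductible down to zero:
print(effective_deductible(20_000_000, 7_000_000))  # 0
```

The leverage is the point: every dollar of pre-disaster mitigation removes three dollars of the recovery spending a state must cover before federal aid flows.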

For further details on FEMA’s PA deductible framework, including the calculation of the state-specific deductibles and credits, see the Supplemental Advanced Notice of Proposed Rulemaking (SANPRM).

Why UCS supports a well-designed Public Assistance Deductible

UCS supports the concept of a well-designed deductible for the PA program. We believe this could be an effective way of addressing a number of priorities for reform of federal disaster funding previously identified by the Government Accountability Office, Congress, and by the Department of Homeland Security’s Office of the Inspector General.

FEMA has been urged to find ways to reduce the costs of federal disasters. Rather than use blunt, inequitable methods that just transfer costs from the federal government to state, local, and tribal governments, we are encouraged that FEMA is exploring ways to help lower the costs of disasters through hazard mitigation measures. The PA deductible program would give states an incentive (through the crediting mechanism) to take pre-disaster protective actions to make communities more resilient to disasters.

With careful attention to the details of such a proposal, this type of concept could help advance community resilience and ensure that taxpayer dollars are wisely invested.

Goals of a Public Assistance Deductible

Studies show that investments in pre-disaster risk reduction measures have a payback of at least 4 to 1 and estimates of the benefits of investments in flood mitigation are higher at 5 to 1. Studies from the Wharton School at the University of Pennsylvania and Swiss Re indicate that higher design standards have a far higher payback than 4 to 1. Yet our nation continues to under-invest in sensible pre-disaster hazard mitigation measures, even as more lives are lost and the costs of weather and climate disasters grow.

In our comments, we have urged FEMA to ensure that the PA deductible program:

  • Provides a strong incentive for states to invest more in pre-disaster preparedness measures that would help limit harm to people, property, and natural functions of ecosystems over the long-term and ensure that taxpayer dollars are spent wisely. We’ve recommended that FEMA expand the list of measures that qualify for credits to include state-wide protective freeboard regulations that go above the minimum federal requirements, investments in hazard mitigation (with extra credit for nature-based, green measures such as wetlands and living shorelines and less credit for hardening measures such as levees), enhanced design standards, building codes, protective zoning standards, statewide flood mapping (e.g. North Carolina), and strategic plans and incentives for managed retreat from high-risk areas.
  • Is administered in an equitable way to help protect the most vulnerable communities, especially low-income and minority communities. We have recommended that FEMA provide extra credits for investments made to promote resilience in these disadvantaged communities.
  • Takes into account the best available science on growing climate risks, including flooding worsened by sea level rise and more frequent heavy downpours, droughts, wildfires, and other climate-related impacts. We strongly recommend that the PA deductible be designed in a way that appropriately incorporates climate projections and future risks. The hazard model (Hazus) that FEMA proposes to use to assess state risks is not currently configured to do this and must be updated appropriately if it is used.
  • Fairly takes into account the specific circumstances of states, so that they are able to access the much-needed aid and recovery resources that are their due in the wake of major disasters, in line with the provisions of the Robert T. Stafford Disaster Relief and Emergency Assistance Act.

A science-based, equitable approach to building resilience to disasters will go a long way toward protecting people, property and the functions of our natural ecosystems, while ensuring the best use of taxpayer dollars. UCS has developed a set of principles that can help inform this type of approach, Toward Climate Resilience: A Framework and Principles for Science-Based Adaptation.

A step toward building climate resilience

The PA deductible concept is an important step toward making our nation more resilient to disasters. We’ll be looking for expeditious action from FEMA to take our and other public comments into account and move this forward toward a rulemaking process.

Congress and the Trump administration also need to act on multiple fronts to help communities facing the impacts of climate change and other disasters. One thing they definitely shouldn’t do is make harmful cuts to FEMA’s budget.

The Public Assistance Deductible infographic provides a snapshot overview of the basic concept and an example of how one state, in this case Indiana, could earn credits toward offsetting the deductible amount. Source: FEMA

Is No Place Safe? Climate Change Denialists Seek to Sway Science Teachers

UCS Blog - The Equation (text only) -

Co-Authors: Glenn Branch, Deputy director of NCSE, and Steven Newton, Programs and Policy Director at NCSE

A few weeks ago, science teachers across the country began to find strange packets in their school mailboxes, containing a booklet entitled “Why Scientists Disagree About Global Warming” (sic), a DVD, and a cover letter urging them to “read this remarkable book and view the video, and then use them in your classroom.”

“Not Science” stamp on top of the report cover mailed to teachers during spring 2017. The report misrepresents the fact that nearly all climate scientists agree about human-driven climate change.

The packets were sent by the Heartland Institute, which in the 1990s specialized in arguing that second-hand smoke does not cause cancer. Even though its indefensible defense of the tobacco industry failed, Heartland now uses the same pro-tobacco playbook—touting alleged “experts” to question established science—to argue that climate change is not real.

At the National Center for Science Education, we have almost three decades of experience helping teachers, parents, and students facing creationism in the classroom. A few years ago, we added climate change to our docket. So teachers know that when issues regarding evolution or climate change come up, NCSE is there to help.

This wasn’t Heartland’s first unsolicited mailing of climate change denial material to science teachers, and judging from the reactions we’ve seen, teachers haven’t been fooled by this outing. But here is how we’re advising science teachers to explain why using these materials in any science classroom would be a terrible idea.

1. Virtually every assertion is false, controversial, or at best unclear.

That’s a judgment that might seem to call for a point-by-point rebuttal. But I’m not going to offer such a rebuttal, both because every substantive point in the Heartland mailing is a long-ago-debunked canard (see Skeptical Science passim) and because there is already a place where responsible scientists discuss the evidence for climate change: the peer-reviewed scientific research literature.

If Heartland has such a good case to make, why is it spending thousands of dollars on direct-mailing a self-published report to teachers, instead of trying to convince the relevant scientific community?

2. Heartland represents what is, at best, a fringe position in science.

Of course, Heartland isn’t willing to admit its fringiness, devoting considerable effort to trying to dispute the widely reported fact that the degree of scientific consensus on anthropogenic climate change is about 97 percent. It’s a wasted effort.

Multiple independent studies, using different sources, methods, and questions, have arrived at the same conclusion. And the scientific consensus on climate change is not a mere reflection of popular sentiment or shared opinion among scientists. Rather, it is the product of evidence so abundant and diverse and robust as to compel agreement in the scientific community.

3. Heartland even disparages the well-respected, Nobel Prize-winning IPCC.

Not content to reject the extraordinary scientific consensus on climate change, the booklet downplays the process by which climate scientists regularly evaluate and report on the state of the evidence, the Intergovernmental Panel on Climate Change or IPCC.

Few areas of science undergo the kind of rigorous and comprehensive review that the climate science community carries out every five years. It is a reflection of the seriousness with which world leaders take the challenge of climate change that they support this process and accept the conclusions arrived at by hundreds of generous, dedicated, and meticulous scientists.

4. Heartland’s material contradicts standards, textbooks, and curricula.

K–12 teachers are expected to teach in accordance with state science standards, state- or district-approved textbooks, and district-approved curricula, all of which undergo review by competent scientists and teachers, and thus generally attempt to present climate change in accordance with the scientific consensus. Heartland’s materials have not undergone such a review. And teachers who misguidedly use them in the classroom will be, at best, presenting mixed messages, running the risk of confusing their students about the scientific standing of anthropogenic climate change.

5. Heartland’s citations are shoddy and its tactics dishonest.

Many of the references in “Why Scientists Disagree About Global Warming” (sic) are to Heartland’s own publications, posts on personal blogs, fake news sources, and low-quality journals—the sort of citations that a teacher wouldn’t accept on a science assignment.

The booklet itself is credited to the Nongovernmental International Panel on Climate Change (NIPCC)—a name likely to be confused with the legitimate IPCC. And the envelope in which the mailing was sent reproduced a New York Times headline about “Climate Change Lies”—the same sort of lies, it turns out, that Heartland is intent on promoting.

In the end, the climate change deniers at the Heartland Institute have no scientifically credible evidence of their own, leaving them with no option but to lash out at the real scientific literature, contributing nothing except vitriol, achieving nothing except confusion. Science teachers know better—and science students deserve better.

Two for One: A Very Bad Deal for Our Nation

UCS Blog - The Equation (text only) -

Imagine you are in the market for a new car. You are excited to buy one with a new technology that will warn you of an imminent crash so you have enough time to hit the brakes to save your son’s or daughter’s life and your own. The car salesman tells you he’s got just the car for you, and it comes with his new two-for-one deal. To get that one new feature, you have to give up two others, brakes and seat belts.

You’d never take that deal, but it is exactly the kind of situation the President has created for the National Highway Traffic Safety Administration (NHTSA) and every other agency responsible for protecting Americans’ health and safety.

This “two-for-one” executive order, signed January 30, 2017, requires every agency to eliminate at least two regulations for every new one it seeks to put in place to make Americans’ lives better. Making matters worse, the cost savings from the eliminated health, safety, and other regulations must at least offset the industry investment required to meet the new regulation–regardless of the benefits of either the new or the older regulations!

So, take my not-so-hypothetical example above. When I was NHTSA’s Acting Administrator, we put out an advance notice of proposed rulemaking that would require new cars to come equipped with radios allowing them to “talk” to one another, sharing basic safety information so that a car could warn the driver of another equipped vehicle on a collision course. This vehicle-to-vehicle (V2V) communication system is estimated to prevent 425,000–524,500 crashes per year when fully implemented. Saving lives and avoiding injuries would deliver savings of $53 to $71 billion, dwarfing the investments automakers would have to make to equip vehicles with the new technology and delivering positive net benefits within 3-5 years.
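To put those ranges in perspective, here is a back-of-the-envelope sketch using only the figures quoted above. The per-crash averages it derives are for illustration only and are not NHTSA’s own estimates:

```python
# Rough scale check on the V2V figures quoted above.
# Inputs are the cited ranges; per-crash savings are implied
# averages computed here for illustration only.

crashes_low, crashes_high = 425_000, 524_500   # crashes prevented per year
savings_low, savings_high = 53e9, 71e9         # dollars saved per year

# Conservative pairing: lowest savings spread over the most crashes,
# and the highest savings over the fewest.
per_crash_low = savings_low / crashes_high
per_crash_high = savings_high / crashes_low

print(f"Implied savings of roughly ${per_crash_low:,.0f} "
      f"to ${per_crash_high:,.0f} per prevented crash")
```

Even the conservative end of that range, on the order of $100,000 per prevented crash, dwarfs the per-vehicle cost of adding a radio.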

But under the “two-for-one” executive order, those benefits just don’t matter, the lives saved and injuries avoided just don’t matter. Instead, other regulations, like those requiring seat belts and brakes, would need to be repealed to offset the investment costs… again, ignoring the lives lost and harmed along the way. And if those two don’t cut the costs to industry enough, even more would need to be eliminated, putting even more lives at risk.

When you consider that in 2015 alone, 35,092 people lost their lives and 2.44 million people were injured in traffic crashes in the United States, it is clear that the “two-for-one” executive order is a very bad deal for our nation.

Making matters worse, this same raw deal applies to fuel economy standards that NHTSA is set to finalize for 2022-2025 to help nearly double fuel economy compared to where we were at the beginning of the decade. So, will NHTSA have to repeal safety standards to make more room to cut the high cost of our oil use? I expect they would never make that trade. I expect it would be the same for the Department of Energy (DOE), where I had the opportunity to help establish efficiency standards for household and commercial appliances. I don’t think the DOE would repeal appliance efficiency standards that are estimated to save consumers more than $2 trillion by 2030 if they had to both offset the industry investment costs of new ones and ignore the benefits of them all.

The “two-for-one” executive order is good for only one thing: grinding to a halt federal efforts to save lives, protect our health, and help us spend less money fueling our cars and heating and cooling our homes.

Appendix: Background on Regulation at NHTSA

Managing Nuclear Worker Fatigue

UCS Blog - All Things Nuclear (text only) -

The Nuclear Regulatory Commission (NRC) issued a policy statement on February 18, 1982, seeking to protect nuclear plant personnel against impairment by fatigue from working too many hours. The NRC backed up this policy statement by issuing Generic Letter 82-12, “Nuclear Power Plant Staff Working Hours,” on June 15, 1982. The Generic Letter outlined guidelines such as limiting individuals to 16-hour shifts and providing for a break of at least 8 hours between shifts. But policy statements and guidelines are not enforceable regulatory requirements.

Fig. 1 (Source: GDJ’s Clipart)

UCS issued a report titled “Overtime and Staffing Problems in the Commercial Nuclear Power Industry” in March 1999 describing how the NRC’s regulations failed to adequately protect against human impairment caused by fatigue. Our report revealed that workers at one nuclear plant in the Midwest logged more than 50,000 overtime hours in one year.

Barry Quigley, then a worker at a nuclear plant in the Midwest, submitted a petition for rulemaking to the NRC on September 28, 1999. The NRC had issued regulations in the 1980s intended to protect against human impairment caused by drugs and alcohol; nuclear plant workers were subject to initial, random, follow-up, and for-cause drug and alcohol testing. Quigley’s petition sought to extend these fitness-for-duty requirements to include limits on working hours. The NRC revised its regulations on March 31, 2008, to require that owners implement fatigue management measures. The revised regulations permit individuals to exceed the working hour limits, but only under certain conditions, and owners are required to submit annual reports to the NRC on the number of working hour limit waivers granted.

The NRC’s Office of Nuclear Regulatory Research recently analyzed the first five years of the working hour limits regulation. The analysis reported that in 2000, the year the NRC initiated the rulemaking process, some plants were issuing more than 7,500 waivers of the working hour limits suggested by Generic Letter 82-12, and about one-third of the plants granted over 1,000 waivers annually. In 2010, the first year the revised regulations were in effect, a total of 3,800 waivers were granted for the entire fleet of operating reactors. By 2015, the number of waivers for all nuclear plants had dropped to 338. The Grand Gulf nuclear plant near Port Gibson, Mississippi, topped the 2015 list with 69 waivers, but 54 of them (78 percent) were associated with the force-on-force security exercise.

The analysis indicates that owners have learned how to manage worker shifts within the NRC’s revised regulations. Zero waivers are unattainable due to unforeseen events like workers calling in sick and tasks unexpectedly taking longer to complete. The analysis suggests that the revised regulations enable owners to handle such unforeseen needs without the associated controls and reporting being an undue burden.

The regulatory requirements adopted by the NRC to protect against sleepy nuclear plant workers should let people living near nuclear plants sleep a little better.

Will Scott Gottlieb Comply with Industry Plea to Stall Added Sugar Label?

UCS Blog - The Equation (text only) -

President Trump’s nominee to head the U.S. Food & Drug Administration (FDA), Scott Gottlieb, faced the Senate in his nomination hearing on Wednesday, during which he implied that delayed implementation of the science-based nutrition facts label revision would be possible if he is confirmed.

Yes, you read that right. The future chief of an agency dedicated to protecting public health is already hinting at his willingness to do industry’s bidding to push back enforcement of a rule based in solid science that would help us make informed food purchasing decisions to improve our health. But his alignment on industry talking points is not completely shocking. Mr. Gottlieb has a long list of ties to industry, including an extensive financial and professional relationship with several pharmaceutical companies that manufacture and sell opioids.

During the hearing, Senator Pat Roberts told Gottlieb that the deadline of summer 2018 was not enough time for industry to make the required label changes, including the new added sugar line, especially considering that companies will have to include biotechnology disclosures on labels soon as well. To the question of whether he would “work to ensure proper guidance is available and consider postponing the deadline for the Nutrition Facts Panel to help reduce regulatory burdens?” Gottlieb didn’t explicitly say he would postpone the deadline but might as well have:

“This is something that I do care about and I look forward to working on if I am confirmed,” Gottlieb said. He continued to explain that he is, “philosophically in favor of trying to make sure we do these things efficiently, not only because it imposes undue costs on manufacturers to constantly be updating their labels, but we also have to keep in mind it creates confusion for consumers if the labels are constantly changing…you want to try to consolidate the label changes when you are making [them] as a matter of public health so that the information is conveyed accurately and efficiently to the consumers.”

Why delay?

The delay tactic is often used by industry as a fallback plan once it has failed altogether to stop a science-based policy that might impact its bottom line. This is old hat for the food industry. Back in December, I wrote about how the Food & Beverage Issue Alliance (a group made up of the biggest food and beverage trade associations, like the American Beverage Association and the Grocery Manufacturers Association) had written a letter to the acting HHS and USDA secretaries asking to delay implementation of the nutrition facts rule to coordinate with the U.S. Department of Agriculture’s biotechnology disclosure rule. Some of the same players doubled down on a similar letter in March, asking HHS Secretary Tom Price to delay the rule until May 2021 for the same reason. Scott Gottlieb’s remarks at his hearing closely resemble the sentiments contained in both of those letters.

Sound familiar? Time and time again we’ve seen science-based proposed rules never make it to the final stages, or those that are finalized but implementation is soon delayed. Just last week, EPA administrator, Scott Pruitt, issued a proposed rule that would delay implementation of the Risk Management Plan (RMP) amendments 20 months, until February 19, 2019. This move came after several petitions from the American Chemistry Council and a handful of other chemical manufacturing corporations, oil and gas companies, and trade organizations asked the agency to reconsider the rule.

And remember the silica rule? Although the science had been clear for over forty years, opposition from the American Chemistry Council and the U.S. Chamber of Commerce meant it took the Department of Labor far longer than necessary to issue a final rule tightening the standard late last year. Just yesterday, the department announced that enforcement of the rule would be delayed because the construction industry needs more time to educate its employees about the standard.

Industry’s reaction to rules that protect our public health makes it seem like government is blindsiding them. But it’s not like any of these rules were dropped without warning or without cause. These safeguards take years to gather information for and write, during which industry is given ample opportunity to be involved in the process. FDA first began its work to revise the nutrition facts label in 2004, and the proposed rule which included the added sugar line was issued in 2014. Not exactly rapid response. The fact is that science-based policies threaten business as usual, and therefore industry will use all resources at its disposal to stop or slow progress.

Industry’s excuses are wearing thin

Once again, with clear science on the public health consequences associated with excessive added sugar consumption, we have been waiting long enough for full added sugar disclosure on labels. While we wait, we’re missing out. The estimated benefit to consumers of the revisions to the nutrition facts label is $78 billion over 20 years, not to mention the less quantifiable benefits that come with the right to know how much added sugar is in the foods we buy and eat.

The majority of companies already have until 2019 to make the new changes to their labels and larger food companies like Mars, Inc. have said they could meet the July 2018 deadline just fine.

It’s clear that industry is turning this science-based decision into a political one, at the expense of Americans who will remain in the dark about how much added sugar is in their food for even longer. As National Public Health Week draws to a close, I can’t help but think about the urgent need for progress now, not in four years, if we’re to improve the health of this country, let alone become the healthiest nation by 2030. If Mr. Gottlieb secures the FDA commissioner position, he must remember that he is beholden to our public health, not to the pharmaceutical or food industry’s bottom line.

Here’s What the EPA Budget Cuts in a Leaked Memo Mean for Health and Environmental Justice

UCS Blog - The Equation (text only) -

Recent news reports point to a leaked memo that provides more details about the Trump administration’s proposed deep cuts to the Environmental Protection Agency’s (EPA’s) budget. If the details are officially confirmed, it would clearly show that the administration is preparing to undermine health protections nationwide, and especially in low income and minority communities. The administration is also seeking to undercut the role of sound science at the agency.

Congress should refuse to allow these harmful cuts to go forward.

How the budget cuts hurt the EPA’s work

Here’s the big picture: If implemented, the deep budget and staffing cuts proposed by the Trump administration would undermine the core mission of the EPA to protect human health and the environment. There is simply no way for the agency to continue to do its job well while losing about a third of its overall budget, with even deeper cuts to many critical programs.

Here are just three of the many important aspects of the EPA’s work that are harmed by the proposed budget cuts outlined in the leaked memo:

1. Programs critical for public health, the environment and the economy of states.

The Trump administration is attempting to cut budgets and funding for programs that are critical for states. These include:

  • Cuts to grants for state, local and tribal management of air and water quality. These grants are critical for state and local authorities to monitor and enforce air and water pollution safeguards. UCS President Ken Kimmell, former Commissioner of the Massachusetts Department of Environmental Protection, recently explained how states are in no position to make up for shortfalls that arise from EPA budget and staffing cuts. This will inevitably threaten public health protections.
  • Cuts to Children’s Health Program resources. The leaked memo says “This decision reduces Children’s Health program resources by $2,391K and 14.9 FTE to prioritize core environmental work.” Wow, that’s stunning! So protecting children’s health is NOT core work for the EPA? That would be news to the American public.
  • Total elimination or cuts to many EPA regional programs, including ones focused on the Chesapeake Bay, the Gulf of Mexico, the Great Lakes, South Florida, San Francisco Bay, and Puget Sound. All these programs not only help reduce pollution, they are also vital for the regional economies. The Chesapeake Bay program, for example, is a collaborative effort among Delaware, Maryland, New York, Pennsylvania, Virginia, West Virginia, the District of Columbia, the Chesapeake Bay Commission, and the EPA, focused on reducing the pollution load in the historically beleaguered Bay and thereby supporting local economies, fishing, swimming, tourism, and protecting drinking water sources (with benefits accruing in waterways well beyond the Bay itself).
  • Major cuts to the budget of the Office of Enforcement and Compliance Assurance, including cuts to Civil and Criminal Enforcement and Compliance Monitoring. It’s really hard to see these cuts as anything but a sellout to polluting industries. Robust enforcement is what gives teeth to our nation’s pollution laws, including the Clean Water Act and the Clean Air Act.
  • Cuts to Superfund enforcement. Superfund sites are among the most polluted sites in the country and EPA works to help clean up hazardous waste and monitor these sites. Take a look at this map and see if you have one of the Superfund sites that made the National Priority List for clean-up near where you live. Just to give a sense around the country: Alaska has 10 Superfund sites, Tennessee has 28, Alabama has 18, California has 112, and Maine has 16. If you live in or near one of the sites that still need remediation, cuts to the EPA’s budget could directly affect you.
  • Cuts to programs that help reduce the risk of pesticides to human health and the environment. Administrator Pruitt has already set a bad precedent through his decision not to ban chlorpyrifos, a pesticide that poses a clear risk to children, farm workers, and rural drinking water users. Cuts to budgets for programs that limit pesticide risks would just continue down that misguided path.
2. Protections for environmental justice communities, especially low-income, minority and tribal communities

Because EPA’s core mission is the protection of public health, its activities are especially important for communities that bear a disproportionate burden of health impacts from pollution. Many of these environmental justice (EJ) communities are low-income, minority and tribal communities. Harms to these communities will be especially pronounced if the EPA’s overall budget is slashed.

As a quick reminder, here’s how the EPA defines environmental justice:

Environmental justice is the fair treatment and meaningful involvement of all people regardless of race, color, national origin, or income, with respect to the development, implementation, and enforcement of environmental laws, regulations, and policies. 

The agency says this goal will be achieved for all communities and people when everyone enjoys:

  • the same degree of protection from environmental and health hazards, and
  • equal access to the decision-making process to have a healthy environment in which to live, learn, and work.

It’s hard to see who would oppose these fundamentally fair and commonsense goals, but undermining them is entirely in keeping with an administration that has shown itself to be hostile to concerns about racial justice across the board.

In addition to overarching budget cuts that will disproportionately hurt EJ communities, the administration is also proposing to cut specific EPA programs targeted at disadvantaged communities. That’s gratuitously cruel, especially given the small budgets associated with these programs.

Here’s a list of some of the most egregious cuts to EJ priorities: elimination of the EPA’s Office of Enforcement and Compliance Assurance’s Environmental Justice program (and its small grants program); cuts to budgets for compliance with Title VI of the Civil Rights Act, elimination of the lead risk reduction program and state grants for lead monitoring and enforcement; and cuts to the Brownfields program that helps remediate contaminated sites and revitalize communities.

Consider the cuts in funding for lead risk reduction programs. States and local jurisdictions simply do not have the funding or the expertise to make up for cuts in federal funding for these vital programs. According to the CDC, which maintains the latest county-level data for lead levels:

Today at least 4 million households have children living in them that are being exposed to high levels of lead. There are approximately half a million U.S. children ages 1-5 with blood lead levels above 5 micrograms per deciliter (µg/dL), the reference level at which CDC recommends public health actions be initiated.

Lead exposure has serious consequences for the health of children, and can result in behavior and learning problems, lower IQ and hyperactivity, slowed growth, hearing problems, and anemia. What’s more, according to the CDC, African American children are three times more likely than white children to have elevated blood-lead levels, amounting to a public health crisis in some places.

Or consider the work the EPA is doing to help address air quality concerns in tribal communities in Alaska. Pollution from diesel emissions, indoor air quality concerns, and emissions from burning solid waste and from wood-burning stoves are among the serious challenges these communities face.

Just last year the EPA provided grants totaling over $500,000 through the Brownfields program to Chattanooga and Knoxville, TN. These grants will help disadvantaged communities clean up and revitalize contaminated sites, which in turn will boost the local economy and improve public health. There are many Brownfields success stories around the country.

The recent resignation of Mustafa Ali, a key leader of the EPA’s environmental justice program, is a sad commentary on where this work is likely to be headed under Administrator Scott Pruitt. In his resignation letter addressed to Administrator Pruitt, Ali said:

“When I hear we are considering making cuts to grant programs like the EJ small grants or Collaborative Problem Solving programs, which have assisted over 1,400 communities, I wonder if our new leadership has had the opportunity to converse with those who need our help the most.”

3. Scientific research and data, most prominently climate science

Many aspects of the EPA’s scientific work are under attack, including all of its work related to climate change. Perhaps this is only to be expected under an administration that is peddling a new form of climate denial, but that doesn’t diminish how outrageous these actions are.

(In case you missed it, watch EPA Administrator Scott Pruitt’s widely-panned appearance on Fox News where he continued his dissembling on the “CO2 issue.” The relevant excerpt starts at the 5:08 mark.)

The Trump administration is aiming to eliminate the Office of Air and Radiation’s Climate Protection Program. This program works with state, local, and tribal entities to provide expertise on climate solutions including energy efficiency, renewable energy, and adaptation to climate impacts. At a time when the serious consequences of climate change are so clear, this type of help is sorely needed.

But that’s not all: Trump’s budget proposes to cut funding for the EPA’s Science Advisory Board (SAB), a source of independent peer review for the agency’s scientific and technical information and scientific advice for the EPA Administrator. Congress directed the EPA to set up the SAB in 1978 and it has served a very important role through multiple administrations to help ensure science-based policymaking. The leaked memo literally says that cuts to the funding and staffing for the SAB “reflect an anticipated lower number of peer reviews.” I suppose that means this administration has arbitrarily decided to deprioritize independent science and scientific oversight, a losing proposition for the American public.

In addition, the EPA’s Environmental Education and Regional Science and Technology programs are targeted for elimination. The RS&T program works together with a network of regional laboratories around the country to bring good science to bear on environmental protection measures.

My colleague Dave Cooke highlights other important harms related to potential loss of funding for the EPA Vehicle Lab. And Karen Perry Stillerman has written about the impacts of loss of funding for EPA’s work on clean water.

Congress must resist harmful cuts to the EPA budget 

Some of the broader details of the leaked memo accord with the budget blueprint released by the administration last month, which would indicate that these are likely to be real threats. Senators and Representatives should consider the destructive impacts on their constituents in their home states and speak out against the decimation of the EPA’s budget and staffing.

It’s especially important to elevate the concerns of communities that have historically been sidelined and face a disproportionate burden of pollution. Let’s not have another Flint water crisis, or Elk River chemical spill, or Kingston coal ash spill.

Mustafa Ali’s resignation letter, addressed to Administrator Pruitt, also says:

“I strongly encourage you and your team to continue promoting agency efforts to validate these communities’ concerns, and value their lives.”

Ultimately, that’s what this is about: Not just budget and staffing numbers at the EPA, but the impact on the lives and well-being of people around the country. Congress, which has the final say on the federal budget, must strenuously resist these cuts to the EPA’s budget.

Photo: EPA

Americans Are Worried about Water Pollution (And They Should Be)

UCS Blog - The Equation (text only) -

Apparently the Trump administration hasn’t heard about the latest Gallup poll, which puts Americans’ concerns about water pollution and drinking water at their highest levels since 2001. Why do I say this? Because in addition to rolling back a key Obama-era clean water rule, a leaked EPA memo reveals that the administration intends to slash or eliminate funding for a slew of water programs and initiatives. And while recent and ongoing crises like the one in Flint have highlighted urban drinking water problems, it is also true that rural communities—whose voters helped put President Trump in office—have plenty to worry about.

Gallup’s annual Environment Poll found that 63 percent of Americans worried “a great deal” about pollution of drinking water, and 57 percent have a similar level of concern about pollution of rivers, lakes and reservoirs. Such levels of concern about drinking water were highest among non-white and low-income groups, but were reported by majorities of respondents across racial and income lines.

The Trump administration is trashing clean water protections

Against this backdrop of Americans’ rising water worries, President Trump is taking actions that will actually make the nation’s waters dirtier. First he staffed his administration with Big Ag and Big Oil boosters, including his EPA chief Scott Pruitt. Then he signed an executive order to begin undoing the EPA’s Clean Water rule, over which (not coincidentally) Pruitt sued the EPA while serving as Oklahoma attorney general. To emphasize his disdain, the President called the rule, “horrible, horrible.”

But what’s really horrible is what the Trump administration did next. As the Washington Post reported last Friday, a leaked EPA memo sheds new light on the budget cuts previewed a few weeks earlier. My colleagues have documented how cuts will impact clean vehicle programs and climate research, so here I’ll focus on implications for EPA’s clean water work. Bottom line: it’s worse than you thought. The memo names at least 17 water-focused programs and sub-programs slated for total elimination, and others that would face sharply reduced funding. By my tally, the cuts to EPA Office of Water programs total more than $1 billion.

That’s deeply troubling, because when the administration yanks precious dollars from clean water programs, people and communities suffer. When cleaning up pollution in Lake Michigan, restoring wetlands around Puget Sound, preventing farm runoff into the Chesapeake Bay, or testing drinking water in rural Maine doesn’t happen because there’s no money and no staff, people will be hurt. People’s health, people’s recreational opportunities, people’s livelihoods. And costs that could have been averted balloon instead.

Water worries are rising in farm country

It’s not just urban or industrial communities that will suffer from the Trump administration’s budget cutting. The Washington Post reported last weekend on the irony that many cuts would disproportionately hurt the rural communities that supported President Trump, because they rely heavily on federally funded social programs. The article didn’t mention water pollution, but it’s a fact that water supplies in (and downstream from) agricultural areas bear a heavy burden of contamination from farm runoff. High levels of fertilizer-derived nitrates in drinking water, which can cause severe health problems in infants, are a particular concern. The USDA has estimated the cost of removing agricultural nitrates from public water supplies at about $1.7 billion per year, and the total cost of environmental damage from agricultural nitrogen use has been estimated at $157 billion annually. Rural communities and cities like Des Moines, Iowa, are struggling to deal with the problem. And cuts to EPA monitoring and cleanup programs in rural areas could just make it worse.

A false choice

When the Trump administration talks about gutting environmental protections, their argument seems to boil down to, “because jobs.” But that’s a false choice. And the damage industrial agriculture wreaks on the nation’s water resources is a prime example. It affects millions of Americans—rural and urban water consumers, of course, but also taxpayers responsible for pollution cleanup, and boaters, fishers, and business operators that depend on clean water. And it affects farmers, because they too need clean water and healthy soil to be able to keep farming over the long term.

Last summer, UCS documented the potential benefits to farmers, taxpayers, and businesses from an innovative farming system that integrates strips of perennial native prairie plants with annual row crops. Researchers who developed the system in Iowa found that by planting prairie strips on just 10 percent of farmland, farmers could reduce nitrogen loss to rivers and streams by 85 percent, phosphorus loss by 90 percent, and sedimentation by 95 percent—all while maintaining farm productivity.

UCS further estimated that the prairie strips system, if adopted across the nation’s 12-state Corn Belt, would generate more than $850 million per year in net savings to farmers and society from reductions in fertilizer use and surface water runoff. In the coming weeks, we’ll follow up with analysis of another farming system based on extended crop rotations, which also promises to keep farmers profitable while reducing pollution.

Smart farm policy can deliver clean water and rural prosperity

This is timely, because Congress is already at work on the 2018 farm bill, that massive piece of legislation that comes around every five years and shapes the nation’s food and farming system. And while the Trump administration has shown utter disregard for the environment that all Americans depend on, for scientific evidence of what works, and even for the particular needs of the farmers and rural voters who put him in office, we’re betting that more reasonable voices will prevail. We’re mounting a campaign to protect the nation’s precious water resources while simultaneously improving farmers’ yields and creating economic opportunities in rural communities. We will mobilize UCS supporters, form common cause with farmer organizations, and join with other allies to call for policies that invest in such systems. Stay tuned.
