Combined UCS Blogs

New Arctic Climate Change Report: Stark Findings Confront Secretary of State Tillerson Ahead of G7

UCS Blog - The Equation (text only) -

On May 11, US Secretary of State Rex Tillerson will cap two years of US chairmanship of the Arctic Council, presenting progress made over that time and looking at likely future directions.

The forthcoming declaration by the Nordic ministers puts climate change front and center in the lead-up to this week’s Arctic Ministerial meeting. The world is paying attention and will be looking for how the issue of climate change is addressed in the Arctic Council ministerial statement, including any signals indicating how Secretary Tillerson might characterize future US actions under the Paris Climate Agreement.


Stark findings

The Arctic Monitoring and Assessment Programme report, Snow, Water, Ice and Permafrost in the Arctic (SWIPA 2017), will be presented at the Arctic Ministerial meeting this week. It includes two stark findings. First, the least bad scenario for sea level rise has gotten a lot worse: what scientists thought was the best possible chance (i.e., the lower end of the confidence range) for a slow and manageable sea level rise under a fully implemented Paris Climate Agreement just got faster and higher. Second, the global costs stemming from changes in the Arctic region over this century run into the trillions of dollars.

One reason we are in suspense is that there is one additional seat at the table—the proverbial seat occupied by the elephant in the room (i.e. evidence from the just-released science report requested by the Arctic Council).

It is likely that a binding agreement for continued scientific cooperation will be signed by the eight Arctic nations. Will the security implications of the SWIPA 2017 report be a cause for recalibration of the mix of investments in climate adaptation and mitigation (i.e., tackling the root causes of accelerating changes in the Arctic)?

The forthcoming Fairbanks Declaration from this tenth Arctic Ministerial may well reverberate around the world, with implications for the G7 leaders’ summit in Sicily at the end of May.

Arctic warning: Time to update adaptation plans for sea level rise

According to SWIPA 2017, Arctic land ice contributed around a third of global sea level rise between 2004 and 2010. Overall, two-thirds of global sea level rise is attributed to the transfer of water previously stored on land (as ice or underground or in other reservoirs on land) and one-third of global sea level rise is attributed to warming of the ocean.

Global Sea Level Rise Contributions 2004-2010

Global sea level rise is attributed to a third from warming of the ocean and two thirds from the transfer of water previously stored on land (as ice or underground or in other reservoirs on land) to the ocean. Source: AMAP SWIPA 2017

The SWIPA 2017 report compares the “greenhouse gas reduction scenario” (known as RCP 4.5, which also serves as a proxy for an emissions scenario consistent with the long term goals of the Paris Climate Agreement) with the high emissions scenario (known as RCP 8.5 and used as a proxy for business as usual without a Paris Agreement).

It may be time to update adaptation plans to fully take into account more realistic projections of global sea level rise—SWIPA 2017 “estimates are almost double the minimum estimates made by the IPCC in 2013” for global sea level rise from all sources.

The difference between a fully implemented Paris Climate Agreement scenario and business as usual could not be more stark. The report declares that “the rise in global sea level by 2100 would be at least 52 cm (20 inches) for a greenhouse gas reduction scenario and 74 cm (29 inches) for a business-as-usual scenario.” These best estimates represent the likely “locked-in” range for minimal, least-cost coastal adaptation, depending on the choices we make to reduce heat-trapping emissions and short-lived climate forcers.

Arctic slush fund: The high costs of displaced communities, melting, flooding, and burning in the Arctic

The Arctic matters to all of us: what happens in the Arctic does not stay in the Arctic. Case in point is the recent economic analysis presented in SWIPA 2017.

Global cumulative costs of changes underway in the Arctic are likely to total $7 trillion to $90 trillion over 2010–2100. The costs include a wide range of climate change consequences, from Arctic infrastructure damage to communities exposed to sea level rise. For comparison, the US gross domestic product in 2016 was $18.6 trillion in current dollars.
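To put the cumulative figure in annual terms, here is a back-of-the-envelope sketch in Python that spreads the $7 trillion to $90 trillion range evenly over 2010–2100. The even spreading is an illustrative assumption, not something the report claims:

```python
# Annualize the cumulative Arctic cost estimate for comparison with US GDP.
# Even spreading over the period is a simplifying assumption for illustration.

years = 2100 - 2010 + 1          # 91 years, inclusive
low, high = 7e12, 90e12          # cumulative cost range from SWIPA 2017 (USD)
us_gdp_2016 = 18.6e12            # 2016 US GDP in current dollars

print(f"~${low / years / 1e9:.0f}B to ~${high / years / 1e9:.0f}B per year")
# ~$77B to ~$989B per year
print(f"High end is ~{high / years / us_gdp_2016:.0%} of 2016 US GDP each year")
# High end is ~5% of 2016 US GDP each year
```

Even at the low end, the annualized cost is comparable to the budget of a large federal agency.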

Implications for the G7 summit and Paris Climate Agreement

The Arctic Ministerial meeting May 11 is a chance for high level officials from the Arctic Council to meet and discuss progress in a setting historically noteworthy for peaceful cooperation to achieve shared goals.

There is a high degree of overlap between the Arctic Council members and observer non-Arctic states and the participants in the Group of 7 (G7) summit in Sicily a few weeks later. There is also a high degree of overlap between the highest-emitting nations and the members of the Arctic Council.

The lessons learned and issues of climate change that are grappled with during the Arctic Ministerial may very well carry through to the G7 forum. After the summit we expect to hear more definitively about US actions regarding contributions to the Paris Agreement going forward.

For the moment, eyes are focused on Secretary of State Tillerson and his remarks in Fairbanks, Alaska, and on the Fairbanks Declaration, expected to be signed on May 11.




5 Reasons Why the Regulatory Accountability Act is Bad for Science


Last week, Senator Rob Portman introduced his version of the Regulatory Accountability Act (RAA), a bill that would significantly disrupt our science-based rulemaking process. A version of this inherently flawed, impractical proposal has been floating around Washington for nearly seven years now, and the latest, S. 951, is just as troubling as previous iterations.

The impact of the RAA will be felt by everyone who cares about strong protections and safeguards established by the federal government. Think about food safety, environmental safeguards, clean air, clean water, the toys that your kids play with, the car you drive, workplace safety standards, federal guidance on campus sexual assault, financial safeguards, protections from harmful chemicals in everyday products, and more. You name it, the Portman RAA has an impact on it.

The Portman RAA is at best a solution in search of a problem. It imposes significant new and burdensome requirements on every single federal agency charged with using science to protect consumers, public health, worker safety, the environment, and more, at a time when Congress and the president are cutting agency resources. It also requires agencies to finalize the most “cost effective” rule, which sounds nice, but in practice is an impossible legal standard to meet and would most likely result in endless litigation. This requirement is emblematic of the overall thrust of the bill: a backdoor attempt to put the interests of regulated industries ahead of the public interest.

Basically, because there isn’t public support for repealing the Clean Air Act, the Clean Water Act, the Consumer Product Safety Act, and other popular laws that use evidence to protect the public interest (including civil rights and disabilities laws, worker protection laws, transportation safety laws, and more), the Portman RAA weakens the ability of agencies to implement these laws by rewriting the entire process by which safeguards for Americans are enacted. In doing so, the Portman RAA would impact everyone’s public health and safety, especially low-income communities and communities of color, which often face the greatest burden of health, environmental, and other safety risks.

For this blog, I have chosen to focus on what the Portman RAA means for the scientific process that is the foundation for federal rulemaking. For information on all of the other troubling provisions in the legislation, legal scholars at the Center for Progressive Reform have a neat summary here.

Here are 5 destructive provisions in the Portman RAA as they relate to science and science-based rulemaking. Bear with me as we take this journey into administrative law Wonkville.

1. The RAA ignores intellectual property, academic freedom, and personal privacy concerns.

S. 951 includes harmful language similar to the infamous HONEST Act (previously known as the Secret Science Reform Act) and applies it to every single agency. While the Portman RAA language (page 7 starting at line 19 and page 25 starting at line 14) includes some exemptions that address the criticisms UCS has made of the HONEST Act, the bill would still require agencies to make publicly available “all studies, models, scientific literature, and other information” that they use to propose or finalize a rule.

The exemptions fall considerably short because the language has zero protections for intellectual property of scientists and researchers who are doing groundbreaking work to keep America great. For most scientists, especially those in academia and at major research institutions, much of this work, such as specific computer codes or modeling innovations, is intellectual property and is crucial for advancement in scientific understanding as well as career advancement.

In effect, this provision of the Portman RAA would prevent agencies from using cutting-edge research because scientists will be reluctant to give up intellectual property rights and sacrifice academic freedom. In addition, many researchers don’t or can’t share their underlying raw data, at least until they have made full use of it in multiple publications.

Given that the research of scientists and the expertise built up by labs is their scientific currency, S. 951’s intellectual property and academic privacy language would lead to one of two outcomes:

  • One, it would stifle innovation, especially when it comes to public health and safety research, as many early career scientists may not want to publicly share their code or computer models and undermine their careers. Scientists could risk all their ideas and work being pirated through the rulemaking docket if a federal agency wanted to use their information as part of the basis for proposing and/or finalizing a regulation.
  • Two, agencies wouldn’t be able to rely on the best available science in their decision-making process because those who have the best information may not want to make their intellectual property public. And of course, agencies are required to propose and finalize regulations based on the best available science. This is even reaffirmed by the Portman RAA (more on that later). Thus, you have a catch-22.

Like the HONEST Act, this language fundamentally misunderstands the scientific process. There is no need for anyone to have access to computer models, code, and other such materials in order to understand the science. Industry understands this very well because of patent law and because of the trade secrets exemptions (industry data would be exempt from the disclosure requirements that would apply to intellectual property and academic research), but there is no equivalent protection for scientists, whose basic goal is to advance understanding of the world and publish their work.

And while the exemptions attempt to ensure protections of private medical data, they do not go far enough. For example, agencies that rely on long-term public health studies to propose and finalize science-based regulations could still be forced to disclose underlying private health data related to a study participant’s location and more, all of which may lead to someone’s privacy being put at risk.

2. The RAA puts science on trial.

The Portman RAA provides an opportunity for industry to put the best available science that informs high-impact and major rules on trial. In a provision (page 16 lines 13-17)  that reminds me of Senator Lankford’s radical BEST Act, S. 951 will give industry an opportunity to initiate an adversarial hearing putting science and other “complex factual issues that are genuinely disputed” on trial.

But what does it mean for science and other facts to be genuinely disputed? The RAA is silent on that point. Hypothetically, if an industry or any individual produces their own study or even an opinion without scientific validity that conflicts with the accepted science on the dangers of a certain chemical or product (say atrazine, e-cigarettes, chlorpyrifos pesticide, or lead), federal agencies charged with protecting the public using best available science would be forced to slow down an already exhaustive process. The thing is, you can always find at least one bogus study that disagrees with the accepted facts. If this provision had been around when the federal government was attempting to regulate tobacco, the industry would have been able to use it to create even more roadblocks by introducing bogus studies to dispute the facts and put a halt to the public health regulations.

This is just another way to elongate (and make less accessible to the public) an already exhaustive rulemaking process where everyone already can present their views through the notice-and-comment period. This provision plays up the “degree of uncertainty” that naturally exists in science, while ignoring a more sensible “weight of evidence” approach, which is exactly what opponents of science-based rulemaking want. This adversarial hearing process does nothing to streamline the regulatory process, but it does make it harder for federal agencies to finalize science-based public health, safety, and environmental protections. The Scopes-Monkey trial has already taught us that putting science on trial doesn’t work. It was a bad idea nearly 100 years ago, and it’s a bad idea today.

3. The RAA adds red tape to the science-based rulemaking process.

The Portman RAA, ironically, includes duplicative language that requires proposed and final rules to be based on the “best reasonably available” science (page 8 lines 10-14 and page 25 lines 14-18). The thing is, this already happens. Many underlying authorizing statutes, such as the Clean Air Act, have this requirement, and to the extent that this bill is supposed to streamline the regulatory process, this appears to do the opposite. If anything, this is litigation bait for industry, meaning that the legally obscure language could be used to sue an agency and prevent science-based rules from being implemented.

The thing is, anyone can already challenge the scientific basis of regulations since they are already required to be grounded in facts. This just rests upon a faulty assumption that agencies aren’t doing their jobs. The bottom line? Through this and other provisions, S. 951 adds redundancy and procedure when the supporters of the bill are claiming to get rid of it.

4. The RAA has imprecise language that could force burdensome requirements on agency science.

The Portman RAA uses vague language to define agency “guidance” (page 2, lines 14-16) that could be interpreted to encompass agency science documents, such as risk assessments. For example, if an agency conducts a study on the safety of a chemical, finds an associated health risk, and publishes that document, would the study be subject to the burdensome RAA requirements on guidance (i.e., would it have to go through a cost-benefit analysis)? The language is ambiguous enough that this remains an open question.

Furthermore, by adding additional requirements for guidance documents, such as cost-benefit analysis, it would make it harder for regulators to be nimble and flexible to explain policy decisions that don’t have the binding effect of law, or to react to emerging threats. For example, the Centers for Disease Control and Prevention (CDC) has frequently used guidance documents to quickly communicate to the public and healthcare providers about the risks associated with the Zika virus, an emerging threat that required a swift response from the federal government. Just imagine the amount of time it would take for the CDC to effectively respond to this type of threat in the future if the agency was forced to conduct a cost-benefit analysis on this type of guidance.

Overall, many agencies use guidance as a means of explaining how they interpret statutory mandates. Because they don’t have the effect of law, they can be easily challenged and modified. The new hurdles simply prolong the guidance process and make it more difficult for agencies to explain interpretations of their legal mandates.

5. The RAA increases the potential for political interference in agency science.

The Portman RAA would give the White House Office of Information and Regulatory Affairs (OIRA) the power to establish one set of guidelines for risk assessments for all of the federal science agencies (page 33 lines 16-18). The thing is, this one-size-fits-all idea is unworkable. Individual agencies set guidelines for the risk assessments they conduct because different issues require different kinds of analysis. OIRA is staffed largely by economists, not scientific experts who can appropriately assess public health and environmental threats. Under this bill, OIRA would also determine the criteria for what kinds of scientific assessments should be used in rulemaking. This office should not have the responsibility to put forward guidelines dictating agency science. This is a clear way to insert politics into a science-based decision. My colleague Genna Reed will be expanding on this point specifically later this week because of how troubling this provision is.

For a proposal that is aimed at streamlining the regulatory process, the question must be asked: streamlining for whom? If anything, the Portman RAA grinds the issuance of science-based protections to a halt and adds red tape to a process that is already moving at a glacial pace.

The bottom line is that this latest version of the RAA, albeit different from previously introduced versions in the Senate and somewhat distinct from the House-passed H.R. 5, leads to the same outcome in reality: a paralysis by analysis at federal agencies working to protect the public from health and environmental threats and a potential halt to the issuance of new science-based standards to ensure access to safe food, clean air and clean drinking water, and other basic consumer protections.


Solar Jobs, Coal Jobs, and the Value of Jobs in General


Science isn’t done by guesswork or gut instinct. It requires expertise not only to conduct but to evaluate; in-depth research in a field outside of my own is often beyond my ability to critique. I don’t have the knowledge to review a paper on molecular biology, although I might notice a really blindingly obvious flaw.

I have more knowledge of economics than I do of molecular biology. Even so, it’s not my primary field of expertise, so when I saw a recent post by American Enterprise Institute scholar Mark J. Perry, I was a little confused. His “Bottom Line” includes this: “The goal of America’s energy sector isn’t to create as many jobs as possible… we want the fewest number of energy workers.”

Is there something I’m missing? Is AEI actually saying that jobs are bad?

Don’t we want jobs?

Now, the basic economic argument behind Dr. Perry’s argument is that labor carries a cost, and producers of any sort of good seek to keep their costs down. This is a story as old as civilization. As agriculture improved and we needed fewer hands to work on the farms, people moved to the towns and cities and produced new goods and services. In the Industrial Revolution, machines reduced the number of workers needed to produce a given quantity of goods. Luddite rioters smashed a few machines in protest, but more were built, and society adapted.

In the coal sector, automation cost jobs well before the shale gas boom made its impact felt. It’s important to recognize that even if the economy as a whole benefits by replacing people with machines in repetitive tasks, in the near term the individuals who were doing those tasks—coal miners, but also people in many other fields, such as factory workers, cashiers, truck drivers, even accountants and paralegals—will be adversely affected by losing their jobs. The displaced workers and their communities will face economic, social, and health impacts. We need to plan for these shocks better than we have in the past.

Human labor is not simply a fungible, tradeable commodity like a ton of steel or a bushel of wheat; it is far more complex. Human beings have feelings, needs, and rights. The field of “labor economics” examines this topic in more detail.

Coal and solar jobs

Dr. Perry argues that solar’s greater reliance on human labor is a flaw, not an asset. And in doing so, he makes a very basic mathematical error.

The solar industry employed, depending on the source, 260,000 to 374,000 workers in 2016. Solar power produced 56.22 million megawatt-hours according to the U.S. Energy Information Administration (Dr. Perry mistakenly includes a much lower figure). Meanwhile, coal employed about 160,000 workers and produced 1240 million megawatt-hours. Therefore, solar needed about 35-50 times as many workers per unit of electricity produced.

But here’s the major error: the solar workers in 2016 installed systems that will generate power for decades. Or were researching new solar panel chemistries that might pay off decades in the future. The labor of coal workers in 2016 was largely fueling and operating existing power plants for only that year. On an amortized basis of labor-per-kilowatt-hour, the two technologies would be a great deal closer. By a rough estimate, it would be a ratio of about two to one.
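The arithmetic above can be checked with a short Python sketch. The employment and generation figures are those cited in this post; the roughly 25-year system lifetime used for the amortized view is an illustrative assumption:

```python
# Workers per megawatt-hour, solar vs. coal, using the figures in the post.
# The ~25-year lifetime for amortizing solar installation labor is an
# illustrative assumption, not a precise industry number.

solar_workers_low, solar_workers_high = 260_000, 374_000
solar_mwh = 56.22e6        # 2016 US solar generation, MWh (EIA)

coal_workers = 160_000
coal_mwh = 1_240e6         # 2016 US coal generation, MWh

coal_intensity = coal_workers / coal_mwh
ratio_low = (solar_workers_low / solar_mwh) / coal_intensity
ratio_high = (solar_workers_high / solar_mwh) / coal_intensity
print(f"Naive ratio: {ratio_low:.0f}x to {ratio_high:.0f}x")
# Naive ratio: 36x to 52x

# Amortized view: 2016 solar labor mostly built systems that will generate
# for decades, while 2016 coal labor mostly produced that year's output.
lifetime_years = 25
print(f"Amortized ratio: {ratio_low / lifetime_years:.1f}x to "
      f"{ratio_high / lifetime_years:.1f}x")
# Amortized ratio: 1.4x to 2.1x
```

The amortized result lands near the rough two-to-one estimate in the text.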

There are also differences in costs per worker. Coal has higher labor costs per employee, when you factor in executive bonuses, hazard pay, and the pension and health benefits that unions secured for workers in decades past. These benefits are necessary because of the health risks faced by coal workers. Solar doesn’t pay badly, but coal mining was historically a very good-paying if risky job, as my colleague Jeremy Richardson points out. Factor that in, and the gap in labor costs per unit of electricity shrinks even further.

So is that the end of the story? No. Labor isn’t the only input to production. Industrial output can be modeled with something called a Cobb-Douglas Production Function. This equation states that production requires multiple inputs, like capital, labor, and materials, and that these can be substituted for one another to a greater or lesser degree.
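As a toy illustration (not a model of the actual energy sector), a Cobb-Douglas function with made-up parameters shows how capital and labor can substitute for each other while producing the same output:

```python
# Toy Cobb-Douglas production function: Y = A * K^alpha * L^(1 - alpha).
# A and alpha are made-up illustrative values, not estimates for energy.

def output(capital: float, labor: float, A: float = 1.0,
           alpha: float = 0.5) -> float:
    """Output produced from capital K and labor L."""
    return A * capital**alpha * labor**(1 - alpha)

# A capital-heavy input mix (coal-like) and a labor-heavy mix (solar-like)
# can yield the same total output.
print(output(capital=400, labor=100))  # 200.0
print(output(capital=100, labor=400))  # 200.0
```

The point of the substitution is the one made above: if solar uses more labor yet delivers cheaper electricity, it must be using less capital and materials per dollar of output.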

Since the unsubsidized costs of new solar power have now fallen below those of new coal power, if solar power uses more labor, then it must be making more efficient use of capital and materials (in dollar terms) than coal power. Solar gives more people jobs, uses less other stuff in the process, creates less pollution, and comes out ahead.

The value of solar jobs

Is the labor dependence of solar power a bad thing? Not for the men and women who are actually working in the field. Not for you and me and the other consumers, and our utilities who are buying solar power at record-low price levels. Not for a country seeking to create opportunities for its citizens.

In basic economics, prices tell the story. From that point of view, solar’s labor requirements are clearly not a disadvantage. When we consider that labor and jobs are actual people’s livelihoods, and not just numbers on a page, it becomes clear that the labor requirements are actually beneficial.

What Does Scott Gottlieb’s Leadership Mean for Scientific Integrity at the FDA?


Later this afternoon the Senate will vote to confirm Scott Gottlieb as the next U.S. Food and Drug Administration (FDA) commissioner. What does this mean for scientists and science-based policymaking at the FDA? His conflicts of interest are certainly an indication that the pharmaceutical industry will benefit more from his tenure than Americans’ health.

Gottlieb is a medical doctor with extensive ties to the pharmaceutical industry, including GlaxoSmithKline and Vertex Pharmaceuticals, taking $400,000 from drug and medical device companies in consulting and speaking fees between 2013 and 2015. He has not been shy in his criticism of what he calls FDA’s “cumbersome” drug approval process, and has recommended fast-tracking approval by using surrogate markers to gauge the effectiveness of new products. Surrogate markers would allow drug companies to conduct shorter studies with smaller sample sizes. A win for drug companies, but not necessarily for public health.

Many have called President Trump’s band of department heads the “corporate cabinet” because of the astounding degree of conflicts the individuals have with regulated entities, i.e. corporations. They are the epitome of foxes guarding America’s henhouses. Despite the fact that Trump campaigned upon promises to “drain the swamp” of former lobbyists in federal government, his actions during his first several months in office have not supported his pledge. Gottlieb has mentioned that he would recuse himself from decisions within the next year having to do with all 20 companies with which he has had a financial relationship. But after one year, all bets are off. Scott Gottlieb will be making policy decisions that directly impact companies with which he has had financial ties and personal relationships. Judging from his past statements on the FDA regulatory process as well as his work history, it doesn’t seem likely that he will be making completely science-based, objective decisions that weigh the health of Americans over the profits of drug companies.

What FDA scientists think about industry influence on the agency’s mission

Back in 2015, we surveyed several agencies, including the FDA, to find out how scientific integrity was faring under the policies instituted by the Obama administration. Forty-five percent and 33 percent of surveyed FDA scientists said that the level of consideration of political interests and business interests, respectively, at the agency was “too high.”  One-third of respondents agreed that the FDA has been harmed by agency practices that defer to business interests.

When we asked how the mission of the FDA and the integrity of the scientific work produced by the FDA could best be improved, several themes arose. In addition to calls for increased agency funding and staffing, additional transparency, and more opportunities for scientists to attend conferences and collaborate with other scientists, survey respondents also called for less industry influence on science and policy decisions at the FDA. I urge Scott Gottlieb to take a look at some of the following concerns from real FDA scientists about the relationship between FDA and the pharmaceutical industry.

  • Industry should have fewer opportunities to interfere with science process, not more:
    • “Minimize the links between the Agency and the industry it regulates—primarily those that provide money from industry to the FDA and those that lead to future employment opportunities for FDA management.”
    • “Stop having industry make a call to FDA and “put pressure” on an approval.”
  • User fee arrangements have pressured the FDA to cater to industry and approve more drugs:
    • “The mission of those Centers which are funded in large part by user fees has been altered, in my opinion, to cater to industry and whatever their concerns are at the moment. There is an inherent conflict of interest in being paid/funded by the industries we are supposed to regulate. That said, FDA has made great strides in meeting MDUFA (user fee regulations) reduced review timelines and increasing the number of device approvals. However, my concern is that some devices that may not have met their efficacy endpoints—e.g., they don’t work—will get approved/cleared anyway because they don’t pose any safety concerns. I believe we should follow our “science”-based evidence rather than compromise to keep industry happy.”
    • “During the Bush years, our mission shifted from making sure drugs were safe and effective to making sure they were ‘safe, effective and approved in a timely manner.’ (That made no sense because if a drug is safe and effective, it *will* be approved). We need to institute safeguards so that business interests don’t influence decisions, depending on who is in the White House.”
    • “NOT taking money from industry via PDUFA funds and getting all FDA appropriations from the government and not making all the PDUFA quid pro quo to satisfy industry for its money. This PDUFA arrangement/program keeps increasing all kinds of arrangements to benefit industry.”
    • “I think that ADUFA has changed the working environment and there is a more pressure to approve drugs. Sponsors are more vocal in their complaints, whether or not they are justified.”
  • The revolving door at the FDA is dangerous:
    • “By stopping the revolving door of industry people who are brought into high level positions, wreak havoc during their tenure, then return to the industry from which they came.”
    • “Although it obviously would be difficult to achieve due to pressure from politicians and businessmen, it would be better if high level (and other) FDA employees were not part of the revolving door process in which people come from the regulated industry (or law firms representing the regulated industry), work for the FDA and then return to high paid jobs in the private sector.”
    • “I don’t think we should be filling leadership roles at the agency with ex-pharmaceutical company executive. I don’t think the fear of being sued by pharmaceutical companies should limit our regulatory authority.”

While agency scientists seem uncomfortable with the level of corporate involvement in regulations, industry is likely feeling pretty comfortable with the current regulatory landscape for food and drugs. Tom Price, who has invested in and made policy decisions in line with drug manufacturers while a member of Congress, is now the head of the US Department of Health and Human Services. In late April, U.S. Surgeon General Vivek Murthy was removed by the Trump Administration. Murthy had issued a 2016 report on e-cigarettes concluding that use among children and teenagers is a “major public health concern.” Shortly after he left office, the FDA announced it would be delaying enforcement of e-cigarette regulations that were finalized last year.

Last week, a leaked memo revealed that the Trump administration is planning on slashing funding to the White House Office of National Drug Control Policy by 95 percent during a time when heroin and prescription painkiller use in this country and related preventable deaths are spiraling out of control. And as obesity and related health consequences continue to rise,  Mr. Gottlieb has already indicated that he would be open to delaying implementation of changes to the nutrition facts label including an added sugar line, as the food industry has requested. It seems like any regulation that inhibits the ability of drug or food manufacturers to approve and introduce an endless stream of new drugs and food additives will be unpopular under this administration.

A plea from FDA scientists and the public

Mr. Gottlieb: I implore you to stick to the commitment you made during your confirmation hearing to make decisions based on safety and efficacy and to be “guided by the science,” “guided by the expertise of the career staff,” and “guided by impartiality and what’s good for patients as a physician.” As FDA commissioner you must remember that the FDA is beholden to the people, not the pharmaceutical or food industries.

And hey, your new employees agree. To protect scientific integrity at the FDA, surveyed scientists urge the agency (that’s you now!) to “recognize that FDA’s customers are the general public; the sponsoring companies submitting new treatments are not.” And to remember that “clear acceptance that our ‘customer’ is the American people, and our mission is protecting and promoting public health. Our customers are NOT industry, and performance metrics should not be geared to appeasing industry stakeholders at the expense of the public.” Let the expertise of those career scientists be your guide!

How Oats Could Save Iowa’s Farmers (and Fight Pollution)

UCS Blog - The Equation (text only) -

That bowl of oatmeal pictured above was my breakfast this morning. The strawberries were from nearby Virginia (hello, spring!) but the oats may have come from as far away as Sweden, Finland, or Canada. In the future, my morning oats could be grown much closer to home, in a state like Iowa that is now dominated by corn and soybeans. A new UCS report shows why that would be a good thing for US farmers and our environment.

Today’s Midwestern Corn Belt produces two crops—the aforementioned corn and soybeans—in abundance; however, this system has grown steadily less beneficial for farmers over time. US corn and soybean growers achieved record-high harvests in 2016. But due to oversupply, prices farmers receive for these crops have plummeted, and 2016 US farm incomes were expected to drop to their lowest levels since 2002.

Endless rotations of corn and soy aren’t environmentally sustainable either. This system typically leaves fields bare for much of the year and uses tillage (plowing) practices that erode away farmers’ soil. It loads on synthetic fertilizer, leading to a nitrogen pollution problem that costs the nation an estimated $157 billion per year in human health and environmental damages.

Rural communities suffer many of the consequences, with Iowa high on the list of states with surface water pollution from fertilizers, pesticides, and eroded soil. And the negative effects extend far beyond the Midwest. Corn Belt watersheds are major contributors to the annual “dead zone” in the Gulf of Mexico, and nitrous oxide emissions from farm soils make up 5 percent of the US share of heat-trapping gases responsible for climate change.

Diverse crop rotations offer multiple benefits

Fixing these problems is a little more complicated than simply planting oats, but not a lot. For the last 14 years, Iowa State University researchers have compared the typical Iowa corn-soy system with something that looks just a bit different. Innovative three- and four-year systems add combinations of winter-growing small grains (yes, those oats), an off-season cover crop, and alfalfa, a perennial crop that adds nitrogen to the soil.

I wrote years ago about the enhanced crop yields, steady profits, and reduced pesticide use and pollution produced by these year-round ground-covering rotations, and Iowa State’s most recent data continue to reflect these benefits. Average corn yields are 2 to 4 percent higher, soybean yields are 10 to 17 percent higher, and profits are similar to corn-soy alone. While cutting herbicide use by as much as 51 percent, the system positively slashed herbicide runoff into streams by as much as 96 percent, and it reduced total nitrogen fertilizer application rates by up to 57 percent as well.

Now, a groundbreaking analysis by UCS senior economist Kranti Mulik shows that such a modified system is scalable. Building on Iowa State’s results with additional analysis of soil erosion outcomes and economic impacts, her report, Rotating Crops, Turning Profits: How Diversified Farming Systems Can Help Farmers While Protecting Soil and Preventing Pollution, found that these innovative rotations, paired with no-till practices to keep soil in place, could be implemented on millions of acres in Iowa today and expanded to tens of millions more over time. Specifically, she found that:

  • Diverse crop rotations could be adopted over time on 20 to 40 percent of Iowa’s farmland—5 million to 11 million acres—without changes in crop prices driving farmers back to predominantly corn-soy.
  • Soil erosion would be reduced by 88 percent compared with tilled corn-soy, to a sustainable level given natural soil replacement rates.
  • Taxpayers would achieve total annual savings of $124 million to $272 million from reduced surface water cleanup costs and net reductions in heat-trapping gases valued at $111 million to $233 million annually, for a total of $235 million to $505 million in environmental benefits every year.
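The report’s bottom-line benefit figure is just the two savings streams summed, range end to range end. A quick sketch of that arithmetic (the dollar figures are from the report; the variable names are mine):

```python
# Annual environmental benefits of diverse rotations in Iowa,
# per the UCS report (all figures in millions of dollars per year).
water_cleanup = (124, 272)   # reduced surface water cleanup costs
climate = (111, 233)         # value of net reductions in heat-trapping gases

total_low = water_cleanup[0] + climate[0]
total_high = water_cleanup[1] + climate[1]
print(f"${total_low} million to ${total_high} million")  # $235 million to $505 million
```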

Although we focused our analysis on Iowa, the results can be generalized throughout the Corn Belt.

So why aren’t Iowa farmers sowing oats?

A few years ago, production of oats in the United States fell to its lowest level since the Civil War. Partly, of course, that’s because most people no longer get around by oat-eating horse. But ever since the 1940s, oat production in Iowa has fallen steadily, as this handy graph shows (hat tip to my colleague Andrea Basche, who created it):

The change in crops planted across the state of Iowa from 1940-2012. Closed symbols represent summer annual crops while open symbols represent perennial crops or crops that grow over winter. Alfalfa, barley, hay and oats represented 45 percent of harvested acreage in 1940 and 7 percent in 2012. Source: USDA-NASS.

Struggling farmers need to diversify, and they need help

There’s no agronomic reason Iowa farmers can’t grow crops other than corn and soybeans; they just mostly don’t anymore. Maybe specialization seemed like a good idea at the time, but now farmers in Iowa and other parts of the Midwest are trapped in an endless cycle of corn and soybeans. And it can’t continue. As any financial advisor will tell you, having just a few stocks (or in this case, just a few crops) in your portfolio puts you at increased risk from price swings. And so it is with many farmers, who now rely, to a risky and ultimately unsustainable degree, on corn and soybeans.

This guy likes oats, but pigs would eat them too!

Our friends at the Practical Farmers of Iowa (PFI) are trying to turn that around. For the last few years, PFI’s Sarah Carlson and her colleagues have been working with a small group of pioneering farmers on diversifying crop rotations, including an oat pilot project. They’ve even created a YouTube video series called Rotationally Raised and a dedicated oat-growing tips video to share their experience with other farmers who might want to give diverse rotations a try.

Carlson says that many of the farmers she talks to would like to try adding oats and other crops into their mix, but they need to know they’ll be able to sell them. That’s why PFI is also talking with companies who buy a lot of oats (think cereal makers) about committing to buy Iowa oats in the future. The state’s pork producers could also be encouraged to feed oats to their pigs as a substitute for some of the corn they now buy.

Mulik believes that markets for new crops will expand once there are more oats out there looking for buyers, at lower prices than corn. To paraphrase a famous line from a movie set in Iowa, “If you grow it, they will come.”

Tell Secretary Sonny: Diversify US agriculture!

But we don’t have to wait for markets to catch up. Many farmers who might adopt a modified rotation system right now face challenges including financial and technical barriers as well as crop insurance and credit constraints. New and expanded federal farm policies are needed to help farmers overcome those barriers and reap the benefits of these systems. Our report recommends some specific policy changes Congress should take up as it reauthorizes the federal farm bill over the next year, and others the USDA could implement in the near term.

And this brings me to the Trump administration’s newly-confirmed Secretary of Agriculture, Sonny Perdue. Perdue has hit the ground running, meeting with farmers at an Iowa town hall and flying over flooded farmland in Arkansas last week, while using his folksy new Twitter handle, @SecretarySonny, to assure farmers that the USDA has their back.

One way the USDA could support farmers in the Midwest and across the country is by supporting smart farming systems—like diverse crop rotations—that offer proven benefits to farmers and the rest of us. Sign our petition today urging him to prioritize healthy farm and food systems.



US-China Relations Set Up to Fail

UCS Blog - All Things Nuclear (text only) -

US Secretary of State Rex Tillerson discusses US-China relations at a US Department of State assembly on 3 May 2017.

In June 1950, US President Harry Truman let North Korea set the course of US-China relations. Sixty-seven years later, with the Korean War still unresolved, President Trump is poised to make the same mistake.

The Road Not Taken

Just after North Korean forces invaded the south, Truman decided to protect the losing side in a Chinese civil war he believed was all but over. The defeated forces of the Republic of China (ROC) had abandoned their capital in Nanjing and fled to the island of Taiwan. The armies of the People’s Republic of China (PRC) were massing for an attack. Truman unequivocally rejected pleas for help from ROC President Chiang Kai-shek and his supporters in the US Congress. But the North Korean invasion changed his mind, and with it the course of modern Asian history.

Had Truman not linked the conflict in Korea to the Chinese civil war by placing the US 7th Fleet between the Chinese mainland and Taiwan, the Chinese leadership might have felt less threatened by the US military intervention in Korea. Chinese forces might have continued to prepare to cross the Taiwan Strait instead of crossing the Yalu River. General MacArthur might have defeated the north and unified the country. With no rival government in Taipei to occupy its seat at the United Nations, China might have been less inclined to lean towards Moscow or to develop nuclear weapons. And the rapprochement that began with President Nixon’s visit to China in 1972 might have started decades earlier.

It is impossible to know what might have been. But it may be useful to imagine how events in Asia might have unfolded if Harry Truman had not let Kim Il Sung alter US policy on China. It’s a thought experiment that could be especially helpful to the Trump administration, which is planning to hand Kim’s grandson another North Korean veto over improved US-China relations.

Trump’s China Policy Review

U.S. Secretary of State Rex Tillerson recently told a departmental assembly that the new administration was “immediately confronted with a serious situation in North Korea.” A review of US policy, completed with the assistance and support of Secretary of Defense James Mattis, determined the United States should “lean hard” on China’s leaders and “test their willingness to use their influence” to resolve the situation. More importantly, Tillerson and Mattis concluded this was “a good place to start our engagement with China.”

But making the North Korean nuclear weapons program a test case for US engagement with China is unlikely to end well. The US and Chinese governments both want a denuclearized Korean peninsula, but they have irreconcilable differences on how to achieve it. Dialing up the pressure to see if China’s leaders will yield is more likely to diminish the already low level of strategic trust between Washington and Beijing.

Divergent Views of the Korean Woods

The United States wants China to strangle the North Korean economy. China’s leaders don’t believe that will stop North Korea’s nuclear weapons program. It’s a conviction born from personal experience. The United States employed the same strategy against China in the 1950s and it didn’t work. Isolation and intimidation only strengthened China’s resolve to develop nuclear weapons. Chinese officials believe the Koreans will respond the same way.

Instead of creating an economic crisis, which heightens tensions, encourages risk-taking and could lead to war, Chinese leaders believe that North Korean economic stability is more likely to contribute to a peaceful resolution of the situation. They’ve agreed to UN sanctions targeting imports directly related to the development of nuclear and missile technology. But at the same time China has increased bilateral trade and economic aid. China’s leaders may be willing to impose short-term economic costs to signal displeasure, but imposing long-term restrictions designed to cripple North Korea’s economy would be a dramatic departure from current Chinese policy.

A Road to Failure

The new US Secretary of State and his counterpart at the Pentagon do not seem to recognize that there is a principled disagreement between China and the United States on North Korean policy. Tillerson and Mattis appear to interpret Chinese choices that are at odds with their own as evidence of incapacity, unwillingness or bad faith. So they’ve decided to “lean hard into China” to try to push its leaders to adopt and implement US policy preferences instead of their own.

Given the stakes for China, its leaders are likely to keep their own counsel. There is little reason for the Chinese to believe that President Trump and his advisors understand the North Koreans better than they do.

Tillerson told the assembly at State he hopes to set up the next half century of US-China relations. Tying the long-term future of the US-China relationship to a dramatic shift in Chinese policy on North Korea is a prescription for disappointment. Attempting to effect that shift through intimidation and brinkmanship is almost certain to fail.

Truman’s decision to link North Korean behavior to the US relationship with China created decades of misunderstanding and mistrust. Though times have changed and the issues are different, the risks of giving North Korea undue influence over the long-term future of US-China relations remain, and deserve more careful consideration.


Advisory Committee Shakeup Targets Independent Science and Scientists

UCS Blog - The Equation (text only) -

Right now, the Trump administration is taking a backdoor approach to putting politics over science. There is a full-on assault afoot to strip away the independence of advisory committees at several government agencies. The reason? A renewed interest in shaping policies to fit particular political positions rather than having a basis in strong science.

On Friday, Scott Pruitt’s EPA failed to renew nine members on the Board of Scientific Counselors (BOSC), the advisory committee that reviews the work of scientists within the EPA’s Office of Research and Development (ORD) on everything from chemical safety to air pollution to fracking. Despite being told that their positions were being renewed, an EPA spokesman has confirmed that the academics may instead be replaced with industry experts who better “understand the impact of regulations on the regulated community.”

Typically BOSC members serve two three-year terms. Four members had finished their second term and were set to leave the committee, but half of the 18 BOSC members were set to begin their second term this month; instead the agency chose not to renew their terms. One of the first non-renewed members to go public, Robert Richardson, told the Washington Post, “I’ve never heard of any circumstance where someone didn’t serve two consecutive terms,” adding that the dismissals gave him “great concern that objective science is being marginalized in this administration.”

Promoting conflicts of interest?

Science advice to agencies should be independent. This helps federal agencies make science-based decisions that keep us safe and healthy. But this advice from scientists, which is based on objective reviews of the best available science, often doesn’t provide corporations with their ideal policy prescriptions.

Since this administration has illustrated time and time again its willingness to do industry’s bidding, political appointees at agencies are delaying and disrupting the way that these committees are supposed to work: independently and transparently with the public’s best interests at the heart of all evaluations.

To be clear, committee members are selected based on scientific merit, not political positions. According to a 2013 solicitation for committee members, the way that nominees are evaluated is based on the following:

(a) Scientific and/or technical expertise, knowledge, and experience;

(b) availability to serve and willingness to commit time to the committee (approximately one to three meetings per year including both face-to-face meetings and teleconferences);

(c) absence of financial conflicts of interest;

(d) absence of an appearance of a lack of impartiality;

(e) skills working on committees and advisory panels; and

(f) background and experiences that would contribute to the diversity of viewpoints on the committee/subcommittees, e.g., workforce sector; geographical location; social, cultural, and educational backgrounds; and professional affiliations.

Note that members of the committee must have an “absence” of conflicts of interest, not just a minimization of them. This language will be important to watch as the EPA puts out a call for nominations in the coming months to fill the 13 slots now sitting empty on the committee. Independent science advice must be free from undue political or financial pressure. While the advice coming from scientific advisory committees is not the only consideration for policy makers, the relied-upon science must be objective and independent in order to advance the government policies that best protect our health and safety.

EPA spokesperson J.P. Freire told the Washington Post that, “the agency may consider industry experts to serve on the board as long as these appointments do not pose a conflict of interest.”

I know a great way to avoid industry conflicts of interest: Hire independent academic scientists instead. Or maybe just keep the hardworking, well-respected scientists that were already planning on filling those positions for another three years.

Perhaps there’s another reason that Administrator Pruitt is looking for fresh committee members. According to the Washington Post, “Pruitt has been meeting with academics to talk about the matter and putting thought into which areas of investigation warrant attention from the agency’s scientific advisers.” Essentially, the EPA is taking a hard look at where industry experts could be a good fit.

The head of the BOSC’s Air, Climate, and Energy subcommittee, Viney Aneja, is one of the committee members who has not been renewed. What does that mean for this committee whose focus it is to review the work of the EPA ORD’s Air, Climate, and Energy research program? This is likely an area of special concern to Administrator Scott Pruitt, given that he does not acknowledge that climate change is due to human-generated emissions and therefore will be looking for ways to interfere with research efforts that support policies that take action on climate change. This same kind of influence would be troubling for the EPA ORD’s five other research programs: chemical safety for sustainability, human health risk assessment, homeland security research program, safe and sustainable water resources, and sustainable and healthy communities.

EPA spokesperson J.P. Freire also said that “We’re not going to rubber-stamp the last administration’s appointees. Instead, they should participate in the same open competitive process as the rest of the applicant pool.” He continued, “This approach is what was always intended for the board, and we’re making a clean break with the last administration’s approach.” But if this was always the intent for the board, why is it only being instituted now, more than 20 years after the committee was established? Why, if these scientists still have expertise that fits the needs of the committee, would they need to make this change to the process?

An assault on independent science advice

This current assault on the way that our government seeks science advice from outside experts is not unique to the EPA. The Interior Department is now “reviewing the charter and charge” of more than 200 advisory committees according to agency officials. Administrator Ryan Zinke has postponed the work of these committees in the meantime, including the Bureau of Land Management’s 30 resource advisory committees, the Advisory Committee on Climate Change and Natural Resource Science, and issue-specific panels that take on issues like invasive species and wildlife trafficking.

Earlier in the year, members of Congress introduced the EPA Science Advisory Board (SAB) Reform Act, which seeks to change the requirements for the EPA SAB so as to give industry greater influence while adding extra burdens that make it harder for the committee to meet its charge of providing science advice. I wrote about the potential harms of this bill here.

Following the playbook on political interference in science

This is not the first time that scientific advisory panels have been targeted for political interference in the United States. During the G.W. Bush administration, agency officials subjected nominees to political litmus tests that had no bearing on their expertise and appointed members to committees with serious conflicts of interest. For example, 19 of 26 candidates for an advisory board at the National Institutes of Health’s Fogarty International Center were rejected, three of them because of their views on abortion or their public criticism of the president. Three qualified experts were dismissed from a peer review panel at the US Department of Health and Human Services (HHS) for supporting a health standard opposed by the administration. In 2002, HHS placed several individuals with known ties to the paint industry on a lead-poisoning advisory panel, while rejecting highly qualified candidates nominated by HHS scientists. Ultimately the panel did not support lowering the lead-poisoning threshold, despite strong scientific evidence that even very low lead levels harm children.

Some panels have been disbanded altogether because members’ research findings were inconvenient for the administration. In 2003, for example, White House officials abolished a highly distinguished expert committee that advised the National Nuclear Security Administration because its members had published papers on the ineffectiveness of “bunker buster” nuclear weapons, which the administration planned to develop. This type of behavior is antithetical to the way in which science should be used in policy making.

Undermining science, aiding industry

Representative Lamar Smith said of the EPA’s Science Advisory Board (SAB) earlier in the year that, “The EPA routinely stacks this board with friendly scientists who receive millions of dollars in grants from the federal government…The conflict of interest here is clear.” Smith and several other Republicans misunderstand what a conflict of interest is and what independent science advice should be.

A scientific advisory committee should be composed of scientists who are experts in their fields and who are qualified to evaluate scientific research to help agencies meet their missions to protect public health and the environment. Period. One thing advisory committees are not, and should never be, is a venue for industry to insert junk science and spread misinformation about science that protects our public health and safety.

This isn’t 1984. You can’t just throw scientific conclusions into the metaphorical memory hole. Likewise, you can’t just discard independent scientists and halt the work of important scientific bodies under the radar without public backlash. We are watching this administration and will stand up for science whenever we see that it is being sidelined in the name of political or industry interests. You can follow along as we document attacks on science by the Trump administration and Congress and look for ways to push back.

Exelon Generation Company (a.k.a. Nuclear Whiners)

UCS Blog - All Things Nuclear (text only) -

The Unit 3 reactor at the Dresden Nuclear Power Station near Morris, Illinois is a boiling water reactor with a Mark I containment design that began operating in 1971. On June 27, 2016, operators manually started the high pressure coolant injection (HPCI) system for a test run required every quarter by the reactor’s operating license. Soon after starting HPCI, alarms sounded in the main control room. The operators shut down the HPCI system and dispatched equipment operators to the HPCI room in the reactor building to investigate the problem.

The equipment operators opened the HPCI room door and saw flames around the HPCI system’s auxiliary oil pump motor and the room filling with smoke. They reported the fire to the control room operators and used a portable extinguisher to put out the fire within three minutes.

Fig. 1 (Source: NRC)

What Broke?

The HPCI system is part of the emergency core cooling systems (ECCS) on boiling water reactors like Dresden Unit 3. The HPCI system is normally in standby mode when the reactor is operating. Its primary purpose is to provide makeup water to the reactor vessel in the event that a small-diameter pipe connected to the vessel breaks. The rupture of a small-diameter pipe allows cooling water to escape, but keeps the pressure within the reactor vessel too high for the low-pressure ECCS pumps to deliver makeup flow. The HPCI system takes steam produced by the reactor core’s decay heat to spin a turbine connected to a pump. The steam-driven pump transfers water from a large storage tank outside the reactor building into the reactor vessel. The HPCI system can also be used during transients without broken pipes: operators can run it to help control the pressure inside the reactor vessel by drawing off the steam being produced by decay heat.

The HPCI auxiliary oil pump is powered by an electric motor. The auxiliary oil pump runs to provide lubricating oil to the HPCI system as the system starts and begins operating. Once the HPCI system is up and running, the auxiliary oil pump is no longer needed. At other boiling water reactors, the auxiliary oil pump is automatically turned off once the HPCI system is up and running—at Dresden, the auxiliary oil pump continues running.

Why the Failure was Reported

On August 25, 2016, Exelon Generation Company (hereafter Exelon) reported the HPCI system problem to the Nuclear Regulatory Commission (NRC). Exelon reported the problem “under 10 CFR 50.73(a)(2)(v)(D), ‘Any event or condition that could have prevented the fulfillment of the safety function of structures or systems that are needed to mitigate the consequences of an accident.’”

Why It Broke

Exelon additionally informed the NRC that the HPCI system auxiliary oil pump motor caught fire due to “inadequate control of critical parameters when installing a DC shunt wound motor.” The HPCI system auxiliary oil pump motor had failed in March 2015 during planned maintenance. The failure in 2015 was attributed by Exelon to “inadequate cleaning and inspection of the motor” which allowed carbon dust to accumulate inside the motor.

How the NRC Assessed the Failure

The NRC issued an inspection report on December 5, 2016, with a preliminary white finding for the HPCI system problem. The NRC determined that the repair of the HPCI system auxiliary oil pump motor following its failure in March 2015 resulted in the motor receiving higher electrical current than needed for the motor to run. Consequently, when the HPCI system was tested in June 2016, the high electrical current flowing to the auxiliary oil pump motor caused its windings to overheat and catch fire. The NRC determined that the inadequate repair in March 2015 caused the failure in June 2016. The NRC proposed a white finding in its green, white, yellow, and red string of increasingly significant findings and gave Exelon ten days to contest that classification.

During a telephone call between the NRC staff and Exelon representatives on December 15, 2016, Exelon “did not contest the characterization of the risk significance of this finding” and declined the option “to discuss this issue in a Regulatory Conference or to provide a written response.” With the proposed white finding seemingly uncontested, the NRC issued the final white finding on February 27, 2017.

Why the NRC Reassessed the Failure

It took the NRC over two months to finalize an uncontested preliminary finding because Exelon essentially contested the preliminary finding, but not in the manner used by the rest of the industry and consistent with the NRC’s longstanding procedures over the 17 years that the agency’s Reactor Oversight Process has been in place.

Instead, Exelon mailed a letter dated January 12, 2017, to the NRC requesting that the agency improve the computer models it uses to determine the significance of events. Exelon whined that the NRC’s computer model over-estimated the real risk because it considered only the failure of a standby component to start and the failure of a running component to keep running. Exelon pointed out that the auxiliary oil pump did permit the HPCI system to successfully start during the June 2016 test run and that its later catching fire did not disable the HPCI system. Exelon whined that the NRC’s modeling was “analogous to the situation where the starter motor of a car breaks down after the car is running and then concluding that ‘the car won’t run’ even though it is already running.”

The NRC carefully considered each of Exelon’s whines in its January 12 letter and still concluded that the failure warranted a white finding. So, the agency issued a white finding. With respect to Exelon’s whine that the auxiliary oil pump burned up after the HPCI system was up and running, the NRC reminded the company that the operators shut down the HPCI system in response to the alarms—had it been necessary to restart the HPCI system, the toasted auxiliary oil pump would have prevented it. It is not uncommon for the HPCI system to be automatically shut down (e.g., due to high water level in the reactor vessel) or to be manually shut down (e.g., due to operators restoring the vessel water level to within the prescribed band or responding to a fire alarm in the HPCI room) only to be restarted later during the transient. The NRC’s review determined that its computer model’s treatment of a “failure to restart” would yield results very similar to its treatment of a “failure to start.”

The auxiliary oil pump’s impairment reduced the HPCI system to one and done use. In an actual emergency, one and done might not have cut it—thus, NRC issued the white finding for Exelon’s poor performance that let the auxiliary oil pump literally go up in smoke.

The NRC conducted a public meeting on May 2, 2017, in response to Exelon’s letter. I called into the meeting to see if Exelon’s whines are as shallow and ill-conceived as they appear in print. I admit to being surprised—their whining came across even shallower live than in writing. And I would have bet it impossible after reading, and re-reading, their whiny letter.

What’s With the Whining?

Does Exelon hire whiners, or does the company train people to become whiners?

It’s a moot point because Exelon should stop whining and start performing.

Exelon whined that the NRC failed to recognize or appreciate that the auxiliary oil pump is only needed during startup of the HPCI system. During the June 2016 test run, the HPCI system successfully started and achieved steady-state running before the auxiliary oil pump caught fire. Workers put out the fire before it disabled the HPCI pump. But the NRC’s justification for the final white characterization of the “uncontested” finding explained why those considerations did not change their conclusion. While the auxiliary oil pump did not catch fire until after the HPCI system was successfully started during the June 2016 test run, its becoming toast would have prevented a second start.

Exelon expended considerable effort contesting and re-contesting the "uncontested" white finding. Had Exelon expended a fraction of that effort properly cleaning and inspecting the auxiliary oil pump motor, the motor would not have failed in March 2015. Had Exelon expended a fraction of that effort properly setting control parameters when the failed motor was replaced in March 2015, it would not have caught on fire in June 2016. If the motor had not caught on fire in June 2016, the NRC would not even have reached for its box of crayons in December 2016. If the NRC had not reached for its box of crayons, Exelon would not have been whining in January and May 2017 that the green crayon instead of the white one should have been picked.

So, Exelon would be better off if it stopped whining and started performing. And the people living around Exelon’s nuclear plants would be better off, too.

Kids, Scientific Integrity, and What We Can Learn From the Local Science Fair

UCS Blog - The Equation (text only) -

My kids’ school recently had its annual science fair, and what a thing of beauty it was. From catapults to solar stills to randomized trials about yawning, the projects of those elementary and middle school kids remind us what science is all about: Hypothesis, methodology, data, and conclusions. (And maybe the occasional blue marble.)

If only all of our elected leaders got that.

Kids and blue marbles

For the school science fair, my sons chose projects near and dear to my heart. One built a model house to test the efficacy of going from single-pane windows to double-, triple-, or quadruple-pane. Heat up the house with a light bulb (those old incandescents come in handy after all), then see how quickly the temperature drops off under different situations.

Son #2 explains the finer points of scientific discovery. He does do windows. (Credit: J. Rogers)

My other son tackled desktop carbon capture (‘cause who doesn’t want one of those on their desk?). A pump, a fan, a little sodium hydroxide, and a planet with way too much carbon dioxide in the atmosphere, and voilà: Calcium carbonate, ripe for re-use.

Just to be clear: My kids didn’t pick those projects because I pushed them. I actually tried to talk my older lad out of the CO2 capture effort, and toward something that seemed more doable. But science has a way of grabbing hold of you.

And they both managed beautifully. They each duly came up with a plan, carried it out, checked to see how reality matched up, and documented it all.

Son #1 stands behind his science. All the way. (Credit: J. Rogers)

And there was a lot of that hypothesize-test-assess visible around the school during the science fair. My niece looked at which household cleaners work best. Other projects tested memory, stress balls, or gymnastics chalk. Science traveled on the water (boat design) and took flight (testing parachute shapes or basketball shots). And Science made its way into cyberspace: One student conducted trials to see which house-building materials best resisted exploding “creepers” in Minecraft.

Even established, centuries-old science fell under the scrutiny of the young scientists. My nephew put Sir Isaac Newton himself to the test, with a pumped-air water rocket set-up.

My nephew and the scientific method: It’s not rocket science. Er, unless it is. (Credit: J. Rogers)

And then there were those really early-career types leaping into science. One first-grader assessed the influence of marble color on spoon catapulting (whereby a spoon is put on the edge of a table, loaded with a marble, and launched in a parabolic arc with an abrupt downward hit to the handle). Her hypothesis was that the blue marble would travel farther than the white one “because blue is my favorite color.” When the data bore that out, her conclusion was that blue went farther “because I wanted it to.”

She may not have all the particulars quite worked out, but her love of science shone through anyway, even at her young age. And her honesty was refreshing.

Next step: calcium carbonate. Fuel the fun,… then clean it up afterward. (Credit: J. Rogers)

As for the grown-ups…

That kind of data manipulation is cute in a first grader trying out formal science for the first time. It’s a lot less so in politicians or other decision makers looking to promote a patently anti-science agenda, or to throw science overboard in favor of a political agenda or short-sighted profits.

As a country, we’ve got work to do to make sure that science retains its proper and necessary role in progress and decision making. Getting science right matters, for all of us, down to the smallest newborn.

Alas, lately the work of the Union of Concerned Scientists on scientific integrity has seemed to be needed more than ever. From bad science on efficient cars, to bad guidance on good nutrition, to decidedly unscientific approaches to regulation, and more, this war on facts is enough to drive scientists out of the lab and onto the streets.

That’s why so many of us—scientists and non-scientists—were marching for science a couple of weekends ago, and for climate science and action this past Saturday.

Science rules

So we stand up for science, in all things.

Feet ready to stand/walk/run/leap for science,… Whatever it takes. (Credit: J. Rogers)

And if you need inspiration, just look to the next generation. Borrow not just their creativity and enthusiasm, but also their respect for the process of scientific discovery and attention to data.

On blue marbles, the Big Blue Marble, and everything in-between and beyond.

As for my sons: They were both among the kids/projects picked to go on to represent the school at the regional science fair, and then potentially to go on to states. Whatever happens, though, they’re already winners, when they follow the scientific method down a path to discovery and knowledge. Really.

Now to get all the grown-ups on board, too.

What’s Next After the Peoples Climate March? Riding the Momentum and Bringing It Home

UCS Blog - The Equation (text only) -

Were you at the 2017 Peoples Climate March? Or one of its sister marches? If so—or if you just caught the story on the news—you know that something big just happened.

In 90+ degree heat, just a week after an overlapping science march, an estimated 200,000 people turned out in Washington, DC to show their anger and resolve for US climate action. Tens of thousands more took part at an additional 370 marches across the country.

So now what?

The climate justice bloc of the People's Climate March gets ready to march. Credit: UCS/Audrey Eyring

Marchers will long remember the heat that day. It forced many people to the sidelines for shade, rest, and hydration. I rested there about halfway through with my group of 20 friends and family, while my 72-year-old dad tried to recover from heat exhaustion and my kids blew bubbles over the streaming crowd. He had taken the sleepless overnight bus from Maine so he could travel with his activist friends. When my aunt tried to get him to leave he asked to sit a while and "watch his people" march by. In all our glory.

For people like him who have been at this a long time, and for those new to the fight who need a reason to stay in it, the march is a chance to see a whole world of people of common cause, be reminded of our strength, and return to our work with new resolve. In this political moment, April 29, 2017 was a unique gift.

These two boys traveled from Massachusetts for the march. My son on the left. His friend’s sign reads: Climate Warrior in the Making. Photo: Erika Spanger-Siegfried

By all accounts, the march was a success. But now we’re back home, rested, and wondering: what now?

There are paths from the march to the kind of climate wins we so badly need—and UCS is working hard to identify and prioritize which high-impact actions should come next. But there’s one tactic that all of us should employ as we return to our daily lives in the wake of this historic march: We need to bring it home. The most egregious actions may be coming from DC, but many of the fights to block them must play out locally.

We need to take every opportunity to be heard, to say loud and clear that undercutting climate science and sidelining climate solutions is bad for our country, our economy, our community, and our families.

We need to speak up in our local papers, at public events, with opinion leaders and influential communities, with our neighbors, and with our elected officials.

We need to remind everybody that America runs on science, that America depends on climate solutions, and that we will keep fighting until we get there.

Momentum and resolve: It’s not the moment, it’s the movement.

Like so many other groups, UCS was out in force on Saturday (we were the ones rolling the giant chalkboard). And like so many others, we were there not only because of our history of work on climate change, but also because in this moment, with climate science and policy under assault, the climate movement cannot afford to be back on its heels; we need to regain our momentum and show our resolve.

The climate fortunes have swung far, too far, and in a terrible direction. Now it’s time to come roaring back.

The crowds that mustered for this march showed that the reversal is under way and gaining momentum.

The "Peoples March for Climate, Jobs, and Justice" succeeded in bringing together one of the broadest coalitions in recent memory: scientists, environmental justice advocates, indigenous groups, youth leaders, faith groups, policy advocates, labor groups, business leaders, green groups, human rights groups, politicians, and more… and at the fore of it all, everyday folks from communities on the front lines of climate change. Check out these beautiful photos. It's a big tent, with a deliberately broad agenda: in the motto "We Resist, We Build, We Rise," there's something for everyone to rally behind.

And as we find our place in this movement, we—as individuals and organizations—need to clarify our role, take it up, and live into it with resolve.

One role in the movement: Standing up for science

For many of us, that role is standing up for science.

There are some key tactics that the current administration is deploying as part of its strategy to dismantle existing climate policies and advance science-free policy making, to the non-coincidental benefit of fossil fuel interests. These include anti-science rhetoric, staffing federal agencies with anti-science administrators, denying or misleading on science, and pursuing deep budget cuts to vital science-based functions of the federal government.

One role is to stop them wherever we can.

UCS—including our science network, members, and half a million activists—is keeping a close eye as a watchdog for anti-science policy actions and budget cuts at the federal level, and activating measures to push back on assaults by Congress and the administration. Even with federal leadership as egregiously anti-science as this, this is something we and this movement know how to do.

But none of us came here to play defense for four years. And as our movement gains momentum, opportunities will open up for us to combine our defensive stance with an offensive one.

This march was planned many months ago to mark the 100th day in office of the new president. Many of us anticipated pushing for stronger climate policies under a climate-friendly president. No one imagined 100 days quite like these, but here we are. And right now the climate movement—and the larger “resistance” it is part of—have incredible energy and, let’s face it, anger to channel. So let us start taking the fight to them.


We resist. We build. We rise.

This is what it looks like when 200,000 people march together for climate, jobs, and justice. It is absolutely beautiful. Together, we will chart a different path forward: away from Trump’s agenda of a cruel, polluted, and divided country, and towards a clean energy economy that works for everyone.

Posted by People's Climate Movement on Sunday, April 30, 2017

Going on offense: Showing up, speaking out, making it hurt

After months spent largely on defense, strategies are gelling to channel the national climate movement into dispersed, localized climate action. Many of the attacks on science originate from DC, but most must be fought locally. So here are some thoughts:

  • Going after attacks on science – Denying climate science and denigrating climate scientists is a tactic that has had too many users and too few costs for those who use it. It’s time for constituents to raise the political costs for elected leaders, and make it hurt, politically, when they deploy anti-science rhetoric or actions.
  • Fighting silence (and complicity) – Science, facts, and reason form the very foundation of a strong democracy—indeed, of America itself. They protect our health. They keep our communities, families, and children safe. They safeguard our future. And right now, too many elected leaders are silent as science, facts, and reason are dragged through the mud on their watch. It’s time to start calling out the spinelessness of these elected officials and hold them accountable when they let such threats to our democracy go unchallenged. Looking back, no one will parse the difference between silence and complicity. History has its eyes on them—and we’ve got their offices on speed dial.
  • Standing with each other – To “Resist, Build and Rise” we need to strengthen our movement even as it grows. A colleague’s favorite quote from the PCM organizing experience was “If you’re not arguing, your coalition isn’t big enough.” Creating space to grow—for groups to see themselves in the movement, step in, find a voice and a role—is vital. For us to hold that growing movement together, though, and see it grow stronger, we need to be good allies. Individuals or groups need to find and take opportunities to stand with each other and elevate those voices that too often get drowned out—and we need to do so not just on march day.

Resistance groups like Indivisible and the Town Hall Project have captured specific tactics for going on offense locally—strategies and tactics that are relevant for and essential to the climate fight. We in the climate movement can label the nesting dolls however we like, depending on our priority issues; the point is the fights are often one and the same, and so many of the tactics can be as well. Here are some:

  • Contact your legislators and keep on contacting. Phone calls work. If you can’t get through to their Washington, DC, office, try your local office. It’s also effective to send postcards.
  • Plan to visit your legislators during a congressional recess. You can find out when your legislators are home for work periods by checking here for the Senate and here for the House.
  • When planning your visit, think about people you know who could be powerful persuaders, especially friends who are from another party, who are medical or public health professionals, or who represent business, faith, or veterans groups.
  • Watch your local news and read your local paper. Respond quickly to stories, articles, or opinion pieces that cast doubt on climate science or downplay the need for climate action. Tips for responding to the media can be found here.
  • Show up at town halls held in your district or in neighboring cities and be heard. You can find information on upcoming town halls here. And remember, speak the truth, even if your voice shakes.
  • You can of course join a UCS mailing list, watch out for action alerts from us, and make time to take the action suggested. Whenever possible, go beyond the one-click action: Personalize your email to your legislator or your comment to a federal agency. Or show up at a rally outside your legislator’s office.
  • And stay in touch with the Peoples Climate Movement and with UCS as we roll out tactics and tools for getting this job done.

Until next time…

During the march, my seven-year-old struggled with the heat, especially during the long stretch before the crowds began moving. He was sporting a T-shirt hand-painted for the march. (Al Gore, god love him, had been kind enough to praise his choice of text, "Small But Mighty," and ignore the giant chocolate ice cream stain running neck to waist.) He had been chanting to himself in the bathroom "when I say climate, you say justice!" He had made his own sign and used at least 300 pieces of tape to secure it to the pole. He was ready!

Only now he was done. Spent. Ready to go home.

Sound familiar?

We took turns blowing up an oversized beach ball and when it was fully inflated, he launched it into the sea of people. It was volleyed around our area for a while, but as warned, soon bounced out of sight, the sound of delighted marchers wafting back to us. Mid-way through the march, I hoisted him up to see it again, far in the distance, this time being sent high into the air by many hands on a painted parachute. He glowed. Just like I did, two-stepping with strangers to a brass band. Just like my dad did watching us all stream by. Maybe just like you did at some point on April 29.

And with that, one 7-year-old was refreshed and ready for action. Strangers celebrating the simplest joys, even while engaged in one of the strongest acts of protest: it's a gift we give each other that renews and stays with us.

Thank you, people of the Peoples Climate March. Until next time.

The Guardians of the Future, the group my family marched in, make their way through the heat. Photo: UCS/Audrey Eyring


New Mexico’s Largest Electricity Provider Proposes Going 100% Coal-Free

UCS Blog - The Equation (text only) -

Late last month, the Public Service Company of New Mexico (PNM) issued a landmark finding. After conducting a routine assessment of future power supply scenarios, the utility made an anything-but-routine conclusion: the best version of its future self, PNM declared, was entirely coal-free.

For a utility that’s been powering its customers with coal for half a century—and which has aggressively defended its need for coal to regulators as recently as a year ago—the finding marks a major step in the right direction, and should be celebrated for the gains it promises New Mexicans in the areas of public health, affordability, and the environment.

Still, there’s more work to be done. PNM’s vision of the future leaves viable renewable resources and energy efficiency opportunities on the table, and leans too heavily on natural gas—a path that puts customers at risk of bearing the high costs of overreliance.

PNM must prioritize resources that will best serve the people of New Mexico now and in the future, which means winding down coal alongside natural gas, all while doubling down on its commitment to energy efficiency and renewables.

What is an "Integrated Resource Plan"?

PNM reported this coal-free finding in its recently released draft Integrated Resource Plan, or IRP. The utility is required to issue an IRP every three years, in which it examines a wide range of future scenarios and determines the power supply portfolio that most cost-effectively meets its expected needs over the next 20 years. In other words, PNM takes a look at the future and checks to see if it's on track.

The utility’s findings are informed by stakeholder engagement processes, but they receive final buy-in from the Public Regulation Commission of New Mexico (PRC). If the PRC approves the proposed IRP, then PNM moves forward in accordance with the approved plan, including a required four-year action plan intended to steer near-term decision-making in line with the longer-term views.

IRPs are required in many states around the country. When done well, they provide a valuable opportunity for cooperative engagement on long term planning, allowing the utility, stakeholders, and regulators to all take part in establishing a shared vision. When done poorly, IRPs provide cover for utilities to use inputs and assumptions that support approaches that tend to best serve the utility, not the customers. The risk of manipulation of an IRP makes stakeholder engagement, utility oversight, and modeling transparency absolutely critical components of a robust process.
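
The selection step at the heart of an IRP can be sketched in a few lines: score each candidate long-term supply portfolio on expected cost and pick the cheapest. The portfolio names and dollar figures below are hypothetical, purely to illustrate the comparison; a real IRP like PNM's models many scenarios, sensitivities, and risk adjustments before naming a most cost-effective portfolio.

```python
# Toy sketch of the core IRP comparison: among candidate 20-year supply
# portfolios, select the one with the lowest expected cost. All names and
# cost figures here are hypothetical illustrations, not PNM's numbers.

candidate_portfolios = {
    "continue-coal": 41.2,                  # hypothetical 20-year net cost, $B
    "coal-free + gas": 39.8,
    "coal-free + high-renewables": 40.1,
}

# The "most cost-effective portfolio" is simply the minimum-cost candidate.
most_cost_effective = min(candidate_portfolios, key=candidate_portfolios.get)
print(most_cost_effective)  # -> "coal-free + gas" in this made-up example
```

The interesting (and contestable) part of a real IRP is not this final comparison but the inputs behind each cost figure, which is why stakeholder oversight and modeling transparency matter so much.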

The good: PNM gets comfortable with the idea of a portfolio without coal

2017 and 2025 energy shares from PNM’s proposed most cost-effective portfolio (MCEP). Credit: PNM Draft IRP (April 2017).

There’s a lot to like in PNM’s draft IRP. For starters, the utility discusses a future without coal with an ease and confidence not previously seen (and frankly, nearly impossible to imagine even recently). PNM now proposes following the planned retirement of San Juan Generating Station (SJGS) Units 2 and 3 at the end of 2017 with the closure of the remaining Units 1 and 4 in 2022, and the abandonment of their share of capacity at the Four Corners Power Plant in 2031, all following the expiration of current fuel supply contracts.

In return, the utility proposes adding more renewables, continuing to pursue energy efficiency, and adding natural gas peaking units.

In discussing the proposed most cost-effective portfolio (MCEP), PNM highlights the improved flexibility of the system, which the utility states will leave it better prepared to support future uses of the grid. This is in contrast with an alternative portfolio that considered the continued use of coal as a baseload resource. In that case, PNM warns that “as PNM and PNM’s customers add more renewable energy, there will be less need for traditional generation year over year,” thereby progressively lessening the need for large-scale, continuously operating coal plants.

Further, PNM notes that “continuing SJGS also subjects PNM customers to risks of higher costs associated with possible future environmental regulations given the continued reliance on coal-fired generation.” These risks include future climate, air, water, and waste environmental regulations, all of which are reduced in the coal-free scenario as compared with the continued-coal case.

Finally, as part of its four-year action plan, PNM flagged the need to add transmission capacity that would allow the delivery of wind power from the resource-abundant eastern part of the state to the load-heavy central and western parts of the state. Given the low cost and incredible abundance of wind power in eastern New Mexico, it is critical that PNM secure the ability to provide its customers with access to these benefits.

The bad: PNM overestimates natural gas, and underestimates renewables

For all of the promising changes proposed in PNM’s IRP, there’s still a lot of the same. Namely, the utility consistently underestimates the benefits and abilities of renewable resources, and consistently overestimates the benefits and abilities of natural gas.

Wind turbines generate large amounts of clean energy at the New Mexico Wind Energy Center. Credit: Sandia National Laboratories.

In the proposed portfolio, PNM reports that it plans to replace all retired coal generation with a mix of renewables, quick-start natural gas peaking capacity, and (potentially) energy storage. What the utility buries is just how much natural gas it’s proposing to add: over 450 MW.

PNM says the gas plants are required to back-up renewables, but we know the grid can handle far higher levels than what’s being proposed. At the same time, the overwhelming majority of the renewable energy that PNM is proposing to bring online is slated for use by a growing clean-energy-consuming data center. If the state’s tremendous solar power resource is good enough for Facebook, then it should be good enough for all New Mexicans.

Finally, PNM plays it safe when it comes to energy efficiency and demand response, expecting investments in line with mandated requirements, and little more. The utility said that part of its four-year action plan is to more closely consider the opportunities provided by such resources, but without including them here, PNM fails to account for the far cheaper alternative they present as compared with natural gas peaking plants. This must be addressed. Energy efficiency and demand response have repeatedly been shown to provide cost-effective, flexible solutions.

What comes next

UCS will be working hard to ensure that PNM continues to move in the right direction, including by prioritizing the development of resources that will best benefit New Mexicans now and in the future. The utility is taking a significant step toward a clean energy economy with the phasing out of coal resources, but needs to do more to elevate renewables and efficiency while minimizing its growing reliance on natural gas.

As it goes about this significant transition, it’s also critical that PNM deeply engage with impacted communities on how best to support workers and local economies as they adjust to the changes ahead.

To finalize the proposed IRP, PNM will be accepting public comment through May 23, and then incorporating comments and making changes until it submits its final plan to the PRC on July 3. Here are resources to dig in on more details, as well as the place to ask questions and submit comments on PNM’s draft proposal.

From Soup to Nuts: Science-based Recommendations for FDA’s “Healthy” Label

UCS Blog - The Equation (text only) -

If you were hoping to offer your two cents on how the US Food and Drug Administration (FDA) defines "healthy" food, you'll have to keep those pennies in your pocket for now. The public comment period closed on April 26, and the people—more than a thousand in total—have spoken.

The comments represent a diverse range of perspectives. Some are purist, suggesting that all artificial colors or flavors, preservatives, and genetically engineered ingredients be excluded from foods bearing the "healthy" label. Others raise concerns about consumer (mis)interpretations of "healthy" and related terms, and recommend that their use on product labels be disallowed. Still others—primarily those representing various sectors of the food industry—advocate for flexibility in the regulations to accommodate existing products or provide adequate time for product reformulation.

As for UCS?

We pursued a science-based path to food-based criteria, emphasizing the importance of food groups in healthy dietary patterns, while also supporting limits for sugar, sodium, and fat. Some of these were no-brainers, and some were, quite frankly, tough nuts to crack. Here’s where the science steered us.

Food-based criteria are a must

Any food item labeled "healthy" should contain a substantial proportion of one or more health-promoting foods. We chose to define "health-promoting foods" generally as vegetables, fruits, whole grains, dairy (including nutritionally equivalent dairy substitutes), and protein foods. These categories largely reflect the Dietary Guidelines for Americans Key Recommendations for Healthy Eating Patterns. We also identified some specific foods that should be excluded from the healthy label: fruit juice, processed meat, and red meat. Fruit juice has a higher glycemic index than whole fruit, lacks equivalent fiber content, and is associated with a greater risk of developing type 2 diabetes. The World Health Organization's classification of processed meat as carcinogenic to humans, and red meat as probably carcinogenic, provides the basis for their exclusion.

Photo: Lea Aharonovitch/CC BY SA (Flickr)

Establishing what constitutes a “substantial amount or proportion” of a health-promoting food is considerably more difficult. While there is some precedent for evaluating the healthfulness of foods (the Environmental Working Group Food Scores and the United Kingdom Department of Health’s Nutrient Profiling Model are good places to start), we lack substantive research to help us identify an amount that strikes the ideal balance between potential health benefits and practicality. Of course, the more health-promoting foods like fruits, vegetables, and whole grains we eat, the better; but a useful recommendation must also consider the full range of products that line our grocery store shelves and the habits and preferences of the people who buy them. Ultimately, should the FDA decide to adopt this method of classifying foods, it will need research that addresses this question.

A “healthy” food should be low in ingredients and nutrients associated with clear health risks

The inclusion of added-sugar limits in the FDA definition of "healthy" is long overdue. The 2015 Dietary Guidelines for Americans recommend that calories from added sugars contribute no more than 10 percent of total calories, while the American Heart Association limits calories from added sugar to less than seven percent of total calories for moderately active adults, and recommends that children under two avoid added sugar altogether. In keeping with the dietary guidelines, we propose that added sugar contribute no more than 10 percent of calories to a food labeled "healthy," with greater potential health benefits offered by further reductions.
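
To make that 10 percent threshold concrete, here is a minimal sketch of the arithmetic. The example foods and gram values are hypothetical, not drawn from the FDA docket; added-sugar calories are computed at the standard 4 calories per gram of carbohydrate.

```python
# Illustrative check of a "no more than 10% of calories from added sugar"
# criterion. Example foods below are hypothetical.

CALORIES_PER_GRAM_SUGAR = 4  # standard calorie content of carbohydrate

def added_sugar_share(total_calories, added_sugar_grams):
    """Fraction of a food's calories contributed by added sugar."""
    return (added_sugar_grams * CALORIES_PER_GRAM_SUGAR) / total_calories

def meets_sugar_criterion(total_calories, added_sugar_grams, limit=0.10):
    return added_sugar_share(total_calories, added_sugar_grams) <= limit

# A 150-calorie serving with 6 g added sugar: 24 kcal from sugar = 16%
print(meets_sugar_criterion(150, 6))  # -> False, exceeds the 10% threshold
# A 150-calorie serving with 3 g added sugar: 12 kcal from sugar = 8%
print(meets_sugar_criterion(150, 3))  # -> True
```

Note how quickly a seemingly modest amount of added sugar crosses the line in a low-calorie food, which is part of why a percentage-based limit is more informative than a flat gram cap.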

While the current definition of "healthy" identifies sodium limits for foods, research suggests that many Americans need more help staying below recommended daily levels. The current mean sodium intake in the US is 3,440 milligrams per day—nearly 150 percent of the recommended limit of 2,300 mg per day. Research consistently demonstrates a strong relationship between sodium intake and risk of heart disease, which is the leading cause of death in the United States.

Given that processed and commercially prepared foods provide about 75 percent of our total sodium intake, it's important that the FDA take this opportunity to set adequate sodium goals for packaged foods. While the available science does not point to one optimal sodium threshold for food items or prepared meals, we can confidently say this: sodium limits on food items should be reduced to help Americans meet daily sodium goals. And the FDA's forthcoming sodium reduction guidance—currently being drafted to help Americans meet sodium targets within 10 years—should inform these reductions with evidence-based recommendations.

Current science suggests the FDA reconsider limits on total fat

As a nation, we are slowly coming to our senses after a decades-long low-fat frenzy, as science has given us a much greater understanding of the role fat actually plays in chronic disease. We now know that the type of fat we eat may influence our health more than the amount. Research consistently shows that replacing saturated fats with unsaturated fats in the diet leads to lower total cholesterol (particularly the LDL, or “bad” variety) in the blood. The existing FDA definition of “healthy” limits total fat content but doesn’t distinguish between types—meaning heart-healthy foods like almonds don’t make the cut. To address this, we propose that foods with fat content higher than current allowable levels may still bear a “healthy” label if the majority of the fat is poly- or monounsaturated. (The catch? The fats need to come from one of the health-promoting food groups named above. Adding canola oil to cookies doesn’t count.)

Food labels are important, but they aren’t enough

With this rulemaking, the FDA has an opportunity to bring its “healthy” claim into better alignment with the latest scientific findings about health-promoting foods. But there are limitations to what even the best food label can achieve. Consumers may interpret the claim in different ways, or may find themselves more influenced by price or package design. Even in the best of circumstances, a nutrient content claim generally conveys information about a given food, but it doesn’t provide the context of a balanced and healthy diet.

That’s why public investment in nutrition programs—like the Supplemental Nutrition Assistance Program (SNAP), National School Lunch Program (NSLP), and Food Insecurity Nutrition Incentive (FINI) grant program—is so important when it comes to changing the tide of Americans’ diets and our nation’s health. As the FDA begins to build their definition of “healthy,” we continue our work in defending these programs in the federal budget and upcoming farm bill to ensure that, for all Americans, “healthy” can be a reality.


Photo: USDA

Oregon’s Clean Fuels Program Off to a Great Start

UCS Blog - The Equation (text only) -

Oregon’s Clean Fuels Program (CFP) was initially authorized by the legislature in 2009, with subsequent legislation in 2015 allowing the Oregon Department of Environmental Quality (DEQ) to fully implement the program in 2016. The program’s goal is to foster the development of an in-state market for cleaner fuels by requiring that transportation fuels used in Oregon become steadily less polluting over the next decade. Specifically, the program requires average life cycle global warming emissions per unit of energy in transportation fuels to decline by 10% by 2025 compared to 2015.

Oregon’s CFP completed a very successful first year, but it remains under attack, so it’s a great time to review how the policy works, the results of its first year, and its prospects for the future.

The Clean Fuels Program creates a steadily growing market for clean fuels

Transportation is the largest source of global warming pollution in Oregon, and the overwhelming majority of transportation emissions come from petroleum-based fuels like gasoline and diesel.

Making a transition to a low carbon transportation system will take several decades, and will require a systemic transformation of transportation systems, vehicles and the fuels used to power them.  The Clean Fuels Program focuses on the fuels, ensuring that the market for clean fuels grows steadily year after year.  This assurance is critical to support investment in new fuels.

Getting clean fuels production and distribution up to commercial scale in the next decade is critical to accelerating the transition to clean transportation, and this early market signal to invest and innovate is even more powerful over the long term than the significant reduction in pollution the policy will deliver over the next few years.

The Clean Fuels Program protects clean fuels producers and all fuel consumers from volatile oil prices

Markets for transportation fuels are highly unstable, with retail gasoline prices in Oregon swinging back and forth in the last decade, from less than $2/gallon to more than $4/gallon.  The instability is a problem for drivers, but it also makes it very difficult for new cleaner fuels to get a foothold, since a clean fuel that is very attractive competing against $4/gallon gasoline may struggle at low prices.

The Clean Fuels Program assures fuel producers that the market for the clean fuels will grow steadily, protected from changes in global oil prices beyond their control, so clean fuel producers can focus on competing against other clean fuel producers.

How does the Clean Fuels Program work?

The Clean Fuels Program requires that large companies importing and distributing transportation fuel in Oregon, mostly gasoline and diesel, act to reduce the emissions from the fuels they sell by 10% per unit of energy.

Unlike federal biofuels policy, the CFP does not set specific targets for ethanol, biodiesel, natural gas, or any other alternative fuel.  Instead, the CFP requires that the average carbon intensity of fuels meet a gradually declining target.  Fuels that are cleaner than the target generate credits, while more polluting fuels generate deficits.

At the end of the year, fuel providers need to settle with DEQ, turning in enough credits to cover their deficits.  The CFP also has several flexibility mechanisms built in, including credit trading between parties selling clean fuels and parties selling more polluting fuels, and allowing fuel sellers to generate extra credits early and save them for later (called banking).
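
The credit-and-deficit arithmetic described above can be sketched in a few lines of Python. All numbers here (the target, the fuel carbon intensities, and the sales volumes) are hypothetical illustrations, not official DEQ values; the real program uses detailed fuel pathways and additional adjustment factors.

```python
# Illustrative sketch of Clean Fuels Program credit accounting.
# Target and fuel carbon intensities (gCO2e/MJ) and energy volumes
# below are hypothetical, not official DEQ values.

TARGET_CI = 95.0  # gradually declining annual target, gCO2e/MJ (assumed)

def credits_tonnes(fuel_ci, energy_mj):
    """Credits (positive) or deficits (negative), in metric tons CO2e.

    A fuel cleaner than the target earns credits in proportion to how
    far below the target it falls and how much energy is sold.
    """
    return (TARGET_CI - fuel_ci) * energy_mj / 1e6  # grams -> tonnes

# A provider sells 10 million MJ of ethanol (CI 60, hypothetical)
# and 50 million MJ of gasoline (CI 100, hypothetical).
ethanol_credits = credits_tonnes(60.0, 10e6)    # 350.0 (credits)
gasoline_credits = credits_tonnes(100.0, 50e6)  # -250.0 (deficits)

# If the net balance is positive, the surplus can be banked for later;
# if negative, the provider must acquire credits to settle with DEQ.
net_balance = ethanol_credits + gasoline_credits  # 100.0
```

In this toy example the provider ends the year with a surplus it can bank, which is the same position Oregon’s regulated parties found themselves in after the program’s first quarters.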

What is Carbon Intensity?

The CFP regulates the “carbon intensity” of fuels, which is a measurement of global warming emissions per unit of energy in the fuel. This allows all fuels—whether gasoline, diesel, ethanol, biodiesel, natural gas, or electricity—to be compared accurately.

In measuring the carbon intensity of fuels, the CFP measures each fuel’s life cycle emissions, which accounts for not only the emissions generated by a vehicle when using a given fuel, but also the emissions that come from producing and transporting the fuel. For example, about a quarter of global warming emissions associated with using gasoline come from extracting and refining the oil to make the gasoline.
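
As a rough illustration of that life cycle bookkeeping, suppose (hypothetically) that burning gasoline emits about 73 gCO2e per MJ at the tailpipe and that extraction, refining, and transport add about 27 gCO2e/MJ; both figures are assumptions for illustration, not official CFP pathway values.

```python
# Illustrative life cycle carbon intensity for gasoline (gCO2e/MJ).
# Both input figures are assumptions, not official CFP values.

tailpipe_ci = 73.0   # combustion emissions at the vehicle (assumed)
upstream_ci = 27.0   # extraction, refining, and transport (assumed)

lifecycle_ci = tailpipe_ci + upstream_ci  # 100.0 gCO2e/MJ total

# Consistent with the text: roughly a quarter of gasoline's life cycle
# emissions occur before the fuel ever reaches the vehicle.
upstream_share = upstream_ci / lifecycle_ci
```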

Emissions associated with biofuels depend greatly on whether they are made from corn, soybean oil, used cooking oil or biomethane collected at landfills, as well as how the fuel is produced. Electric vehicles produce no tailpipe emissions, so the life cycle emissions of electricity depend primarily on how the electricity is generated (whether from fossil fuels or renewable sources such as wind and solar). See my recent report, Fueling a Clean Transportation Future, for much more information about the future of fuels, especially gasoline, ethanol, and electricity.

The Clean Fuels Program is off to a good start

2016 was the first year of the CFP, and so far the program is off to a good start. Fuel producers have been registering and establishing the carbon intensity of their fuels, and for the first three quarters, credit generation (from selling fuels cleaner than the target) significantly exceeded deficit generation.  This means that regulated parties are entering the next year with a buffer of banked credits they can use later if necessary.

In the first few quarters, most of the credits were generated from ethanol and biodiesel.  These fuels are already part of the Oregon fuel mix, blended into gasoline and diesel to satisfy state and federal requirements, but the CFP provides an incentive for fuel blenders to use more of these cleaner fuels and to seek out the least polluting sources of these biofuels, which provide more credits per gallon.

In subsequent years, other fuels, including biomethane and electricity, will play a growing role, but the procedures to credit some of these fuels are still being finalized and it will also take time for fuel buyers to react to the market signals from the CFP.

Clean Fuels Program credit data from Oregon DEQ

Lessons learned from California’s Low Carbon Fuel Standard

California got started with clean fuels policy a little earlier than Oregon, with a closely related policy called the Low Carbon Fuel Standard (LCFS), which went into effect in 2010 and requires a 10% reduction in average carbon intensity by 2020.

Data from the first five years of the LCFS provide a hint of what Oregon can expect as the Clean Fuels Program progresses.  As in Oregon, the early years relied mostly on alternative fuels that were already well established in the marketplace: ethanol and natural gas.  But the growth in alternative fuel use encouraged by the LCFS came from other fuels, especially biodiesel, renewable diesel, and biomethane.

The LCFS also provided more credits for cleaner fuels, especially those made from wastes, such as biodiesel and renewable diesel made from used cooking oil and animal fat, and biomethane captured at landfills.  The larger benefits of these fuels are reflected in the fact that their share of credit generation is larger than their share of alternative fuel volume.


Credit data from California Air Resources Board Data Dashboard

Electricity and the Clean Fuels Program

The truly clean transportation system we need will have fewer internal combustion engines running on petroleum and more electric vehicles running on non-polluting renewable sources of electricity.  The CFP can accelerate this transition by ensuring that the low carbon benefits of electricity lower the cost of operating electric vehicles.

When California’s LCFS got started in 2010, there were almost no electric vehicles, but by 2015 EVs were generating 6% of the credits.  Transit agencies running electric buses generated some of these credits, which they sold to oil companies and others who needed them to offset pollution from gasoline and diesel.

The value of these credits makes it easier for transit agencies to go electric, which also has important health benefits for the communities in which these buses operate.  Households with electric cars also benefit as utilities have set up rebate programs funded by LCFS credits (PG&E has a $500 clean fuel rebate program), which makes owning an EV even more attractive.

Managing the Clean Fuels Program

In 2017, DEQ is undertaking rulemaking to implement a few important policy improvements. These include establishing procedures for crediting the use of electricity as a transportation fuel, and implementing a cost containment mechanism to clarify what steps will be taken in the unlikely event of a shortage of clean fuels.  To assist in this process, DEQ convened an advisory committee of stakeholders representing oil companies, clean fuel producers, environmental groups, AAA, truckers, and others, which is meeting seven times between November 2016 and June 2017.

I have been representing the Union of Concerned Scientists on this committee.  This process gives all parties the opportunity to share their concerns and weigh in on proposed solutions so that DEQ can put together a well-considered set of program enhancements to take to the Environmental Quality Commission later this year.

The transportation system has many moving parts, and will require a suite of policies

Transportation is not just the largest source of Oregon’s climate emissions, it is also deeply integrated into people’s lives and commerce.

Ensuring the system serves Oregon well will require ongoing investment in roads, bridges, transit, and facilities for bikes and pedestrians.  Finding sustainable, equitable means to fund these many priorities is critically important, and should not be treated as an alternative to supporting a transition to cleaner fuels.

Over time, clean fuels, especially renewable electricity, will get steadily less expensive, and moving to these in-state sources of transportation fuel will protect drivers from the unpredictable price volatility of gasoline and diesel, which are driven primarily by global oil prices over which Oregon has very little control. Getting started on this transition away from oil will have some costs, but with appropriate measures to manage them, it is a very smart investment in Oregon’s future.

Changing the law is not necessary or helpful at this time

Despite the ample evidence that the Clean Fuels Program is off to a great start, some critics of the policy in the Oregon legislature have been proposing legislation that would dramatically change the rules of the program, and if history is a guide, they may try to undermine the policy in negotiations over funding much needed transportation infrastructure investments.  This is not a smart way to move forward on either clean fuels or transportation funding.  Oregon needs to cut emissions, and it needs to make smart investments in physical infrastructure; bills pitting these goals against each other are short-sighted and counterproductive.

Clean Fuels Policies and Carbon Pricing work together

The Clean Fuels Program is focused on cleaning up transportation fuels, but fuels are only part of the picture; other climate policies are also necessary to meet climate goals.  Putting an economy-wide price on global warming emissions, either through a cap-and-trade program or a carbon tax, helps integrate the costs of climate change into the cost of doing business.

In the transportation sector, carbon pricing helps ensure that the costs of pollution from fossil fuels—and the value of low carbon technologies—are better reflected in decisions fuel providers make about what fuels to produce, as well as the decisions consumers make about what cars to buy.  However, a carbon price alone is not enough to decarbonize our transportation system over the next few decades.

Typical carbon prices, which translate to pennies per gallon in increased fuel cost, cannot adequately motivate investments in innovative cleaner fuels. That’s why it is important to have policies in place that limit heat-trapping emissions from fuels directly. The Clean Fuels Program facilitates research, development, and deployment of transformational low-carbon technologies.  For more information, see our fact sheet on how California’s carbon pricing and LCFS complement one another.
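
The “pennies per gallon” claim is easy to check. Assuming roughly 8.9 kg of CO2 emitted per gallon of gasoline burned (a commonly used approximation) and a hypothetical carbon price of $15 per metric ton:

```python
# Why a typical carbon price adds only pennies per gallon.
# Both inputs are assumptions for illustration.

KG_CO2_PER_GALLON = 8.9   # approximate combustion emissions per gallon
price_per_tonne = 15.0    # USD per metric ton of CO2 (hypothetical)

# kg -> tonnes conversion folds into the /1000
cost_per_gallon = price_per_tonne * KG_CO2_PER_GALLON / 1000.0
# ~= $0.13 per gallon -- pennies relative to $2-4/gallon gasoline
```

Even a carbon price several times higher would move the pump price by well under a dollar, which is why a complementary fuels standard is needed to pull new low-carbon fuels into the market.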

State leadership on climate is more important than ever

States have always been important laboratories for democracy, but with the current administration in Washington D.C. actively undermining climate progress, states are an essential bulwark against backsliding.  Policies like the Clean Fuels Program ensure that the market for innovative clean fuels needed to address climate change continue to grow, even in the absence of reliable federal support.

By working together with neighboring states and provinces in the Pacific Coast Collaborative, Oregon can maintain momentum on emerging clean technologies for transportation and other climate goals.  Moreover, by investing in the future, Oregon can keep its transportation system moving forward, even if Washington D.C. is trying to slam the brakes on clean energy and go back to the fossil fuels of the last century.

US Needs More Options than Yucca Mountain for Nuclear Waste

UCS Blog - All Things Nuclear (text only) -

On Wednesday, I testified at a hearing of the Environment Subcommittee of the House Energy and Commerce Committee. The hearing focused on the discussion draft of a bill entitled “The Nuclear Waste Policy Amendments Act of 2017.”

Yucca Mountain (Source: White House)

The draft bill’s primary objective is to revive the program to build a geologic repository at the Yucca Mountain site in Nevada for spent nuclear fuel and other high-level radioactive wastes. The Obama administration cancelled the program in 2009, calling it “unworkable,” and the state of Nevada is bitterly opposed to it, but Yucca Mountain still has devoted advocates in Congress, including the chairman of the subcommittee, John Shimkus (R-IL).

UCS supports the need for a geologic repository for nuclear waste in the United States but doesn’t have a position on the suitability of the Yucca Mountain site. We don’t have the scientific expertise needed to make that judgment.

However, in my testimony, I expressed several concerns about the draft bill, including its focus on locating a repository only at Yucca Mountain and its proposal to weaken the NRC’s review standards for changes to repository design.

UCS believes that rigorous science must underlie the choice of any geologic repository, and that the US needs options in addition to Yucca Mountain, which has many unresolved safety issues. In addition, we believe that any legislation that revises the Nuclear Waste Policy Act must be comprehensive and include measures to enhance the safety and security of spent fuel at reactor sites—where it will be for at least several more decades. For example, we think it is essential to speed up the transfer of spent fuel from pools to dry storage casks.

Oregon Legislature Can Boost Electric Car Sales

UCS Blog - The Equation (text only) -

The Oregon legislature has the opportunity to boost electric vehicle sales in the state and deliver benefits to all Oregonians by passing pending legislation for electric vehicle incentives in House Bill 2704.

Electric vehicles (EVs) are a critical solution to cutting oil use, improving air quality, and reducing global warming pollution. EVs are also a better choice for many Oregon drivers, offering fuel savings and often a better driving experience compared to a gasoline car.

Driving on electricity is cheaper than driving on gasoline for most people, even with today’s lower gasoline prices. Based on Northwest gasoline prices in 2016, we found that driving the average new gasoline car (29 mpg) for a year (11,350 miles) cost $949 in Oregon. Driving that same distance on electricity cost an average of $363 in the state.*

Given the volatility of gasoline prices, using electricity as a fuel also means more stable and predictable refueling costs for the years ahead. And since Oregon lacks oil production and refining, switching away from petroleum can keep more money in the local economy.

EVs in Oregon also have environmental benefits. The average EV on Oregon’s electric power mix produces fewer global warming emissions than any gasoline-powered vehicle on the road—equivalent to a 75 mpg gasoline car, according to UCS analysis in 2015. As Oregon continues to transition to cleaner sources of electricity (thanks to last year’s coal-to-clean bill), the climate advantages of EVs will only increase.

Despite these advantages, EVs still face barriers to their adoption, so policies are needed to make EVs available and affordable for more average Oregonians. A consumer incentive for zero-emission and plug-in hybrid electric vehicles, as proposed by HB 2704, will help motivate prospective car buyers to investigate electric drive options and is also an important signal from the state in support of needed technologies.

States with EV incentives lead the nation in EV sales. For example, in California, more than 90 percent of surveyed EV rebate recipients said that the state’s rebate was important to their decision to buy an EV.

Now is an important time for Oregon to make a commitment to building a mainstream market for EVs. As the Oregon Global Warming Commission report to the legislature showed, Oregon is falling behind on its commitments to cut GHG emissions in large part because of increasing transportation sector emissions.

Thanks to Oregon’s Zero Emission Vehicle program, manufacturers are required to sell EVs in Oregon.  This policy helps ensure EVs are available, and it should be matched with policies that help induce demand. A consumer incentive is the single biggest step Oregon can take to enable more drivers to choose an electric car, so passing HB 2704 is an important step forward for EVs in the state.

*We calculated costs using the following electricity and gasoline price data from the US Energy Information Administration: 2016 average residential electricity price of $0.107/kWh and 2016 average gasoline price of $2.40/gallon. Costs assume 11,350 annual miles driven, 28.6 mpg gasoline efficiency, and 0.30 kWh/mile EV efficiency.
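
The footnote’s arithmetic can be reproduced directly; the small differences from the published $949 and $363 figures presumably reflect rounding in the inputs.

```python
# Reproducing the fuel-cost comparison using the footnote's stated
# 2016 EIA figures for Oregon.

miles = 11_350          # annual miles driven
mpg = 28.6              # average new gasoline car efficiency
gas_price = 2.40        # USD per gallon, 2016 average
ev_kwh_per_mile = 0.30  # EV efficiency
elec_price = 0.107      # USD per kWh, 2016 residential average

gas_cost = miles / mpg * gas_price              # ~$952 per year
ev_cost = miles * ev_kwh_per_mile * elec_price  # ~$364 per year
annual_savings = gas_cost - ev_cost             # ~$588 per year
```

On these inputs, driving on electricity costs roughly 40% of what driving on gasoline costs over a year.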

More information is available in How Oregon Can Benefit From Electric Vehicles (2015).


Climate Adaptation, Adaptive Climate Justice, and People with Disabilities

UCS Blog - The Equation (text only) -

Climate activism tends to frame climate change as a problem to be solved by fighting against it, raising calls to reduce emissions in order to minimize or avoid the consequences of climate change (“climate mitigation”).

Cutting emissions is certainly important: lower emissions will lead to smaller temperature increases, and with them less intense climate impacts, than a high-emissions scenario. But in contrast to this rhetoric, the truth is that we cannot stop climate change outright and avoid its consequences – and even with our best efforts, the world will still be a vastly different place.

A 2016 report compared the climate impacts of a 1.5°C world with those of a 2.0°C world; both showed marked increases in the frequency and intensity of extreme weather events (including storms and heat waves), variations in crop yields, rising sea levels, and expanding coral bleaching. A 2.0°C world was shown to be drastically different indeed.

Other experts predict that we will overshoot the global target, heading toward a much scarier future. A recent paper projects that limiting warming to 2.5°C may be possible with extreme measures in a realistic political climate, but even that would prove difficult.  Given how much change is already locked in, reducing carbon emissions is not enough – we should also prepare for what’s on the way.

When we prepare, we need to keep everybody in mind, and especially the most vulnerable groups with the least resources and ability to adapt on their own. This is what I like to call “adaptive climate justice,” which also creates a level playing field in the face of historical inequalities and oppressions.

One of the most vulnerable populations, with the fewest resources and the most distinctive needs, is people with disabilities. The reasons are vast. Among other things, evacuating ahead of storms and other emergencies requires focused planning, heat waves especially affect people with fragile health, climate migration is phenomenally difficult for this group, and people with disabilities are often the first to be abandoned under the age-old triage mentality.

In response to all these issues, adaptation needs to include the concerns of people with disabilities and value us as individuals to protect. This adaptive climate justice must happen with full force, and start happening now.

Climate adaptation: How the science community can help (in addition to more research)

Climate change is guaranteed to progress in the years to come regardless of our level of emissions, and each degree of warming will lead to yet stronger consequences.

As storms become stronger, oceans rise, and droughts intensify, we must adapt our societies and way of life to match our new environment. This may mean improving our disaster response and reinforcing our infrastructure for stronger storms, or transforming our water management systems to better handle drought.

The International Organization for Migration notes a widely cited estimate of 200 million “environmental migrants by 2050, moving either within their countries or across borders, on a permanent or temporary basis.” In some cases, then, we may even need to explore managed and proactive relocation – or at least develop systems able to handle domestic and even international migration.

Unfortunately, climate activists are already encountering barriers to system-wide change and plenty of resistance from those in power, even when it comes to switching to renewables. Pushing for these new adaptation efforts will be no small task and may challenge our capacity for change, but it is vital to protect lives and well-being.

How can the scientific community help in these efforts? We need to be more forthright about what the science points to: climate change is going to get worse, and we need to get ready. It ultimately does a disservice to the public to tell them that things will stay stable so long as we install solar panels and wind farms and switch to electric cars.

People are liable to be caught off guard, while governments and other actors are less likely to prepare for the one-time events (e.g., stronger storms) and gradual transformations (e.g., sea level rise or migration) coming our way.

If we drive home the need to adapt – and use our public legitimacy to do so – then stakeholders are much more likely to get the ball rolling.

Our community must use our expertise to find the best actions to adapt to climate change.  A great deal of energy is already spent on understanding climate consequences and developing renewable technologies, yet far fewer resources are dedicated to developing plans for adaptation. The resources devoted to both clearly need to grow massively – but it’s vital to use a larger portion of our energy to develop adaptation plans, working with stakeholders to implement them widely.

Adaptive climate justice and people with disabilities

As the climate changes, it is increasingly clear that certain populations are affected more significantly than others – and this will continue into the future. Groups most affected include women, children, people of color, people with disabilities, lower-income communities, and those in areas with especially distinct and significant climate exposure (e.g., low-lying island states or already-arid regions).

People in developing countries are also an incredibly important group on the international front, as they are often the hardest hit, with very few resources to adapt (and have historically produced the fewest emissions) – and it is a moral imperative to support those nations at a global scale.

Activists are already raising their voices in a call for “climate justice” that protects these populations from climate change itself. However, even this largely focuses on mitigation. “These populations are getting hit hardest by climate change,” the rhetoric goes, “so we must prevent warming to protect their well-being. Stopping global warming is a matter of climate justice.”

Mitigating climate change will certainly help these many oppressed and vulnerable groups, and is arguably the most essential first line of defense. Yet there is another piece of climate justice that we must include moving forward.

True climate justice also needs to provide the resources needed to adapt and create equity for all – and transform our systems to do the same. This adaptive climate justice must identify vulnerable groups, determine where their vulnerability lies, and ensure that the international community provides resources and other changes to tackle those needs head-on. What might this mean? At the broad scale, wealthy countries can provide resources and assistance to aid developing countries, whether through disaster relief funding or technology for drought-resilient crops. (The State Department and USAID have participated in climate-related supports, but these are under threat from proposed budget cuts at the federal level.)

Domestic justice can do the same at many levels: for example, ensuring access to air conditioning for people in poor-quality housing or those who can’t afford higher electric bills, including the needs and voices of marginalized communities in disaster readiness and response (DRR) at the federal (e.g., FEMA) and state/local levels, and even reinforcing government services for the potential turmoil of climate stress.

At the World Institute on Disability, our “New Earth Disability” project is addressing adaptive climate justice for people with disabilities (PWD). The diverse disability population includes those with physical disabilities, sensory disabilities (e.g., low vision or hearing loss), developmental disabilities, psychological disabilities, chronic health conditions, and more. PWDs are present in every other population group at a rate of approximately 15%, meaning that our community is spread worldwide and across income levels, genders, races, nationalities, and so on.

People with disabilities are especially vulnerable to the effects of climate change because of health factors, personal and medical needs, and already-existing marginalization. Among other things, our community may be isolated during disasters and unable to maintain healthcare and personal supports, may experience poor health during heat waves or similar events, and may face disproportionate poverty with reduced ability to manage resource stresses or adapt overall.

Climate-related migration is an especially large issue: people with disabilities are liable to lack access to accessible transportation, become disconnected from personal support networks, lose vital government and healthcare services, or simply be turned away at borders because of their disability status. It’s our job to learn more about these many problems and tackle them head-on.

The many solutions will include resilient government and healthcare services, disability-inclusive DRR, and even managing migration through accessible transit, housing and more.

Switching to an adaptive climate justice mindset and beginning those preparations will require collaboration, focused planning, effective resources, and wide-scale public education (especially for the disability and climate change communities). As a part of the New Earth Disability initiative, we encourage other stakeholders to tackle this challenge and join us in our efforts.

Alex Ghenis is a disability and climate change activist in Berkeley, California. He is the lead project manager for the New Earth Disability project at the World Institute on Disability (WID), which addresses how people with disabilities will be affected by climate change and necessary actions to adapt, and his other work at WID addresses financial literacy and employment policies for people with disabilities. Outside of WID, Alex is a regular contributor to New Mobility magazine, an occasional actor, and a talented slam poet. You can find him on Twitter at @aghenis.

For more details and general thoughts on the connections between climate change and disability, please visit the WID webpage. WID welcomes opportunities for partnership, outreach, and any projects regarding research or full initiatives. If you are interested in connecting, please email Alex Ghenis.



Worker Memorial Day: 13 Workplace Fatalities Occur Every Day in the United States

UCS Blog - The Equation (text only) -

Amidst the groundswell of information, energy, and genuine excitement around last week’s March for Science and the upcoming People’s Climate March, there’s another global and national event that should not get lost in the mix.

This one relates to workers—you know, those dedicated and hardy souls who are the backbone of what makes America great.

Workers, the engine of our economy, are also our partners, children, relatives, co-workers, friends, and neighbors. And too many of them get more than a paycheck for their efforts (recognizing that the paycheck they do get may be less than a living wage). Some of our nation’s workers die, get sick or injured, or become disabled because of workplace exposures, hazards, and conditions. And these workplace incidents are largely (very largely) preventable.

April 28 is Workers Memorial Day, the day each year that we recognize, commemorate, and honor these workers. It is also a day to renew the fight for safe workplaces. That fight is more important now than ever.

Data give us a quantitative look into these preventable events, but they don’t begin to capture the horror and loss they entail. Just imagine having to deal with the knowledge that a loved one was suffocated in a trench collapse; asphyxiated by chemical fumes; shot during a workplace robbery; seriously injured while working with a violent patient or client; killed or injured from a fall or a scaffolding collapse; or living with an amputation caused by an unguarded machine.

Or the heartache of watching a loved one who literally can’t catch a breath because of work-related respiratory disease. Or is incapacitated by a serious musculoskeletal injury. Or has contracted hepatitis B or HIV because of exposure to a blood-borne pathogen at work.

I could go on—but you get the picture.

What the latest data tell us:

Workplace injury fatalities: In 2015, 4,836 U.S. workers died from work-related injuries, the highest number since 2008.

That’s about 13 people every day! In the United States!
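
The “about 13 every day” figure follows directly from the annual count:

```python
# Converting the 2015 annual count of fatal workplace injuries
# (4,836, from the text above) into an average daily rate.

deaths_2015 = 4836
deaths_per_day = deaths_2015 / 365  # about 13.2 per day
```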

The number of immigrant workers killed on the job reached a nearly 10-year high, with Latino workers having an 18% higher fatality rate than the national average. Work in construction, transportation, and agriculture continues to be the most dangerous, and workplace violence remains a growing problem for workers. Older workers are also at high risk, with those 65 or older 2.5 times more likely to die on the job.

Deaths from work-related illness: Deaths from occupational disease (such as silicosis, coal workers’ pneumoconiosis (black lung), and occupational cancers) are not well captured in data surveillance systems. It is estimated that another 50,000-60,000 workers died from occupational diseases—an astounding number. And, for many, death comes years after suffering debilitating and painful symptoms.

Non-fatal cases: And then there are the nonfatal injuries and illnesses. Employers in private industry reported approximately 2.9 million of them in 2015; another 752,600 injury and illness cases were reported among the approximately 18.4 million state and local government workers (1).

In addition to the physical and emotional toll these preventable incidents take on workers and their families, they exact an enormous economic cost. The societal cost of work-related fatalities, injuries, and illnesses was estimated at $250 billion in 2007 based on medical costs and productivity losses (2). I suspect these costs are considerably higher today (1). They are certainly not less.

Glass half empty, half full

Despite these data, it’s important to note that workplace health and safety in the U.S. is a LOT better than it used to be, due in large measure to the struggles of labor unions and working people, along with the efforts of federal and state agencies. Workplace fatalities and injuries have declined significantly, and exposures to toxic chemicals have been reduced.

This progress is a testament to the effectiveness of health and safety regulations and science-based research.

We can thank the Occupational Safety and Health Administration (OSHA), the Mine Safety and Health Administration (MSHA), and the National Institute for Occupational Safety and Health (NIOSH) for many of these protections and safeguards. We must also acknowledge and thank the persistence, energy, and efforts of the workers, unions, researchers, and advocates that have pushed these agencies along the way.

[OSHA and NIOSH were established by Congress in 1970; the Mine Safety and Health Administration (MSHA) was established in 1977 (although the first federal mine safety statute was passed in 1891). OSHA’s statutory mandate is “to assure safe and healthful working conditions for working men and women by setting and enforcing standards and by providing training, outreach, education and assistance.” NIOSH was established as a research agency focused on the study of worker safety and health to empower employers and workers to create safe and healthy workplaces. MSHA works to prevent death, illness, and injury from mining and to promote safe and healthful workplaces for U.S. miners.]

Despite this progress, there’s much more to be done. It’s simply not acceptable that each and every day in this country, on average 13 workers die on the job as a result of workplace injuries.

It’s not acceptable that each year, many thousands more die from occupational diseases. And that millions sustain non-life threatening injuries and illnesses that impact their daily lives.

The human, social, and economic toll is simply far too high. [Remember: this is all largely preventable.]

With the current anti-regulatory and anti-science fervor in our nation’s capital, the protections, safeguards, science, and research of these agencies are at great risk. We have already seen the Trump administration and congressional actions roll back some worker protections and block or delay others. Agency budgets and programs are on the chopping block.

Now, more than ever, we must be vigilant and vocal in our support for worker health and safety safeguards and protections.

So, as we get ready to head out to work today and tomorrow, let’s pause for a minute on Workers Memorial Day to remember those workers who have lost so much because of their jobs, along with those who continue to produce the goods and services we all enjoy, depend on, and often take for granted. And let’s get ready to use our voices, votes, and collective power to demand and defend rules, standards, policies, and science-based safeguards that protect this most precious national resource: our working men and women.

UCS will keep you posted on how and when you can weigh in. Let’s do this.

  1. AFL-CIO. Death on the Job: The Toll of Neglect, 2017.
  2. Leigh JP. Economic burden of occupational injury and illness in the United States. Milbank Q 2011;89:728–72.

A Peer Review of the March For Science

UCS Blog - The Equation (text only) -

This past weekend, the March For Science drew hundreds of thousands of scientists and science supporters onto the streets in 600 locations on six continents. It was, by most accounts (including those of science historians), an unprecedented event. But, big picture, how did it do?

“Pluses and deltas” is a popular retrospective exercise amongst grassroots organizers, and offers a constructive way to answer this question. It asks: where did we go right (pluses), and how can we improve (deltas)? Here are my top two pluses and deltas for the March For Science.

Plus #1: The March For Science mobilized the immobilizable. As a scientist-activist who has been organizing scientist-led campaigns and rallies under hashtags like #StandWithScience for several years, I know firsthand how hard it can be to get introverted, politically ambivalent scientists worked up—let alone out of their labs and into the streets. In that sense, last Saturday was incredible. “The march represented a sort of coming-out party for many scientists flexing a fledgling political muscle,” Vox’s Brian Resnick observed. In D.C. he met Charlotte Froese Fischer, an 87-year-old atomic physicist who “until today…had never attended a political rally of any kind, let alone one for science.” At March For Science rallies at Harvard and MIT, I was delighted to see not just the usual suspects, but hundreds of mildly uncomfortable academics who had clearly never waved a sign or chanted in public before (I’ve been there). For many, this was a gateway into political engagement and activism.

Plus #2: The March For Science forced many scientists—not to mention the public, press, and politicians—to grapple with the role of science in society and the relationship between science and politics. As scientists with little or no past experience in political engagement wrestled for the first time with the fear of politicizing science, the march went from officially apolitical to political-but-non-partisan. As my advisor, Harvard Professor Naomi Oreskes, points out, research indicates that this fear is just that—a fear, unsubstantiated by historical evidence and peer-reviewed experiments, which show that scientists’ credibility is robust to science advocacy. Indeed, scientists appear to have largely brought this fear upon themselves by conflating the idea of science in the abstract (the scientific method) with the application of science in the real world. In so doing, we handed journalists an irresistible ‘controversy’ over (mostly) semantics. And yet, with time, the march’s communications improved, and on the day, its global message was unambiguous: science serves society.

The march’s successes have helped normalize science-activism, injecting momentum and political potential into this new “science voter” bloc. Capitalizing on this momentum, however, will take work. For me, the deltas of the March For Science involve better embracing the sociopolitical realities in which science operates.

Delta #1: Having fumbled with the largely mythical fear of politicizing science, scientists must now truly move on if we are to become more effective campaigners and messengers. This means not just rallying in the abstract about the importance of science (“I love science!”), but speaking out on specific issues where science is being trampled on by politicians and policymakers. Climate change epitomizes this. At its best, the March For Science offered a profound statement of our values as scientists, which is a crucial start. But a truly effective narrative for social change also requires a story of “now”: a moment of crisis that challenges those values. By not explicitly articulating President Trump’s war on science (and, accordingly, on all of us) as one of the targets of our protest, the March left room for improvement.

Delta #2: Scientist-activists must embrace the intersectionality of science with politics, race, class, gender, corporatism, and so on. Here, I am referring not to diversity within academia, as exceptionally important and related as it is, but to how the science movement (comprising both scientists and science lovers) sees its place in the world. Unlike the scientific method, the science movement does not—and should never—exist in a bubble. We should embrace opportunities to connect science to real-world issues, both in what we say and who we collaborate with.

In my own field of energy and climate change, for example, we should talk about how last year alone, the solar industry hired more people than the coal industry employs in its entirety. We should talk about how fossil fuel pollution and climate change disproportionately harm and kill minorities and indigenous groups. In short, we should stand in solidarity with those whom our science strives to protect. Not only is this the right thing to do, it is politically effective; by building narratives of shared values, we can broaden our coalition and win the political story wars. The movement for a just and stable low-carbon future doesn’t stop at the laboratory’s edge, but for too many scientists, it still does.

At Saturday’s march, amidst the geeky signs and nerdy chants, Reverend Lennox Yearwood Jr.—a leading figure in the climate movement and a VIP guest of the March For Science—was, he reports, the victim of a broad-daylight racist assault by D.C. police officers. “The deeply disappointing truth of this Earth Day case of racial profiling,” Yearwood observes, “was that none of my fellow science marchers stopped or took issue with what was happening. They didn’t question or pause to witness in a way that one would for a member of one’s community.” Of course, the inactions of those present do not represent all marchers or scientists. But in that random sampling—at that moment on that crosswalk—solidarity was absent.

This coming Saturday, April 29, the People’s Climate March offers an immediate opportunity for scientists and science supporters alike to build on the pluses of the March For Science, and to work on our collective deltas. In DC and 250 sister marches nationwide, hundreds of thousands of us will stand up for climate, jobs, and justice. It is an important first test. Can we find the moral courage to not only celebrate values like evidence-based policy, but put them into action on real-world issues like climate change? Are we willing to step out of our comfort zones to call out the Trump administration’s anti-science pandering to fossil fuel interests? Is this a moment, or a movement?


Dr. Geoffrey Supran is a post-doctoral researcher in the Institute for Data, Systems, and Society at MIT and in the Department of History of Science at Harvard University. He has a PhD in Materials Science & Engineering from MIT. He has co-led several campaigns to mobilize scientists to engage in climate advocacy, including the fossil fuel divestment campaign at MIT, an open letter from academics urging Donald Trump to take climate action, and the #StandUpForScience rallies in San Francisco and Boston, which were the first major scientist protests against the Trump administration. He spoke at the Harvard March For Science.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

President Trump’s Assault on the Antiquities Act Signals Trouble for National Parks and Monuments


Without the Antiquities Act, we might never have had the benefit of the Grand Canyon, Olympic, or Acadia national parks. The act is now under attack by the Trump administration as part of its strategy to roll back environmental protections and open public lands to increased exploitation for coal, oil, and minerals.

The Antiquities Act of 1906 gives the president of the United States the power to designate lands and waters for permanent protection. Almost every president since Teddy Roosevelt has used the Act to place extraordinary archaeological, historic and natural sites under protection and out of reach of commercial exploitation.

Many sites originally designated as national monuments were later upgraded by Congress to become national parks, including Bryce Canyon, Saguaro and Death Valley. In many cases in the past, the Antiquities Act allowed presidents to protect vital natural and cultural resources when congressional leaders, often compromised by their ties to special interests representing coal, oil, timber and mining industries, were reluctant or unwilling to act.

A new Executive Order signed by President Trump on April 26, 2017 puts this important regulatory tool for conservation and historic preservation at risk. The clear intention of the Executive Order is to lay the groundwork for shrinking national monuments or rescinding their designations entirely, in order to open currently protected public lands to untrammeled growth in coal, oil, and minerals extraction.

A clear intention to open public lands for coal and oil exploitation

The Executive Order requires the Secretary of the Interior to review all presidential designations since 1996 of national monuments over 100,000 acres in size. However, in the short-term it appears particularly aimed at reversing designations or reducing the size of Grand Staircase-Escalante and Bears Ears national monuments, which together comprise 3.23 million acres in Utah.

An attack on the Antiquities Act is an attack on all monuments, and it has huge implications for future presidents’ ability to protect important sites.

President Trump announces the review of national monument designations, while Secretary Zinke and the ghost of Teddy Roosevelt look on. Photo: Department of the Interior

Remarkably, in its own press statement, the Department of the Interior (the federal agency responsible for managing and protecting our public lands) tips its hand and signals that it has no intention of undertaking a fair and independent review by describing Grand Staircase-Escalante and Bears Ears as the “bookends of modern Antiquities Act overreach”.

Secretary Zinke himself was quoted ridiculing “people in D.C. who have never been to an area, never grazed the land, fished the river, driven the trails, or looked locals in the eye, who are making the decisions and they have zero accountability to the impacted communities.”

But, in fact, national monument designations almost always derive from a local grassroots demand for greater protections, and usually only come after lengthy periods of community engagement and consultations.

A vital conservation tool in a changing environment

The Antiquities Act itself grew from years of pressure from archaeologists and those who were concerned about looting and damage to Ancestral Pueblo and other tribal sites in the Southwest. Over the years, its use has expanded to include natural sites on land and large marine ecosystems.

Presidents G. W. Bush and Barack Obama, for example, both designated important ocean areas as national monuments to safeguard marine productivity, fish spawning areas and fragile ecology. When President Obama announced the designation of Papahanaumokuakea Marine National Monument, he drew particular attention to the threat posed to this almost pristine area by climate change.

Conditions have changed since 1906. US population has more than tripled since then, urban and suburban growth has increased markedly and many of the archaeological and cultural sites that were once under threat mainly from looting and natural resource exploitation are now also vulnerable to climate change. UCS documented the climate threat in its 2014 report Landmarks at Risk and its 2016 report on climate threats to World Heritage sites, published with UNESCO and UNEP.

In his speech designating Papahanaumokuakea Marine National Monument, President Obama cited the threat of climate change. Photo: James Watt/DOI/SeaPics

Tribal cultural resources under attack again

Ironically the attacks on tribal heritage that were behind the signing of the Antiquities Act in 1906 have come full circle with this new assault by the Trump administration more than a century later.

Five sovereign Tribes, all with ancestral ties to Bears Ears, including the Hopi and the Navajo Nation, have formed the Bears Ears Inter-Tribal Coalition to protect the monument. Bears Ears contains thousands of sacred and culturally important sites, and many Native Americans continue to perform ceremonies and gather medicinal plants there. It also contains thousands of archaeological sites, including, for example, the Lime Ridge Clovis site, which provides evidence of occupation going back 11,000-13,000 years or more.

Tourists who visit Bears Ears and other national park units in the Southwest, including World Heritage sites such as Mesa Verde and Chaco Canyon, are drawn not just to the incredible landscapes, but also to the extraordinary cliff houses, pit houses, pictographs, and other Ancestral Pueblo remains.

Monuments provide local economic benefits

Tourism is an important economic driver around national parks and monuments. The National Park Service generated $34.9 billion in economic output in 2016 and supported 318,000 jobs. Bears Ears alone attracts more than 900,000 visitors annually, providing a very significant boost to the local communities.

A 2014 study of 17 national monuments by Headwaters Economics found that the local economies all expanded following the monument designation. Secretary Zinke seems to think that local communities are unhappy with national monuments, but a 2016 Colorado College poll showed that fully 80% of westerners oppose removing existing monument designations.

No president has ever tried to revoke a predecessor’s monument designation, and if that is the direction this administration is going in, we owe it to future generations to fight this action. The national monuments and parks of the United States tell the story of who we are and where we came from. They represent the diverse stories of Americans and help define us as a nation. An attack on national monuments is an attack on us all, and the histories we share.


The Regulatory Accountability Act Subverts Science and Must Be Stopped


Today, just four days after hundreds of thousands of people marched for science, the Senate introduced a bill that would substitute politics for scientific judgment in every decision the government makes about public health and the environment. If enacted, the legislation would cripple the government’s ability to effectively carry out laws that protect us, putting everyone at more risk, especially communities of color and low-income communities that are more exposed to threats.

The Regulatory Accountability Act would prevent scientists at EPA, OSHA, and many other agencies from protecting public health and the environment. It must be stopped.

The ill-named Regulatory Accountability Act (House version here with coverage) does nothing more than stack the deck in favor of private companies at the public’s expense. It would paralyze agencies like the Environmental Protection Agency and the Occupational Safety and Health Administration, drowning them in red tape and compromising their public service missions. It is far more dangerous than other legislation that grabs headlines (such as the bill to eliminate the EPA) because, in this political environment, it actually has a chance.

Senators who support this legislation will be turning their backs on the role of science in making all kinds of decisions. Safety standards for the food we eat. Rules that protect construction workers on job sites. Limits on work hours of pilots and air traffic controllers. Protections for children from toys laden with harmful chemicals.

“This bill is a weapon aimed right at public health and safety protections,” said UCS’s Andrew Rosenberg in a statement. “This bill doesn’t support accountability—it removes accountability from the industries subject to regulation.”

The Regulatory Accountability Act is a bad idea. It is also not a new idea. The legislation was introduced in the last Congress, and the Congress before that. In 2015, my former colleague Celia Wexler called the Regulatory Accountability Act a “zombie bill,” legislation that has failed repeatedly in the past but keeps getting resurrected. She continued:

This bill is deliberately complicated. You have to be a regulatory lawyer to perceive all the traps, and even then you might miss some. Essentially what the RAA would do is hamstring federal agencies with additional procedural burdens when they try to carry out their mandates using the best available science.

The latest iteration of this legislation is no different. But now, it is more likely to pass and be signed into law by an administration that is committed to the “deconstruction of the administrative state” and the rolling back of public health, consumer, and environmental protections.

The Regulatory Accountability Act eviscerates the role of science in policymaking. Science, not politics, should guide decisions about public health and the environment.

We do not live in a world where you can give people forty acres of farmland, wish them luck, and send them on their way. Our world is incredibly and increasingly complex. We must rely on experts to set standards that give us equal ability to pursue our dreams.

Freedom includes protection from products and environmental contaminants that can cause us harm. Freedom includes requiring companies to pay for the pollution they create without passing the burden on to the taxpayer. We empower federal agencies to make decisions based on independent analysis and not political dealmaking precisely because it gives us these freedoms.

The Regulatory Accountability Act was rushed through the House of Representatives before the dozens of newly elected representatives had even hired staff to read it. The Senate version offers little improvement. The current political reality requires us to fight like hell to defeat legislation that would normally be laughed out of Congress.

But fight like hell we must. A committee hearing is expected soon. So today, and every day, call both of your senators and tell them that the Regulatory Accountability Act is bad for all of us.

