Combined UCS Blogs

7 Fun Facts for National Farmers Market Week

UCS Blog - The Equation (text only) -

Customers shop at the Crossroads Farmers Market in Takoma Park, Maryland, July 2014. Photo by Union of Concerned Scientists

And now, something we can feel good about. This Sunday marks the start of National Farmers Market Week, an annual celebration of local food systems. To get us in the mood, here are seven facts that illustrate the benefits of farmers markets and local food systems.

FACT #1: There are 8,690 farmers markets nationwide. This may actually be a low-ball count, but it’s the number of markets currently listed in the US Department of Agriculture’s (USDA’s) National Farmers Market Directory. Washington, DC, where I live, is particularly fertile ground for farmers markets—the interactive database lists more than 60 markets within five miles of my home (try it for your state or ZIP code). But farmers markets have become commonplace across most of the country, as illustrated by this rather crowded national map generated from the USDA’s data:

FACT #2: In 2015, more than 167,000 US farms sold $8.7 billion worth of food directly to consumers, retailers, institutions (such as hospitals and schools), and local distributors. This was the finding of a farmer survey published by the USDA last year. The survey further found that more than one-third of those sales ($3 billion) were made directly to consumers via farmers markets, CSAs, farm stands, and the like.

FACT #3: Participants in the federal Supplemental Nutrition Assistance Program (SNAP) redeemed more than $20 million in benefits buying food from local farmers in FY 2016. That’s up a staggering 638 percent from 2008. The data from the USDA’s Food and Nutrition Service, which tracks purchases made with SNAP benefits (formerly known as food stamps), shows that nearly 7,000 farmers markets and individual farmers across the country are authorized to accept these benefits. And the USDA’s Food Insecurity Nutrition Incentive (FINI) grant program, established by Congress in the 2014 farm bill, is helping to increase purchases of fruits and vegetables among SNAP participants by subsidizing these purchases at farmers markets and other outlets.
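For the arithmetically inclined, that 638 percent figure implies a striking starting point. Here's a quick back-of-the-envelope sketch (using only the rounded figures above):

```python
# Back-of-the-envelope check using only the rounded figures quoted above.
# A "638 percent increase" means FY 2016 redemptions were 1 + 6.38 = 7.38
# times the 2008 level.
redeemed_2016 = 20_000_000      # dollars redeemed at farmers markets, FY 2016
growth_factor = 1 + 638 / 100   # a 638 percent increase

implied_2008 = redeemed_2016 / growth_factor
print(round(implied_2008 / 1e6, 1))  # ~2.7 (million dollars)
```

In other words, SNAP redemptions at farmers markets started from roughly $2.7 million in 2008 and grew more than sevenfold in eight years.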

FACT #4: Three out of four farmers who sell at farmers markets use practices that meet or exceed organic standards. That was the finding of a 2015 survey by the non-profit Farmers Market Coalition and American Farmland Trust. More details from the survey: Nearly half of farmers used integrated pest management, which draws on information about the life cycles of pests and their interactions with the environment to manage and prevent crop damage. And the overwhelming majority (81 percent) incorporated cover crops, reduced tillage, on-site composting, and other soil health practices into their operations. (Read more about the importance of soil health here.)

FACT #5: Farms selling fruits and vegetables locally employ 13 full-time workers per $1 million in revenue earned, for a total of 61,000 jobs in 2008. A report by the USDA’s Economic Research Service compared these farms with fruit and vegetable growers not engaged in local food sales, and found the latter employed just three full-time workers per $1 million in revenue.

FACT #6: Farmers themselves benefit economically from farmers markets, pocketing upwards of 90 cents for each dollar of sales there. So says the Farmers Market Coalition. And how does that compare to the return for US farmers overall? The National Farmers Union estimates that farmers’ share of every dollar Americans spend on food in 2017 is a paltry 15.6 cents.

FACT #7: Sales increased by more than one-quarter at farmers markets participating in the USDA’s Farmers Market Promotion Program (FMPP) between 2006 and 2011. Established by Congress in the 2002 farm bill, the FMPP is a competitive grant program designed to increase access to locally and regionally produced foods and develop new market opportunities for farmers. To measure this program’s impacts, researchers in 2012 surveyed organizations awarded grants during the previous six years. In addition to a 27 percent sales increase, the survey also showed that customer counts increased by 47 percent at markets that received FMPP grants, and the number of first-time customers increased at nearly all (94 percent) of these markets.

BONUS “FACT”: Everyone loves a farmers market. Everyone. Even this guy (who is normally not a big fan of facts), proving that farmers markets really do bring people together.

But seriously, the benefits of local food systems—for farmers, consumers, and communities—are worth investing in. Secretary of Agriculture Sonny Perdue recognized those benefits with his official proclamation of the week. But now it’s up to all of us to ensure that the secretary and Congress make those investments. That’s why UCS is advocating for USDA programs (including SNAP, FMPP, and FINI, among others) that are helping to ensure that farmers markets and local food systems thrive.

Happy National Farmers Market Week!

On Healing Sick Ecosystems

Part of the Lehigh Gap Nature Center site before remediation, October 2002. Photo credits:

I am a person who is fascinated by organisms of all kinds. I like the cute fuzzy ones that most people like, but also the scaly, leafy, prickly, stinky, or slimy ones, as well as the ones we can’t see without a microscope but that have outsized effects on the world around them. I am amazed by how many different ways there are to be alive on this planet, and moved by the intricate connections living things have with each other and their environments.

As I began to study the diversity of life, I noticed a pattern: many creatures are in danger because we humans are unintentionally destroying their homes. Whether by pollution, climate change, or clearing habitats to build things of our own, we have made much of the world less habitable for the living things with which we share it. We have already driven some species extinct, and many others are perilously close.

I believe there is a compelling moral case for preserving healthy, diverse ecosystems. There is also a strong practical case: we depend on intact ecosystems for services like clean water, fresh air, and pollinators that help our crop plants reproduce. Living near green spaces also improves our health and society as a whole. Thus, I chose a career studying how to help ecosystems best recover from our more destructive impacts. In my PhD research in Prof. Brenda Casper’s lab at the University of Pennsylvania, I studied how interactions between plants, soil-dwelling microbes, and heavy metals can affect the long-term development of ecosystems on metal-contaminated soils.

One of the two zinc smelters responsible for heavy metal pollution in the Palmerton Zinc Superfund Site. Photo credit: Lee Dietterich

Pollution and remediation: one site’s story

I conducted my studies in the portion of the Palmerton Zinc Superfund Site owned and managed by the Lehigh Gap Nature Center. The site consists of over 2,000 acres on the side of a mountain in eastern Pennsylvania that was devastated by heavy metal pollution from two zinc smelters operating for much of the 20th century. When the site was at its worst, local residents and passersby on the Appalachian Trail, which traverses the site, frequently compared it to the surface of the moon, or the aftermath of a bomb explosion.

The site badly needed some kind of remediation to remove or contain the pollutants and mitigate their threat to human and environmental health. It was (and still is) crucial that remediation be guided by our best scientific understanding of site histories and the effects of heavy metals on humans and the environment. Interference in the form of censoring data about such sites, or letting corporate or political priorities dominate discussions about environmental stewardship, can only make remediation longer and more difficult.

Part of the Lehigh Gap Nature Center site before and after initial remediation (left, October 2002; right, August 2006). Photos:

Today, after over a decade of intensive remediation work involving scientists, community members, and numerous federal, state, and private organizations, the mountainside bears no resemblance to the moonscape described above. Grass species with low metal uptake were planted to build healthy soil while keeping the metals sequestered underground. These grasses, now taller than most people, tower and sway in the breeze. In many places shrubs and small trees are coming in, and in the patches of forest that survived the pollution, dense canopies create cool shade over lush carpets of ferns. Birds, grasshoppers, and butterflies are diverse and abundant, and it is not uncommon to encounter deer at dawn or dusk. Hundreds of hikers and thousands of schoolchildren visit the area each year, largely thanks to land management and educational offerings by the Lehigh Gap Nature Center, which now owns about a third of the site.

Sustained collaboration between scientists, land managers, and community members has been essential to this remediation effort. Early in the process, researchers made valuable contributions by documenting effects of the polluted soils on the site’s plants, animals, and microbes and by testing numerous revegetation strategies. Remediation of a polluted site had not been attempted on such a large scale before, and this early testing was key to the successful establishment of large-scale plantings.

Wildlife returning to the Lehigh Gap Nature Center site. Photo credit: Lee Dietterich

Continuing challenges

Remediation of disturbed landscapes is an ongoing task, and both basic and applied scientific research are crucial to understanding how to do this task well. Many fundamental questions remained when I began working in the site. For instance, we knew that a group of soil-dwelling fungi called arbuscular mycorrhizal fungi (AMF), which trade nutrients to plants in exchange for sugars, were important for the growth of many plants there, but we had little idea how AMF might affect plant metal uptake or metal tolerance under field conditions. After a couple years of work at the site, in the lab, and on the computer, I found that mycorrhizal fungi have little effect on plant metal uptake, but that there is a remarkably close relationship between a plant’s species identity and the chemistry of the soil underneath it. This suggests that once plants are growing in an area, adding AMF will have little effect on their metal uptake. However, knowing what plants are growing in a certain patch of soil can tell us a lot about that soil’s chemistry.

The researchers and managers of the Palmerton site also feared that an uninvited tree species, gray birch, accumulated such high leaf metal concentrations that its leaf litter would elevate metals at the soil surface and poison neighboring plants, including the grasses they had worked so hard to establish. This pollution of soil via leaf litter had been hypothesized to occur but had not yet been thoroughly tested, and the Palmerton site seemed like an ideal setting for such a test. Again, I investigated. After a couple years of study, including planting, monitoring, harvesting, and analyzing nearly 500 oak and maple seedlings in the site, my colleagues and I found that metal-contaminated birch leaf litter does not increase surface soil contamination or poison other plants. However, soils under the birches and grasses do differ in their concentrations of metals and organic matter in ways that could shape the continued trajectory of plant community development in the site.

Science-based decision making helps us reclaim and remediate ecosystems. Photo credit: Lee Dietterich

How lessons learned from remediation help us rebuild ecosystems better

These findings are already shaping the course of continued remediation and broadening our more general understanding of how metal polluted ecosystems work. We now know that efforts to control plant metal uptake may be better served by altering soils or plant communities directly than by manipulating AMF. We also know that gray birch does not threaten remediation as was feared, though concerns remain that it may shade out the desired grasses or introduce metals into the food chain via its leaves. Furthermore, thanks to the work of dozens of other scientists in this and other contaminated sites, we are learning important information about the continued legacies of pollution, such as how metals do and do not move in groundwater, and the effects of contaminated sites on migrating birds that rest and feed there.

It is clear that conserving healthy, intact ecosystems remains preferable to disturbing them and then trying to rebuild them. As with most diseases, prevention remains far easier and cheaper than cure. However, for those landscapes we have already damaged, science is providing local residents and land managers with tools to improve their lives—and those of their invaluable fuzzy, leafy, or slimy neighbors—by reclaiming and restoring healthy ecosystems.


Lee Dietterich is an ecologist studying how interactions between plants and soils affect the movement of elements such as carbon, nutrients, and heavy metals in and through ecosystems.  He is currently a postdoctoral scholar in Prof. Daniela Cusack’s lab at UCLA.  When not doing science or exploring nature, he likes to play the piano and clarinet. 

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Dead Zone 2017: Even Worse than Predicted (and That’s an Understatement)

This map of dissolved oxygen levels in the Gulf of Mexico shows the extent of the dead zone in July 2017. Courtesy of Louisiana Universities Marine Consortium.

There’s more bad news for the Gulf of Mexico. A team led by researchers at Louisiana State University this week confirmed the largest Gulf dead zone since standardized measurement began in 1985. The lifeless area of low oxygen in the Gulf is now at least the size of New Jersey, the researchers say, noting in a press release that because they couldn’t map the entire affected area, their measurement is an understatement of the problem this year.

As I wrote when it was predicted in June, the dead zone arises each year due to a phenomenon called hypoxia—a state of low dissolved oxygen that occurs when excess pollutants, such as nitrogen and phosphorus, accumulate in bodies of water. These nutrients enter the Gulf largely as runoff from Midwestern farm fields carried by the Mississippi River and its tributaries. They feed blooms of algae that die and decompose, and the process depletes the oxygen in the surrounding water. Fish and other creatures must either flee or suffocate.

We need to reduce farm runoff…a LOT

This year’s dead zone confirmation comes on the heels of a new study last week by scientists at the University of Michigan and Louisiana State University, who looked at what would be needed to decrease the size of the annual Gulf dead zone. Note these researchers aren’t talking about eliminating the dead zone, just shrinking it down…to the size of Delaware. (Delaware! Still an entire state, and not even the smallest one.)

Accomplishing just that, they estimated, would require bold action—a 59-percent reduction in the amount of nitrogen runoff coming from farmland in the Midwestern Corn Belt. That’s a very large reduction, one that farmers almost certainly can’t achieve just through more careful applications of fertilizer. The study’s lead author, University of Michigan aquatic ecologist Don Scavia, was pretty clear about that:

“The bottom line is that we will never reach the Action Plan’s goal of 1,950 square miles until more serious actions are taken to reduce the loss of Midwest fertilizers into the Mississippi River system,” Scavia said.
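To put this year's dead zone and the Action Plan's goal side by side, here's a rough comparison. Note that New Jersey's area (about 8,700 square miles) is an outside figure of mine, not a number from the study:

```python
# Rough comparison of this year's dead zone with the Action Plan goal.
# New Jersey's area (~8,700 sq mi) is my own approximation, not from the study.
dead_zone_sq_mi = 8_700   # "at least the size of New Jersey"
goal_sq_mi = 1_950        # Action Plan target quoted above

print(round(dead_zone_sq_mi / goal_sq_mi, 1))  # ~4.5 times the goal
```

So even shrinking the dead zone to "only" Delaware-sized would mean cutting its area by more than three-quarters.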

Instead, the Midwest farming system—which today leaves soil bare and vulnerable to runoff for months out of every year—will need to change.

And speaking of change, our changing climate poses a further challenge to tackling dead zones and related water quality problems in the Gulf and elsewhere. Yet another new study published in Science last week linked toxic algae blooms—a problem also caused by excess nutrient runoff, in which algae by-products can make water unsafe to drink or swim in—to climate change. The authors warned that increased rainfall in future years may wash even more fertilizer into rivers and lakes (e.g., Lake Erie), worsening the problem.

Soil is the solution!

Okay, so we need dramatic reductions in current rates of fertilizer runoff in the Midwest. In recent months, UCS has documented innovative farming practices such as extended crop rotations and perennial prairie plantings that can substantially reduce fertilizer use and runoff. We’ve also shown that such practices and systems are also good for farmers’ bottom lines.

And the evidence that farming practices that keep soil covered year-round can solve multiple problems by reducing rainfall runoff keeps coming. Next week, we’ll release an exciting new report that shows how farmers can create healthier, “spongier” soils, and how Congress and the USDA can help. Stay tuned!

Three Renewable Energy Numbers to Impress Your Friends With: 7, 43, 50

A whole new perspective. (Credit: UCS)

Next time you’re talking with a friend about the exciting things happening in our electricity sector (aren’t you always?), here are three easy numbers for remembering how we’re doing: 7, 43, and 50.  That’s: wind energy’s progress, solar energy’s growth, and the number of states making it happen.

Wind’s growth = 7

Renewables on the Rise, a new report from the Environment America Research & Policy Center and the Frontier Group, details some of the progress we’ve made in this country over the last decade, and includes handy accompanying graphics. Here’s a glimpse of what it all looks like.

Credit: Environment America/Frontier Group

Growth in renewable energy in recent years has meant we produced almost seven times as much wind-powered electricity in the US in 2016 as we did in 2007. And wind’s share of our national electricity generation increased from 0.8% to 5.5%.

All told, the tens of thousands of wind turbines dotting the landscape generate enough to cover the electricity needs of some 25 million typical American homes.

The wind action is taking place from coast to coast and particularly in plenty of places in between, from coal-has-been-king-but-here-comes-wind Wyoming to where cod rule (think offshore wind).

And, increasingly, wind is an energy option that decision makers ignore (or get wrong) at their peril.

Solar’s growth = 43

Credit: Environment America/Frontier Group

Recent gains have in some ways been even more impressive for solar. The baseline is maybe a little tough to pin down (and our own calculations suggest an even greater growth), but the new report says that we got 43 times as much electricity from solar in 2016 as in 2007.

That steep upward trajectory has taken solar from a minuscule 0.03% of US electricity generation to 1.4%. Still small, but definitely noticeable—and definitely worthy of notice, in terms of solar past and future. As my colleague Julie McNamara points out in that post, our 19.5 billion kilowatt-hours of solar generation in 2016 would have been enough to cover residential electricity needs in half the states.
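As a rough sanity check on both growth multiples, you can divide the 2016 shares by the 2007 shares quoted above. This assumes total US generation was about flat over the decade (it wasn't exactly, which is why the ratios only roughly match the report's multiples):

```python
# Sanity-checking the growth multiples using only the shares quoted above.
# Assumes total US generation was roughly flat from 2007 to 2016, so the
# ratio of shares approximates the ratio of generation.
wind_share_2007, wind_share_2016 = 0.8, 5.5     # percent of US generation
solar_share_2007, solar_share_2016 = 0.03, 1.4  # percent of US generation

print(wind_share_2016 / wind_share_2007)    # roughly 7, matching wind's multiple
print(solar_share_2016 / solar_share_2007)  # roughly 47, near the report's 43
```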

And solar, like wind, isn’t resting on its laurels. Just last year, the US industry installed enough new solar capacity to provide 2 million homes’ worth of electricity.

States involved = 50

So where’s all this progress coming from? Though some are still finding their way, every state has some generation from solar and wind, and some have taken those technologies to pretty impressive heights.

For Texas, it (mostly wind) added up last year to 59 billion kilowatt-hours of electricity—enough to keep 4 billion light bulbs burning every evening of the year. In North Dakota, wind generation added up to the equivalent of 45% of the state’s electricity consumption; in Iowa, 42%. For California and Hawaii, solar, with help from wind, produced enough to have accounted for one out of every six kilowatt-hours consumed.
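Curious how 59 billion kilowatt-hours becomes 4 billion light bulbs? The bulb wattage and hours of use aren't stated, but the math works out if you assume 10-watt LED bulbs lit for four hours each evening (both numbers are my guesses):

```python
# Reverse-engineering the "4 billion light bulbs" figure. The bulb wattage
# and hours per evening are my own assumptions, not stated in the original.
texas_kwh_2016 = 59e9                  # Texas wind (mostly) generation, kWh
kwh_per_bulb_year = 0.010 * 4 * 365    # a 10 W LED, 4 hours/evening, 365 evenings

bulbs = texas_kwh_2016 / kwh_per_bulb_year
print(round(bulbs / 1e9, 1))  # ~4.0 (billion bulbs)
```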

Sure, some states really need to get in on the action in a much bigger way (the details in the back of the new report help highlight leaders and… others). And in doing so, they’d reap all the benefits that renewables have to offer.

But even states without much yet on the generation side are contributing—and benefiting—in other ways, through manufacturing, for example, of components for solar or wind installations (see map). And that progress has meant jobs—in most cases, more solar and wind jobs than coal has to offer.

Our 50 united states are far from done. Every one of them has a lot more potential in solar, wind, and other renewables. Taking it the next step, and beyond, will be crucial.

But for a moment, acknowledging and celebrating clean energy progress is really important. For that, for the near term, just remember 7, 43, and 50.

Credit: American Wind Energy Association


NRC’s Decision Making: 18 Reasons Why You Are Right, but Wrong

UCS Blog - All Things Nuclear (text only) -

As described in a prior blog post, the Unit 3 reactor at the Palo Verde Generating Station had one of two emergency diesel generators (EDGs) explode during a test run. The license issued by the Nuclear Regulatory Commission (NRC) allowed the reactor to remain running for up to 10 days with one EDG unavailable. Fixing the heavily damaged EDG would require far longer than 10 days, so the plant’s owner submitted requests to the NRC for its permission to run the reactor for up to 21 days and then up to 62 days with only one EDG available.

As described in a followup blog post, NRC staffer(s) filed formal opposition to the agency’s approval of the owner’s requests by initiating Differing Professional Opinions (DPOs). Under the NRC’s DPO process, a DPO panel is formed to review the issue and to document its findings and conclusions in a report to the NRC senior manager who makes the final decisions. In this matter, that individual was the Director of the Office of Nuclear Reactor Regulation (NRR). The DPO originator(s) can nominate one individual to serve on the DPO panel. (The DPO process requires a minimum of three persons on the DPO panel, ensuring that the panel won’t have a majority of members sympathetic to the originator(s)’s concerns.)

The DPO panel issued its report on June 5, 2017, and the NRR Director issued his decision on June 28, 2017.  The NRC made the DPOs, the DPO panel report, and the NRR Director’s decision publicly available on July 21, 2017.

The DPO Originator

Troy Pruett originated both of the DPOs in the Palo Verde EDG case. Mr. Pruett is the Director of the Division of Reactor Projects in NRC Region IV. Among other things, Mr. Pruett oversees the NRC’s resident inspectors at all of the nuclear power plants operating in Region IV, including Palo Verde. Mr. Pruett has worked for the NRC for nearly a quarter century—long enough to know the agency’s regulations and procedures intended to protect nuclear plant workers and the American public inside and out (Fig. 1).

Fig. 1 (Source: Nuclear Regulatory Commission)

The DPO Originator’s Position

In his DPOs, Mr. Pruett contended that the owner’s requests to operate Palo Verde Unit 3 for up to 21 and later up to 62 days with one emergency diesel generator unavailable should not have been approved because they departed from the agency’s regulations, procedures, and practices.

The DPO Panel’s Conclusion

Quoting from their report: “The DPO Panel was not unanimous in concluding that Palo Verde License Amendments 199 and 200 should have been approved by the staff.”

One of Mr. Pruett’s candidates was appointed to the DPO Panel. NRC management selected three other members, assuring they’d have a majority. And sure enough, a majority of the NRC management-appointed panel sided with NRC management.

The DPO Panel’s 18 Observations

The DPO Panel’s report contained 18 Observations about the processes used (and not used) en route to the Palo Verde EDG approvals:

(1) The owner submitted two licensing requests to the NRC: one to operate for up to 21 days and the second to operate for up to 62 days with one EDG unavailable. The overwhelming majority of the hundreds of licensing requests submitted to the NRC each year are not bifurcated in this way and the agency’s procedures for reviewing licensing requests do not address such “split” requests. The DPO panel recommended that additional guidance be provided in LIC-101, the NRC’s procedure for handling such reviews, if the practice becomes more frequent.

(2) The DPO Panel noted that the staff’s reasoning for the two-step approach could have been made clearer in the first approval in the interest of transparently providing a complete record of the staff’s decision basis to the public.

(3) The DPO Panel observed that there may be opportunities to more effectively communicate with the public, including the use of less formal communications tools, during emergency requests, and suggests that guidance and training be considered in this area.

(4) The second approval issued by the NRC staff contained an explicit requirement to shut down Unit 3 if workers found the cause of the EDG’s failure could also disable the surviving EDG. The DPO Panel noted that no formal regulatory commitment existed in the first approval. During interviews, the NRC staff was not able to provide a sufficient basis as to why a similar condition was not included.

(5) The DPO Panel observed that the Safety Evaluation issued by the NRC staff in support of the first approval lacked sufficient documentation to objectively identify the staff’s decision basis in several key areas, including how the potential for common cause failure of the surviving EDG was evaluated and the basis for a 21 day EDG outage time. In other words, the NRC staff failed to ask and answer all the relevant safety questions.

(6) The DPO Panel concluded that the use of a zero test and maintenance assumption (i.e., no other safety equipment would fail or be unavailable while the EDG was broken) in the owner’s probabilistic risk assessment (PRA) model was not consistent with Regulatory Guide 1.177 guidance and the regulatory commitment put in place for the second approval for the conduct of routine maintenance and surveillance was not consistent with PRA assumptions.

In other words, when the EDG was unbroken, the owner’s risk assessment assumed that there was a small, but non-zero, chance that the highly reliable emergency equipment would not perform needed safety functions during an accident. But when evaluating the risk during the 62 days the reactor would operate with a broken EDG, the owner’s risk assessment assumed that all emergency equipment would function perfectly. The DPO Panel found this assumption unrealistic, non-conservative, and contrary to longstanding NRC expectations.
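To see why the Panel called the assumption non-conservative, consider a toy illustration (emphatically not the actual Palo Verde PRA; every number here is hypothetical). With one EDG already broken, a station blackout requires both losing offsite power and having the surviving EDG fail on demand. Setting that last failure probability to zero makes the computed blackout risk vanish entirely:

```python
# Toy illustration of a "zero test and maintenance" assumption -- NOT the
# actual Palo Verde PRA. All probabilities here are hypothetical.
p_loss_of_offsite = 0.01   # chance offsite power is lost during the outage window
p_edg_fail = 0.05          # chance the surviving EDG fails on demand

# Realistic model: the surviving EDG can fail, so blackout risk is non-zero.
p_blackout_realistic = p_loss_of_offsite * p_edg_fail

# Zero-failure assumption: surviving equipment is perfect, so the
# computed blackout risk collapses to zero.
p_blackout_assumed = p_loss_of_offsite * 0.0

print(f"{p_blackout_realistic:.1e}")  # small but non-zero
print(p_blackout_assumed)             # 0.0 -- understates the risk
```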

(7) The DPO Panel recommended that additional guidance be evaluated with respect to defense-in-depth, the adequacy of long duration equipment outage periods, and whether there should be a backstop (i.e., a maximum outage period).

(8) The DPO Panel concluded that the Branch Technical Position 8-8 guidance was not strictly adhered to for the two approvals. The DPO Panel recommended that deviations from established guidance should be documented and justified. The Branch Technical Position explicitly stated that the NRC staff should not even review a request to operate for longer than 14 days with one EDG unavailable; in this case, the NRC staff not only reviewed such a request, they approved it without explaining why they dismissed the 14-day maximum duration.

(9) The DPO Panel confirmed that the NRC’s safety evaluation supporting the first approval did not include an independent verification of the owner’s risk evaluations.

(10) The DPO Panel identified that the second approval used the three risk-informed tiered review approach outlined in Regulatory Guide 1.177. The DPO Panel pointed out this approach was inconsistent with Standard Review Plan 16.1 guidance, which states that Regulatory Guide 1.177 only applied to permanent (as opposed to temporary or “one-time”) changes. In other words, the NRC staff used an approach not allowed by the agency’s procedures.

(11) The DPO Panel found no discernible differences between the DC Cook request for a 65-day EDG outage in June 2015 and the Palo Verde request. However, the staff appears to have arrived at entirely different conclusions, based upon different interpretations of the deterministic guidance of Branch Technical Position 8-8. Cook’s owner sought the NRC’s permission to operate Unit 1 for up to 65 days with one of two EDGs unavailable. The NRC said no to Cook’s owner and yes to Palo Verde’s owner, citing Branch Technical Position 8-8 for each of the entirely opposite decisions.

(12) In the DPO Panel’s opinion, the Palo Verde risk evaluation warranted closer scrutiny. However, interviews of the NRC staff identified that there is no guidance for when to use the agency’s SPAR models for independent verification and it appears to be at the discretion of the reviewer(s).

(13) Section 4.2 of the NRC’s procedure for reviewing licensing requests, LIC-101, states that, “Decisions to not apply specific precedents, especially precedents cited by a licensee, should be clearly explained in the SE [NRC’s Safety Evaluation] (to avoid the appearance of being arbitrary and/or inconsistent).” The DPO Panel observed that neither of the Safety Evaluations prepared by the NRC staff for the two approvals addressed the licensee’s referenced precedents. In other words, the NRC staff did not follow the procedure they purportedly used to make the approvals.

(14) The DPO Panel found that both of the Safety Evaluations prepared by the NRC staff for its approvals included Branch Technical Position 8-8 in the list of regulatory guidance documents reviewed. The Safety Evaluations stated that Branch Technical Position 8-8 required more defense-in-depth for station blackout scenarios than for loss of coolant accident scenarios because of a higher likelihood of occurrence. But the DPO Panel found no such statement or implication about design basis accident likelihoods in the Branch Technical Position. In other words, the NRC staff departed from the regulatory guidance document it purportedly used to justify the approvals.

(15) The DPO Panel determined that there is no established guidance for how NRC staff should judge the adequacy of risk evaluations provided by plant owners. The good news is that the NRC staff cannot depart from non-existent guidance; the bad news is that the NRC staff can, and has, wandered all over the map since it lacks proper directions.

(16) The DPO Panel identified a lack of clarity in the existing review guidance and related inconsistencies in the understanding between the NRC departments regarding who is responsible for reviewing what in licensing requests.

(17) The DPO Panel recommended additional guidance be developed for the NRC staff when reviewing requests for extended periods of safety equipment unavailability.

(18) The DPO Panel recommended that a lessons learned review be conducted after significant or first of a kind licensing actions to determine if the action should be used as future precedent and/or whether there should be specific attributes identified that future staff should evaluate before using the precedent.

Grading on a (Möbius) Curve

If you read these observations, and the more voluminous supporting text in the report, before reading the conclusion, you’d likely think that the entire panel agreed with Mr. Pruett.

After all, Mr. Pruett contended that the requests departed from regulations. The DPO Panel’s Observations 5, 6, and 14 confirm that contention, and several others support it.

Mr. Pruett contended that the requests departed from the agency’s procedures. The DPO Panel’s Observations 1, 8, and 13 confirm that contention, and several others support it.

Mr. Pruett contended that the requests departed from the agency’s practices. The DPO Panel’s Observations 9, 10, and 11 confirm that contention, and several others support it.

But nooooo. The DPO Panel disagreed with Mr. Pruett.

You might ask why the DPO Panel could possibly have disagreed with Mr. Pruett.

If you do ask and someone gives you a straight answer, please forward it to me. I’ve monitored the NRC for nearly two decades and I cannot fathom how the DPO Panel could assemble so many reasons why Mr. Pruett was right, and yet conclude he was wrong.

It’s like a 19-chapter mystery novel whose first 18 chapters describe how the upstairs maid committed crime after crime, only to have the butler, mentioned for the first time, arrested for the crimes.

Perhaps that explains it: the DPO Panel report is an intriguing work of fiction. Or maybe it only needed a non-fictional final chapter.

Agroecology to the Rescue: 7 Ways Ecologists are Working Toward Healthier Food Systems

UCS Blog - The Equation (text only) -

Ivette Perfecto and John Vandermeer in a shaded coffee farm in Chiapas, Mexico. They use diverse shaded coffee as a model system to study ecological complexity and its implications for farm management and biodiversity conservation.

A lot has been written about agroecology, and a new special issue of the journal Agroecology and Sustainable Food Systems takes it to the next level.

The new issue, entitled Agroecology: building an ecological knowledge-base for food system sustainability, expands the conversation by outlining recent progress in ecology relevant for tackling food system challenges ranging from disappearing diversity to water woes to climate catastrophes. Together, the eight included articles demonstrate a range of emerging science-based opportunities that can help farmers and ranchers achieve the triple bottom line: social, environmental, and financial sustainability.  Here are just the highlights of what some farm-focused ecologists have been up to:

  1. Making sense out of complexity: Agroecosystems are complex, and as Vandermeer and Perfecto (2017) explain, “the fundamental rules of natural systems should be used as guidelines for planning and management of agricultural systems.” Fortunately, ecologists have developed some great tools (drawing on topics like Turing patterns, chaotic dynamics, and more) that are up to the otherwise daunting task, and agroecologists are beginning to put them to work.
  2. Linking biodiversity to farming benefits: Decisions about how land is used at a regional scale can affect farming conditions at a surprisingly smaller scale, influencing even the pollinators and insect pests that are too small to spot unless you’re actually strolling through a field. As Liere et al. (2017) describe, understanding the connections between biodiversity at these different scales is essential to sustaining healthy, multi-functional agricultural systems. Agroecologists have just scratched the surface of investigating these “cascading” effects, and the subject is ripe for more discoveries.
  3. Keeping nutrients where we need them: It’s hard work keeping enough nutrients in some places (such as soils and plants) and reducing them in others (like in lakes and the atmosphere), but getting this right is a key to growing enough food while protecting the environment. Agroecologists tackle these problems with a bird’s eye view, measuring and evaluating everything from study plots to farm fields to watersheds. As Tully & Ryals (2017) note, this approach is critical to finding ways to optimize solutions (such as agroforestry, cover cropping, and organic amendments, just to name a few).
  4. Saving water by planting perennials: Much like nutrients, water often either seems to be overabundant (floods) or far too limited (drought), and climate change research suggests that this problem may only get worse. However, as Basche & Edelson (2017) review, farming practices that ensure “continuous living cover” can build healthier soils that keep more water on farms during dry times, while reducing flooding during heavy rain. Designing farms with water in mind, it seems, could prevent a lot of trouble, benefitting farmers and communities.
  5. Getting more from farming, with less: While adding more “inputs” (seed, water, chemicals) is typically understood to be the path to getting more food from farms, research has shown that this doesn’t always need to be the case. In fact, as Uphoff (2017) demonstrates through a review of the “System of Rice Intensification”, it can actually be possible to get more food by using fewer inputs. As Uphoff explains, “As climate and other conditions constrain agriculture, sustainable food systems will need to evolve.” Thanks to agroecologists, the evolution has already begun.
  6. Tracking down triggers for a food system transition: It’s one thing to find on-farm solutions and another to scale them up. Given that agroecologists have already been uncovering solutions to many of today’s challenges, what’s the next step? As my colleagues and I (Miles et al. 2017) explain, in an ever-changing world where one-size-fits-all solutions simply won’t work, the research (and the university extension and education that goes with it) must continue to expand. But since research won’t be enough, we also propose several policy ideas (like shifting public research funds, improving conservation programs, etc.) that could help push and pull the food system to a better place.
  7. Exploring how healthier farms can support healthier humans: Much agroecology research to date has been focused on achieving productive farms and environmental sustainability, both of which have clear benefits for human health (for example, by addressing food security and securing cleaner air and water). But as my colleagues and I (O’Rourke et al. 2017) argue, there’s an urgent need for more explicit research on how healthier farms can improve nutrition and public health. With an expanding agroecological toolbox and an ever-increasing concern about health care costs, perhaps there’s never been a better time!

Kate Tully collects porewater from a lysimeter installed in a farm field on the Eastern Shore of Maryland to determine how effectively the system is recycling nutrients. Photo: Christopher Blackwood

Agroecologists in action

To close, I wanted to share this excerpt from agroecologist and Agroecology and Sustainable Food Systems Editor Steve Gliessman:

“Ecology has always been the foundation of agroecology. We hope that this Special Issue will encourage more ecologists to engage in ecological research that can impact food system change. Their expertise in the science of ecology can show how an ecological understanding of the design and management of food systems can help us take major steps toward sustainability.”

Or, in other words, three cheers for agroecology! Onwards.

What’s in the special issue:

Agroecology: building an ecological knowledge-base for food system sustainability Agroecology and Sustainable Food Systems, Volume 41, Issue 7, 2017

Editorial: Agroecology: Building an ecological knowledge-base for food system sustainability Steve Gliessman Pages: 695-696 | DOI: 10.1080/21683565.2017.1335152

Ecological complexity and agroecosystems: seven themes from theory John Vandermeer & Ivette Perfecto Pages: 697-722 | DOI: 10.1080/21683565.2017.1322166

Intersection between biodiversity conservation, agroecology, and ecosystem services Heidi Liere, Shalene Jha & Stacy M. Philpott Pages: 723-760 | DOI: 10.1080/21683565.2017.1330796

Nutrient cycling in agroecosystems: Balancing food and environmental objectives Kate Tully & Rebecca Ryals Pages: 761-798 | DOI: 10.1080/21683565.2017.1336149

Improving water resilience with more perennially based agriculture Andrea D. Basche & Oliver F. Edelson Pages: 799-824 | DOI: 10.1080/21683565.2017.1330795

SRI: An agroecological strategy to meet multiple objectives with reduced reliance on inputs Norman Uphoff Pages: 825-854 | DOI: 10.1080/21683565.2017.1334738

Triggering a positive research and policy feedback cycle to support a transition to agroecology and sustainable food systems Albie Miles, Marcia S. DeLonge & Liz Carlisle Pages: 855-879 | DOI: 10.1080/21683565.2017.1331179

Insights from agroecology and a critical next step: Integrating human health Megan E. O’Rourke, Marcia S. DeLonge & Ricardo Salvador Pages: 880-884 | DOI: 10.1080/21683565.2017.1326073

Energy Department Scientists Barred From Attending Nuclear Power Conference

UCS Blog - The Equation (text only) -

Edwin Lyman, a physicist at the Union of Concerned Scientists, was one of 30 U.S.-based scientists scheduled to speak at the quadrennial International Atomic Energy Agency (IAEA) conference on fast breeder nuclear reactors in Yekaterinburg, Russia, in late June. Lyman did not attend the previous two conferences, in Kyoto in 2009 and Paris in 2013, and was looking forward to rubbing shoulders with hundreds of scientists from around the world, including more than two dozen from U.S. Department of Energy (DOE) national laboratories.

Shortly after he arrived, however, Lyman learned that the 27 DOE lab scientists listed in the conference program were no-shows. One session featuring a panel of four DOE lab scientists talking about code development was cancelled outright, Lyman said, while a handful of other panel discussions, originally comprising five or six speakers, soldiered on without U.S. participation. “On the first day, the DOE attaché at the U.S. embassy in Moscow gave a 20-minute talk about the U.S. fast reactor program and refused to take questions,” he said. “That was it for the Energy Department.” Three DOE scientists did attend the conference, according to the DOE, but none of them were part of the official program.

Sandra Bogetic, a University of California, Berkeley, doctoral student who presented a research poster at the conference, also couldn’t help but notice that the DOE scientists were missing. Bogetic’s poster session was slated to include presentations by 122 scientists from 17 countries, including a dozen scientists from DOE labs. The DOE scientists were nowhere to be found, and another five DOE scientists missed a second poster session the following day.

“Everyone was in shock that they didn’t show up,” Bogetic said. “It’s the most important conference for fast reactors, and it was a lost opportunity for U.S. scientists to share their work at a conference that takes place only every four years.”

Mum’s the word

Scientists planning to speak or present posters at the IAEA conference were asked to hand in their papers to conference organizers last December, five months before the event. The deadline was then extended into January, and at that point, the 27 DOE lab scientists were all on board to participate.

In early April, however, the DOE scientists received an email from Sal Golub, associate deputy assistant secretary for nuclear technology research and development at the DOE, indirectly telling them that the agency was not going to let them go.

“Yesterday,” Golub wrote, “we informally notified the IAEA conference organizers of the following: Representatives from the Department of Energy’s Office of Nuclear Energy and DOE/NE contractors at the National Labs are currently unable to travel to Russia, which means they will not be able to attend the IAEA’s Fast Reactor conference in June.” He also assured the scientists that the DOE was “working with the organizers to adjust the program to reflect our absence,” which obviously didn’t happen.

Golub gave no reason why DOE scientists were “unable” to travel to Russia, and, when I asked him for an explanation, he referred me to the DOE public relations office. Spokespeople at department headquarters in Washington, D.C., and the Argonne National Laboratory in Illinois — where 15 of the 27 missing DOE scientists are based — were equally unhelpful.

A DOE spokesperson in Washington, who declined to be identified, responded in an email: “We greatly value cooperation with the IAEA and plan to continue to do so whenever we can. The Department of Energy and the [U.S.] Embassy were represented at the event.”

Christopher Kramer, Argonne’s media relations manager, also avoided answering my question. “I can tell you that Argonne greatly values its relationship with the IAEA and plans to continue cooperation whenever we can,” he said in an email. “… From what I understand, Argonne did have two people in attendance at the conference in question.”

I emailed both PR officers back and again asked why the scientists weren’t at the conference. No response.

Finally, I called a random sample of the grounded scientists. It was another dead end.

“I wasn’t able to attend,” one said tersely. “I won’t talk about it.” Click. “We were told not to deal with outside media or organizations,” said another. Click. Two others were slightly more talkative, but neither could clear up the mystery. “I know very little about the decision” to cancel the trip, said one of the scheduled panelists. “It was above my pay grade. I basically followed orders from management.” The other scientist, a would-be poster session participant, was clearly perturbed. “The only reason I know is the [DOE] Office of Nuclear Energy wouldn’t let people go,” he said. “They didn’t give us a reason. I don’t know what their rationale is. Other U.S. government agencies are sending their people to Russia.”

Trump’s war on science or a new cold war?

So what’s the story behind the case of the missing DOE scientists?

It could come down to money. It’s certainly no secret that the Trump administration wants to slash DOE science spending. Just last month, for instance, the department closed its Office of International Climate and Technology, eliminating 11 staff positions. The office, which was established in 2010, provided technical advice to other countries on ways to reduce carbon emissions. The administration’s proposed federal budget, meanwhile, would cut the annual budget of the DOE Office of Science — the nation’s largest funder of the physical sciences — by 17 percent to $4.47 billion, its lowest level since 2008, not adjusting for inflation. Outlays for nuclear energy research would drop 28 percent. Even more drastic, the budget for the department’s Office of Energy Efficiency and Renewable Energy would plunge nearly 70 percent.

DOE spokespeople, however, didn’t cite financial constraints as a reason, and the cost of sending the scientists to Russia was presumably built into the fiscal year 2017 budget, which predated the Trump administration. In any case, Bogetic, the Berkeley grad student, told me that one of the scientists who wasn’t allowed to attend the conference asked the DOE if he could pay his own way. The answer was no.

It’s also tempting to chalk it up to the Trump administration’s war on science. Besides barring federal scientists from attending conferences, according to a new report by the Union of Concerned Scientists (UCS), the administration also has been preventing scientists from speaking publicly, dismissing key scientific advisors, denying public access to taxpayer-funded information, and ignoring scientific evidence to justify rolling back public health, environmental and workplace safeguards. The administration’s hostility toward federal scientists may well have been a factor.

The most likely explanation, however, is where the conference took place — Russia — and what it was about — nuclear energy.

U.S.-Russian relations, notwithstanding President Trump’s bromance with Russian President Vladimir Putin, have been deteriorating for quite some time. The White House is under investigation for possibly colluding with Moscow to undermine Hillary Clinton’s presidential campaign, and Congress just passed tougher sanctions on Russia for meddling in the 2016 U.S. election, annexing Crimea, and supporting eastern Ukraine separatists.

Nuclear-related relations between the United States and Russia are also frayed. Last October, in response to U.S. sanctions, Putin suspended a U.S.-Russian agreement to dispose of excess weapons grade plutonium; an agreement to cooperate with the United States on nuclear energy-related research; and a pact between the DOE and Rosatom — the Russian state atomic energy corporation — to conduct feasibility studies on converting six Russian research reactors to safer, low-enriched uranium.

Putin’s actions didn’t get much media attention, but they should have. Writing in the Bulletin of the Atomic Scientists last December, Siegfried Hecker, former director of the DOE’s Los Alamos National Laboratory, warned that “the Kremlin’s systematic termination of nuclear cooperation with the United States … sets the clock back, putting both countries at enormous risk and endangering global stability.”

Rosatom was the co-host of the June IAEA conference, which was held in Yekaterinburg mainly because the world’s largest operating fast reactor is only 35 miles away, at the Beloyarsk nuclear power plant. Conference participants were treated to a tour of the 880-megawatt BN-800 reactor, which began generating power last year, as well as its smaller predecessor, the BN-600, which has been running since 1980. There are only four other fast reactors currently in operation worldwide: one in China, two in India, and another one in Russia.

The IAEA conference, however, was not Russo-centric. Scientists from more than two dozen countries, including China, France, Germany, India, Japan, South Korea and Sweden, participated. And despite Russia’s suspension of nuclear cooperation with the United States, U.S. scientists were welcome.

“Scientists shouldn’t be limited by political problems,” said Bogetic. “We are scientists. We need to communicate.”

Lyman, the UCS physicist who participated in a panel discussion at the conference, agrees. “With so many communication channels between the U.S. and Russia now cut off, it’s essential to preserve scientific cooperation in areas where there is common ground between the two countries,” he said. “Preventing DOE scientists from attending the IAEA conference — for whatever reason — was shortsighted and ultimately self-defeating.”

This post first appeared in The Huffington Post.

Simorgh Launch: Iran’s Bigger Ride to Space Gets off the Ground

UCS Blog - All Things Nuclear (text only) -

Iranian press has announced a successful launch of the Simorgh space launch vehicle.

A couple of days later, there’s no sign of an orbital object being tracked from the launch. It’s not like US space surveillance to take so long to catalog a low-earth-orbiting object, so I don’t think one is forthcoming.

That nothing got to orbit may be either by design or failure. Iran tends not to announce its space program failures, and the video showed at least the early part of the launch went off without catastrophe. In any case, this would be the first successful launch of the Simorgh. We wrote a few pieces on Simorgh last year, in anticipation of its launch then: first, second, third.

What’s interesting about the Simorgh?

So far, all Iran’s satellites have been launched with the Safir rocket, which is significantly less capable than the Simorgh. The Simorgh is closer to North Korea’s Unha, but with two stages instead of three.

Simorgh was meant to be launched in 2010; its conspicuous absence could mean that its development has been harder than anticipated, or that sanctions on ballistic missile and space technology have limited Iran’s ability to get materials it needs, or that there have been test launches that have failed and not been reported. Last year’s test may have been one such failure, or it may have been a suborbital test.

Why would Iran want satellites?

A new satellite would be the fifth for Iran, following Omid (2009), Rasad (2011), Navid (2012), and Fajr (2015). These satellites were all launched by the Safir rocket.

These were all small satellites, 50 kg or lighter, lofted into such low-altitude orbits that atmospheric drag brought them down within weeks. I’ve not seen any data published from these satellites. Perhaps they didn’t work as anticipated, or perhaps the results were not impressive enough to burnish the program’s reputation.

The Simorgh is larger and more capable than the Safir, and can put heavier satellites into higher orbits. Larger satellites mean more capability, and higher orbits mean they will stay up longer. Iran is a large country with tough geography: big deserts & mountains. It’s prone to natural disasters such as earthquakes, and has adversarial neighbors. It could benefit from satellites for national security purposes as well as for economic & social development.

The Health Care Vote and the Ethic of Responsibility

UCS Blog - The Equation (text only) -

“It is immensely moving when a mature person, no matter whether old or young in years, is aware of a responsibility for the consequences of conduct and really feels such responsibility with heart and soul. He/she then acts by following an ethic of responsibility and somewhere reaches the point where he/she says: ‘Here I stand; I can do no other.’ That is something genuinely human and moving. And every one of us who is not spiritually dead must realize the possibility of finding themselves at some time in that position.”

I read this passage by the famous sociologist Max Weber when I was a student of government some thirty-five years ago at Wesleyan University, and I felt it captured the best of public service. But I hadn’t thought about it much until late last night, when Senators Collins, Murkowski, and McCain faced intense pressure to cast votes that would fulfill a longstanding Republican campaign promise, but likely deprive millions of health care.

They found themselves in the very position Weber wrote about in this passage. They rose to the occasion, and they said clearly with their votes, “here I stand, I can do no other.”

I am moved by this demonstration of the ethic of responsibility by these three senators. We all should be. With their votes, these senators drew us back from the edge of a precipice, and said “no” to a process that had sidelined medical and health experts from participating in this momentous decision, perhaps because they had lined up with near unanimity on one side of the issue.

Senator McCain’s rousing floor speech earlier this week, and Minority Leader Schumer’s gracious remarks after the vote, also point to a path forward for fixing our broken politics. It is not a new path; in fact, it is one we followed for many years, and it brought us lasting achievements like the civil rights laws of the 1960’s and the environmental protection laws of the 1970’s.

It is the path that solves problems using a rational, orderly, bi-partisan process, based on facts, science, the competition of the best ideas, and compromise. It is a long and slow path, ill-suited to the intemperate and partisan cravings of the loudmouths that dominate our political debate. But with rare exceptions, it is the only way that works in our democracy.

It is far too early to know whether this failed Obamacare repeal effort will return us to this better path, and there are many reasons to be skeptical about that. But maybe, just maybe, this act of conscience will set a powerful example that moves both sides away from bitter and polarizing gamesmanship. I hope so, and we at the Union of Concerned Scientists are determined to do what we can do to foster bi-partisan progress on other difficult problems, such as climate change, reforming our food system, and reducing the risks of nuclear weapons.

North Korean ICBM Appears Able to Reach Major US Cities

UCS Blog - All Things Nuclear (text only) -

Based on current information, today’s missile test by North Korea could easily reach the US West Coast, and a number of major US cities.

Reports say that North Korea again launched its missile on a very highly lofted trajectory, which allowed the missile to fall in the Sea of Japan rather than overflying Japan. It appears the ground range of the test was around 1,000 km (600 miles), which put it in or close to Japanese territorial waters. Reports also say the maximum altitude of the launch was 3,700 km (2,300 miles) with a flight time of about 47 minutes.

If those numbers are correct, the same missile flown on a standard trajectory would have a range of 10,400 km (6,500 miles), not taking into account the Earth’s rotation.

However, the rotation of the Earth increases the range of missiles fired eastward, depending on their direction. Calculating the range of the missile in the direction of some major US cities gives the approximate results in Table 1.

Table 1.

Table 1 shows that Los Angeles, Denver, and Chicago appear to be well within range of this missile, and that Boston and New York may be just within range. Washington, D.C. may be just out of range.

It is important to keep in mind that we do not know the mass of the payload the missile carried on this test. If it was lighter than the actual warhead the missile would carry, the ranges would be shorter than those estimated above.
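The physics behind converting a lofted test into a standard-trajectory range estimate can be sketched in a few lines. The model below is a deliberately crude approximation of my own (vacuum flight after burnout, non-rotating spherical Earth, burnout assumed at the surface, and the lofted shot treated as nearly vertical), not the method behind the 10,400 km figure. It comes out around 7,900 km, lower mainly because it ignores burnout altitude, the powered-flight phase, and the test’s nonzero ground range.

```python
# Crude lofted-test range estimate: vacuum, non-rotating spherical Earth,
# burnout at the surface, lofted shot treated as nearly vertical.
# Illustrative only -- not the method used for the published 10,400 km figure.
import math

MU = 3.986e5      # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6371.0  # mean Earth radius, km

def burnout_speed_from_apogee(apogee_km):
    """Speed (km/s) needed at the surface to coast up to the given apogee
    on a near-vertical trajectory (energy conservation in vacuum)."""
    return math.sqrt(2 * MU * (1 / R_EARTH - 1 / (R_EARTH + apogee_km)))

def max_range_km(v_burnout):
    """Maximum ground range (km) for that burnout speed on a
    minimum-energy trajectory over a non-rotating spherical Earth."""
    lam = v_burnout**2 * R_EARTH / MU        # nondimensional energy parameter
    half_angle = math.asin(lam / (2 - lam))  # valid while lam < 1
    return 2 * half_angle * R_EARTH

v = burnout_speed_from_apogee(3700.0)         # reported apogee of the test
print(f"burnout speed ~ {v:.1f} km/s")
print(f"vacuum range estimate ~ {max_range_km(v):.0f} km")
```

With the reported 3,700 km apogee this simple model yields a burnout speed near 6.8 km/s and a range near 7,900 km; more detailed modeling, which the published estimate relies on, pushes the number substantially higher.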


LA Metro’s Opportunity to Lead

UCS Blog - The Equation (text only) -

Today, Los Angeles Metro, the second largest transit agency in the United States, will vote on a plan to transition its fleet to zero-emission buses. If this sounds familiar, you’re right. It looked as though Metro would vote on this in June, but the vote got bumped to July.

Leading up to last month’s vote, Joel Espino from The Greenlining Institute and I blogged about the importance of this commitment and Metro’s leadership on clean vehicles. Metro’s decision will impact Los Angeles’ efforts to clean the air, fight climate change, and expand economic opportunity. We applaud the proposal put forward by Metro staff to transition the entire fleet to zero-emission vehicles.

So what else has happened in electric bus news this past month? Let’s catch up:

Major labor agreement announced

Last week, Jobs to Move America and BYD, an electric truck and bus manufacturer in Lancaster, California, announced commitments by BYD to create job pathways for underrepresented and underserved populations in Los Angeles County.

The legally enforceable agreement includes a hiring commitment for 40 percent of BYD’s workers to be from populations facing barriers to employment, such as veterans and returning citizens; creation of training and apprenticeship programs for metal work, electrical wiring, and vehicle assembly; and job retention efforts such as transportation options for workers without a car. This agreement sets an excellent precedent for creating good, accessible jobs in the electric vehicle industry.

ARB analysis shows electric buses are cost competitive today

Last month, the California Air Resources Board (ARB) released a draft analysis for the total cost of ownership for electric buses. This analysis takes everything into account from the purchase of a bus to its maintenance and electricity costs.

ARB found that the total cost of owning a battery electric bus in Los Angeles is on par with a compressed natural gas (CNG) bus. Metro’s fleet is entirely CNG today. The spreadsheet released by ARB will be an excellent resource as transit agencies in California and elsewhere analyze the potential financial savings from electric buses.
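The idea behind a total cost of ownership comparison is simple enough to sketch. Every number below is a hypothetical placeholder for illustration only, not a value from ARB’s spreadsheet:

```python
# Minimal total-cost-of-ownership sketch for comparing bus types.
# All inputs are hypothetical placeholders, not ARB data.

def total_cost_of_ownership(purchase, fuel_per_mile, maint_per_mile,
                            miles_per_year, years):
    """Lifetime cost: upfront purchase plus per-mile fuel and
    maintenance over the bus's service life (no discounting)."""
    lifetime_miles = miles_per_year * years
    return purchase + (fuel_per_mile + maint_per_mile) * lifetime_miles

# Hypothetical pattern: electric buses cost more upfront, less per mile.
electric = total_cost_of_ownership(
    purchase=750_000, fuel_per_mile=0.45, maint_per_mile=0.55,
    miles_per_year=40_000, years=12)
cng = total_cost_of_ownership(
    purchase=550_000, fuel_per_mile=0.90, maint_per_mile=0.75,
    miles_per_year=40_000, years=12)
print(f"electric: ${electric:,.0f}  CNG: ${cng:,.0f}")
```

The point of a lifetime view is visible even in this toy: a higher purchase price can be overcome by lower operating costs over hundreds of thousands of miles.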

Buses powered by electricity from Southern California Edison (SCE), the Los Angeles Department of Water and Power (DWP), Pacific Gas and Electric (PG&E), and San Diego Gas and Electric (SDG&E) are cost competitive with today’s diesel and natural gas buses. Both SCE and DWP provide electricity service to LA Metro.

Life cycle emissions analysis shows benefits of electric buses operated by Metro

Our life cycle emissions analysis of electric buses here, here, and here shows that on today’s grid in California, battery electric vehicles are the cleanest option. But what about for LA Metro, which has 9 of 11 bus depots that get electricity from the Los Angeles Department of Water and Power (DWP)?

DWP serves as its own grid operator or “balancing authority,” meaning it oversees electricity generation to meet demand within its service territory. So, a bus charged with electricity provided by DWP will have a different amount of life cycle global warming emissions than a bus charged with electricity from another part of the state (e.g. balanced by the California Independent System Operator).

Concerns have been raised by various groups that DWP’s current power mix, which relies on 24 percent coal (compared to 7 percent in the rest of the state), would neutralize any emissions reductions from the addition of electric buses to LA Metro’s fleet.

Using data from DWP’s Integrated Resource Plan and correspondence with their engineers, we found that on DWP’s grid today, life cycle global warming emissions for an electric bus are significantly lower than emissions from Metro’s current fleet of natural gas buses.

As DWP’s grid gets cleaner (notably with the phase out of coal by 2025 and increasing fraction of renewables to 55 percent by 2030), the life cycle global warming emissions will decrease even further (see graph above).

Metro’s proposal calls for 105 electric buses to be deployed in the next few years and the bulk of its electric buses to be deployed after 2020, meaning life cycle emissions are best represented by DWP’s grid post-2020.
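The core of a life cycle comparison like this is pairing a grid’s carbon intensity with a bus’s energy use per mile. The toy sketch below uses purely illustrative numbers, not values from our analysis or from DWP’s Integrated Resource Plan, to show why a cleaner grid steadily shrinks an electric bus’s footprint:

```python
# Toy life cycle comparison: electric bus on a given grid vs. a CNG bus.
# All inputs are illustrative assumptions, not figures from the analysis.

def electric_bus_gco2_per_mile(grid_g_per_kwh, kwh_per_mile):
    """No tailpipe, so emissions track the grid's carbon intensity."""
    return grid_g_per_kwh * kwh_per_mile

# Hypothetical grid intensities (g CO2e/kWh) as a coal-heavy grid cleans up.
grids = {"today (coal-heavy)": 650,
         "post-2025 (coal phased out)": 400,
         "2030 (55% renewables)": 250}
CNG_BUS_G_PER_MILE = 2600   # hypothetical CNG bus life cycle figure
KWH_PER_MILE = 2.2          # hypothetical electric bus efficiency

for label, intensity in grids.items():
    e = electric_bus_gco2_per_mile(intensity, KWH_PER_MILE)
    print(f"{label}: {e:.0f} g/mile vs CNG {CNG_BUS_G_PER_MILE} g/mile")
```

Even with a coal-heavy grid in this toy example, the electric bus comes out ahead per mile, and its advantage grows automatically as the grid mix improves.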

Even more electric bus news in California…
  • AC Transit (Alameda County) recently announced one of its 13 fuel cell buses achieved 25,000 hours of service, far surpassing the expected 4,000 hours and illustrating the durability of fuel cells and electric drive trains.
  • The National Renewable Energy Laboratory released its second report on Foothill Transit’s (San Gabriel Valley) electric bus fleet. The report found maintenance and fuel costs for electric buses were similar if not lower than natural gas buses. The fuel efficiency of electric buses also showed an eight-fold improvement over natural gas buses operating on the same route. Averaged over all routes, battery electric buses showed a four-fold improvement in fuel efficiency over natural gas buses.
  • And just yesterday, Proterra opened its West Coast battery electric bus manufacturing facility in the City of Industry outside of Los Angeles, expanding the company’s current capacity with its East Coast manufacturing facility in South Carolina and a battery facility in the San Francisco Bay Area.

This is an exciting time for clean vehicles and public transit. We encourage Metro to seize the opportunity to be a leader in fighting global warming and air pollution by adopting a strong plan to transition to zero-emission buses.

Marijuana and Nuclear Power Plants

UCS Blog - All Things Nuclear (text only) -

The Nuclear Regulatory Commission (NRC) adopted regulations in the mid-1980s seeking to ensure that nuclear power plant workers are fit for duty. The NRC’s regulations contained provisions seeking to verify that workers were trustworthy and reliable as well as measures intended to prevent workers from being impaired on duty. The former measures included background checks before workers could gain access to the plant while the latter components included drug and alcohol testing.

The regulations require that nuclear plant owners test workers for marijuana and alcohol use at the time of hiring, randomly thereafter, and for cause when circumstances warrant it. In 2014, marijuana use was the #1 reason for positive drug and alcohol tests among contractors and vendors and the #2 reason among nuclear plant employees. Alcohol was the #1 reason for positive tests among employees and the #2 reason among contractors and vendors. A positive test may not be a career killer, but it is often a career crimper.

Fig. 1 (Source: Nuclear Regulatory Commission)

Alcohol can be legally purchased and consumed in all 50 states. So, mere detection of having used alcohol will not result in a positive test. But detection of a blood alcohol concentration of 0.04 percent or higher yields a positive test. People have different metabolisms and alcoholic beverages come in different sizes, but that threshold is often equated to having consumed one alcoholic beverage within five hours of the test. Similar to the reason that states require motorists to not drive under the influence of alcohol (i.e., don’t drink and drive), the NRC’s regulations seek to control alcohol consumption by workers (i.e., don’t drink and operate nuclear plants).

Unlike the alcohol controls, the NRC’s ban on marijuana use is not based on the concern that it might make workers more likely to make mistakes or otherwise impair their performance, thus reducing nuclear safety levels. The NRC banned marijuana use because at the time marijuana was an illegal substance in all 50 states, and its criminal use meant that workers fell short of the trustworthiness and reliability standards in the fitness for duty regulation. Since the NRC adopted its regulation, 8 states have legalized recreational use of marijuana and another 12 states have decriminalized its use.

Fig. 2 (Source: NORML)

The NRC recognized that marijuana’s legalization creates potential problems with its fitness for duty regulation. If an individual uses marijuana in a state that has legalized or decriminalized its use but tests positive at a nuclear plant in a state where its use is not legal, is the individual sufficiently trustworthy and reliable? In the eyes of the NRC, the answer remains no.

Fig. 3 (Source: Nuclear Regulatory Commission)

The NRC conceded that no scientific basis comparable to the one underlying the alcohol limits links marijuana use to performance impairment. But the NRC continues to consider marijuana use as indicating that one lacks the trustworthiness needed to work in a nuclear power plant.

The NRC is in a hard spot on this one. Revising its regulations to eliminate marijuana as a disqualifier for working in a nuclear power plant would likely spawn news reports about the agency permitting Reefer Madness at nuclear plants. But the country’s evolving mores are undermining the basis for the NRC’s regulation.

Climate and Energy in the Health Care Sector: An Interview with Bill Ravanesi

UCS Blog - The Equation (text only) -

Health care has been in the headlines a whole lot lately, and it’s never far from our minds or wallets. It’s never far from our lungs or hearts, either—or, it turns out, our energy choices. How we make electricity, and what happens to our climate, have big implications for human health.

Our health care sector isn’t taking those connections lightly. Here’s what one expert had to say about how Massachusetts institutions are leading the way on connecting the dots.

Health care, as you might agree, is a big deal. It’s a $3.2 trillion piece of our economy, 18% of GDP. Asthma affects some 25 million US residents. More than 100 million people live in US counties that earn an “F” on ozone pollution from the American Lung Association. Meanwhile, power generation is a major source of air pollution and heat-trapping gases like carbon dioxide, and climate change has serious health implications of its own.

All that means that the health care sector can be a powerful source of positive momentum when it focuses on climate and energy issues.

Bill Ravanesi is the senior director of the health care, green building, and energy program of Health Care Without Harm. HCWH is a Massachusetts-founded organization that campaigns for environmentally responsible health care globally, a coalition of 450 health-related organizations from around the world “working to transform the health care sector, without compromising patient safety or care, to be more ecologically sustainable.” (More on Bill and HCWH below.)

Energy, resilience, and health

Bill is passionate about his work and excited about the progress being made. I had a chance to talk with him recently about climate, energy, and health care institutions, and in particular what’s going on at the intersection of those subjects in Massachusetts, a state known for its leadership in all three.

I asked Bill for his take on why it made sense for health care organizations to be thinking about climate and energy, and investing in solutions.

Well, they want to be anchoring community health and resilience. They feel very strongly that they’re part of their community, their neighborhood, and that they have to be there under all circumstances—24/7. I can tell you, many of these engineers see the patients as their patients. I’m not talking about the doctors. The engineers see people in beds in hospitals as their patients and work from that model… And they certainly recognized what happened to patients in both [Hurricane] Katrina, and Superstorm Sandy!

Some health care institutions in Boston have already taken a leadership position on being resilient—for what we’re going to see with sea-level rise, extreme heat, and precipitation in this area from climate change.

So that supports the idea of making the facilities resilient, using clean and efficient energy options to make sure energy is there when it’s needed, in the face of climate impacts like sea-level rise, in places like Boston. What about deals for renewable energy, either on their own facilities, or elsewhere, sometimes from several states away?

It makes sense on two counts to them. It is saving them money… and that’s carved out for the next twenty years not to go up. So, it makes sense economically to be doing this… It’s a win-win financially for them.

And then of course, health is part of their mission, and they see climate and health as a unit. And with renewable energy and reducing greenhouse gases, you’re reducing the pollution, you’re reducing the number of asthma cases coming into the hospitals, you’re reducing all kinds of respiratory illnesses, etc. In fact, it crosses a large arc of what happens with adverse health effects, from the heart to the neurological, to you-name-it. It’s a whole series of things here, so they’re being protective of their community.

You can’t look at energy costs or energy investments without thinking about the implications for human health, says Bill, for the near or long term:

In the state of Massachusetts, households spend six times more per household on health care than they do on energy. So, if you see the [Massachusetts] Department of Public Utilities or utilities, or whomever else is controlling, moving pieces around, like bringing in a new natural gas pipeline… you’re going to be shifting costs from energy into health care because we have to take care of the individuals who are going to be breathing in the pollution, the adverse health effects from fossil fuel development.

A 1.25 megawatt solar array on Partners Healthcare’s office complex in Somerville, Massachusetts, part of Partners’s 9 megawatts of installed solar capacity. The system generates the equivalent of 15% of the complex’s electricity use (Credit: Partners Healthcare).

Making strides, building momentum

So what’s actually happening? Plenty, says Bill, as he easily rattles off information about recent moves by some of the Boston area’s top health care facilities to put energy efficiency, renewable energy development, and carbon emission reductions front and center:

The Boston Medical Center, this past December, closed a deal to buy a significant piece of the output from a 60 megawatt solar field in North Carolina… This purchase will neutralize 100% of [the carbon emissions from] electricity consumption for BMC, putting them on target to be carbon neutral by the end of 2018.

Partners Healthcare has just done another deal for output from a 29 megawatt wind farm just over the Massachusetts line in New Hampshire…

Partners is also putting PV [solar photovoltaics] on most of their facilities. They have 13 or 14 different facilities around the state, from Cape Cod all the way into Boston—Mass General, Brigham and Women’s, and Spaulding Rehabilitation Hospitals. Partners’s goal is to be 100% renewable energy-powered for their entire healthcare system by 2025.

And resilience, from flooding, for instance, is an important piece of the energy work. Bill uses Partners as an example:

They have moved all of their critical electrical facilities out of the basements, up to higher elevations (out of harm’s way for flooding). So, if you look at the Spaulding Rehab, that’s running on a cogen unit [combined-heat-and-power system], when the grid goes down, they can still be operating. They put the cogen unit on the eighth floor; it’s 110 feet above sea level. I don’t think we’ll ever see a surge that high… Spaulding is considered one of the most resilient hospital buildings in the United States.

Vision for the long haul

These health care institutions, Bill says, have the longer-term perspective that the challenge of climate change calls for, and that is in keeping with Massachusetts’s landmark Global Warming Solutions Act, which lays out 2020 and 2050 goals for cutting state carbon emissions:

So, some of the leading facilities in Boston have this kind of vision going forward that they’re not just looking at the year 2020, as to what they can do. They’re looking at the year 2050. They’ve charted this out—what they need to do and when they need to do it to meet the Global Warming Solutions Act’s mandated target. And of course they’re going to be way ahead on the 2020 goal of a 25 percent reduction in greenhouse gas emissions.

Where does this kind of leadership in health care/climate/energy go from here? The sky is the limit, says Bill—or the globe.

We see Boston as the incubator. We take whatever the initiatives are—we try to run them here. And if they’re successful here, we run them nationally. If they’re successful nationally, we run them globally. So, it’s a great paradigm.

Of course this work doesn’t happen in isolation; my HCWH colleague Paul Lipke is a partner in these accomplishments. And I want to acknowledge Mariella Puerto of the Barr Foundation for her belief in and support for this work.

A recent analysis produced by HCWH for Boston’s Green Ribbon Commission documents a lot of successes already achieved in the local health care sector in terms of clean energy and resilience. The many institutions that Bill and his colleagues collaborate with, it suggests, are blazing a strong trail.

The health care industry has the power to appreciably move the needle on climate and energy progress. Leaders in the sector are harnessing that power, to the benefit of their communities and the world as a whole.

More on Bill Ravanesi: Bill has been with HCWH since 1997, and has received numerous awards, including the CleanMed Environmental Health Hero Award in recognition of his role in deepening our understanding of the critical links between health and the environment, and the USEPA’s Environmental Merit Award for outstanding efforts in preserving New England’s environment. Bill has a master’s degree in environmental health from the Robert Wood Johnson Medical School, and produced the national traveling exhibition and monograph Breath Taken: The Landscape & Biography of Asbestos.

More on Health Care Without Harm: HCWH’s three main goals for the next five years, says Bill, are to protect public health from the effects of climate change by reducing health care’s carbon footprint and accelerating “climate resilient” health systems; to transform the supply chain by establishing a core set of procurement criteria for low-carbon, zero-waste, and toxic-free products; and to activate healthcare’s leadership in society as a messenger for environmental health and climate change.

Maine Benefits From Space Observations: Will Congress Axe Them?

UCS Blog - The Equation (text only) -

The U.S. House of Representatives appropriations committee approved a budget that, according to figures my colleague Hannah Nesser calculated, cuts more than a quarter from NOAA’s National Environmental Satellite, Data, and Information Service (NESDIS) Systems Acquisition funding compared with the previous fiscal year’s enacted level. What exactly is on the chopping block for this and other cuts to NOAA and NASA? Are any vital to key economic sectors in Maine?


Enjoying the Maine coast (July 2017). Photo by Erika Spanger-Siegfried

This is the season when the population of Maine swells as the state welcomes many visitors to its beautiful coastline, inland lakes teeming with fish, and mountains to explore. I am also drawn to the special character of a Maine summer.

I look at a fuzzy black and white image from early in the last century and recognize the tree that extends out over the water and takes a sharp turn upward.  I have paddled by this tree over many summers stretching back to childhood.

It is not surprising that a distinctive tree figures in my experience of Maine, as it is the second most forested state (by percent tree cover) in the continental U.S. Observations from space such as Landsat data support Maine forest stock assessments. Those forests help keep the lakes healthy for fish and the common loon, which seeks large clear lakes in the summer.

Tourists also flock to salty estuaries, rocky cliffs, and especially the beaches. There is plenty for visitors to explore, since Maine boasts the fourth longest coastline in the nation.

I appreciate my colleague Rachel Licker pointing me to the latest data on tourism. Tourists in 2016 supported nearly 106,000 jobs in the state. That year brought around $6 billion in revenue, with the total impact of tourism as high as $9 billion. To help keep visitors and residents safe on Maine roads and eating healthy seafood, sensors far above Maine whiz by in orbit—largely unnoticed.

Maine universities, NOAA, NASA, and some direct benefits from space observations

Accurate and early warnings of freeze and thaw conditions could improve rural road restrictions with earth observations from space. Source: NESDIS/NOAA

Professors and their students at Maine academic institutions along with specialists at NOAA and NASA pay close attention to key satellites and the data they provide.   This information greatly benefits Mainers and Maine’s economy.  Here are just two examples of some direct benefits derived from earth observations from space that are calibrated by research support on the ground or in the ocean:

  • The National Center for Environmental Information (NCEI) is part of NESDIS, which is part of NOAA, under the Department of Commerce.

The state of Maine benefits from critical freeze-thaw condition data supplied by NCEI, used to adjust weight limits on less-traveled roads.

This information helps prolong the life span of critical road infrastructure in the rural parts of Maine. It also helps shorten road restriction periods, benefiting local businesses and giving the freight industry more lead time for its decisions.

  • The Maine Space Grant Consortium.

Maine is the largest supplier of soft-shelled clams in the US—62% in 2014. Yet Maine loses millions of dollars a year in commercial revenue when shellfish harvesting areas are closed due to toxic conditions.

To help address this issue, NASA awarded the Maine Space Grant Consortium funding to study “Multi- and hyperspectral bio-optical identification and tracking of Gulf of Maine water masses and harmful algal bloom habitat.”

The consortium’s research is critical for improving future satellite missions such as PACE which could provide earlier and more accurate warning systems to prevent people from eating unhealthy shellfish.

As members of Congress dig into the proposed budget cuts, now is the time to ask: which aging satellites are in jeopardy of not being replaced in time to ensure seamless coverage of the state, and to prepare for new and improved sensors?

What plans are in place to ensure healthy Maine shellfish for human consumption going forward?  Is that program for reducing road restrictions for spring shipments in Maine funded at an operational level?  Many livelihoods depend on these answers.


Lessons for Fighting the Trump Administration’s Attacks on Science

UCS Blog - The Equation (text only) -

With all the recent headlines about the Trump administration’s attacks on the government scientific enterprise—from dismissing scientists from advisory committees, to hiring untrained or conflicted heads of agencies, to blatant misinformation from administration officials—it can be difficult to think about the solutions. But we must. My new paper, out this week in Conservation Biology, does just that. 

The red-legged frog was one of several species that got caught up in politics during the George W. Bush administration in the US. Several administrations in the US, Canada, and Australia have had issues with political interference in science policymaking. Photo: USFWS

While many of the Trump administration’s attacks on science seem unprecedented, we can draw many lessons from past administrations’ hostility toward science—both in the United States and outside of it.

In “Defending Scientific Integrity in Conservation Policy Processes: Lessons from Canada, Australia, and the United States,” my coauthors and I lay out lessons for advancing scientific integrity in government science policy, with a focus on conservation policy processes.

The paper is being released just preceding next month’s Ecological Society of America meeting, where I’ll be moderating a panel on current attacks on the Endangered Species Act and how scientists can engage in policy decisions.

Defending scientific integrity in conservation policy

Here are some of the paper’s key findings for what governments should do to advance scientific integrity in decision-making around conservation:

  • Strengthen the policies that grant government scientists the right to speak
  • Guarantee public access to scientific information
  • Strengthen agency culture of scientific integrity
  • Broaden the scope of independent scientific reviews
  • Enhance transparency around conflicts of interest in scientific advice
  • Proactively engage with scientific societies
The “political interference in science” playbook

While many of President Trump’s recent moves have raised concerns about the future, when it comes to the administration’s treatment of science, we must remember that in many ways we’ve been here before—both in the US and outside of it.

In terms of conservation policy, we don’t have to look too far to find past examples of interference in government decision making. Many of you may remember Julie MacDonald, the political appointee from the George W. Bush administration who (among other offenses) tampered with a scientific document supporting an endangered species listing for the Gunnison sage grouse.

This was just one of several examples of political interference in US endangered species policies. Other species that got tied up in politics at the time were the bull trout, right whale, marbled murrelet, trumpeter swan, polar bear, and red-legged frog. Canada, in the Harper administration, and Australia, under the Howard administration, also experienced political interference in government science.

Many of these cases demonstrate the same thing: that political forces can exploit weaknesses in the policy process in order to sideline inconvenient science. A lack of transparency in the process, inappropriate access to scientific documents by political officials, lack of access to government scientists, and lack of collection of or adherence to science advice, for example, can create conditions that make it easier for politics to intrude in what should be science-based decision making.

Learning lessons, finding solutions

During and following the many and varied attacks on government science under the George W. Bush administration, the Union of Concerned Scientists and others got to work developing policy solutions. It was clear that the system had vulnerabilities that allowed such interference to happen. What kinds of policies could have prevented such blatant intrusions of politics into science policy making?

There were lots of issues to address. But we learned a thing or two. Canada and Australia, too, learned from their former leaders who were hostile to science.

The paper lays out some of the solutions found to be common among the three countries. The lessons teach us that while damage can certainly be done during times when science is silenced, sidelined, or manipulated, there are ways to move forward, and policies that can be put in place to prevent such future abuses of power.

The path forward: keeping science in conservation policy

While we might fear what the Trump administration will do when it comes to conservation policy, in many ways we know the playbook. We know what to watch for and where the vulnerabilities are. We also have new protections and a federal workforce who intends to do their jobs.  So let’s continue to think about and advocate for solutions. It is our only hope.

Rising Seas Erode Homes and History in Alaska—Let’s Talk Relocation

UCS Blog - The Equation (text only) -

Every sourdough tastes unique.

Sure, they all share the same foundational ingredients – water and flour – but the wild yeasts and bacteria that ferment the flour’s sugars to create that tart flavor are place-specific. A sourdough baguette from San Francisco, with the Pacific Ocean’s salty breeze sneaking into pantries, tastes different from a boule baked at the high altitude of a Denver bakery.

The best sourdough I’ve ever tasted comes from an unassuming bucket tucked underneath Cliff Weyiouanna’s kitchen sink in Shishmaref, Alaska.

The dough is nearly a century old (96 years to be exact), and has been passed down from one generation to the next. It’s rich and tangy in flapjack form, and fills the house with a fresh yeasty smell as Cliff flips them over the gas stove. Cliff has cooked pancakes for hundreds of visitors to Shishmaref – he has the guest books to prove it. Scientists from Japan, hunters from Texas, journalists from Norway, you name it and they’ve sat at Cliff’s breakfast table for a tall stack of sourdough jacks and a strong cup of coffee.

I sat at that kitchen table on a brisk August morning in 2016 with my research partner, Cliff’s girlfriend, and Cliff. The Summer Olympics were playing on a small TV in the corner and freshly picked buckets of berries crowded the floor space waiting to be frozen for winter as we tucked into steaming plates of pancakes.

They were delicious, but we weren’t there for the sourdough jacks. We had come to Shishmaref as part of a year-long research project to understand how erosion and sea level rise are affecting communities across the United States and US Territories.

Through interviewing hundreds of Americans from Alaska to American Samoa, our aim was to identify what is needed at the national level to support towns in need of relocation away from America’s eroding edges.

A few days before Cliff invited us for breakfast, the village of Shishmaref decided in a 94 to 78 vote to relocate in full from its current site onto the Alaska mainland five miles away. Shishmaref sits on a narrow barrier island in the Chukchi Sea. At points, the island measures barely a quarter mile wide.

Shishmaref has been losing land to the sea from natural erosion trends for hundreds of years. But with climate change, that natural trend is getting a lot worse.

The U.S. Army Corps of Engineers constructed a rip rap sea wall to protect much of Shishmaref from 2005 to 2009. The project is the latest in a number of sea walls constructed to try to slow the rate of erosion on Sarichef Island. Photo: Eli Keene

Relocation is now

Climate change is amplified in the Arctic, with air and sea temperatures warming twice as fast as most other places on earth, and Alaska is no exception. A primary reason for this amplification is the surface albedo feedback: the melting of snow and ice turns white, sunlight-reflecting surfaces into darker, heat-absorbing ones.

All that heat is melting ice and thawing permafrost – frozen ground – at an unprecedented pace. In normal years in Shishmaref, an icepack usually develops in the fall months around the island. This ice has always acted as a buffer against severe storm surges, forcing waves to break miles offshore instead of against the village.

As the ice disappears, so too does this natural defense.

This loss of ice – combined with the effects of thawing permafrost, which softens the very land the village is built upon – has resulted in a loss of three to five feet of shoreline per year, with a single severe storm able to wash away 50 feet of land. Storms caused such severe erosion in 1997 and 2002 that some homes fell into the ocean and several more needed to be moved.

Talking to Cliff about the recent vote, it’s clear that he’s had this conversation many times before. At 74, he’s witnessed a lot of talk about relocation, including an effort by the community to relocate in 2002 which was later abandoned due to lack of funding.

“When people asked how did I vote, I say, ‘I know we’re not going to get funding from the state or government so I voted to stay,’” he tells us. “I don’t know [if they’ll find funding]. It’s gonna take a long time. They have to go back and test the soil. Last time they had someone from the government drill test every mile. There was three feet of soil and two feet of ice there, so that wasn’t stable. Now they have to work on building the road. It ain’t going to be easy.”

It’s easy to fault Cliff’s vote to protect in place rather than move Shishmaref. An Army Corps of Engineers Alaska Village Erosion Technical Assistance program assessment in April 2006 estimated that Shishmaref had 10 to 15 years before their current site is lost to erosion. And while a recently built gabion seawall will buy the village time, the threat of inundation is imminent and inevitable.

But it’s been 10 months since I was last in Shishmaref. 10 months since their vote to relocate, and no money has materialized to help them move.

Acting City Clerk, Tiffany Magby, reads out the final votes cast in Shishmaref’s 2016 referendum on relocation. Photo: Eli Keene

Lacking federal support for Alaska  

The cost of relocating Shishmaref in full is estimated at $180 million. And Shishmaref isn’t alone. In Alaska, the Army Corps has identified 31 villages as under imminent threat of becoming uninhabitable. In Louisiana, Washington, Virginia, and Florida, coastal communities are already having difficult conversations about when managed retreat inland should become their climate change adaptation strategy.

At present, there are at least 13 towns and villages in America that have decided to relocate in part or in full due to the effects of climate change.

These towns may be the first to relocate from rising tides – but they won’t be the last.

UCS’s recent report When Rising Tides Hit Home calculates that within 45 years, by 2060, more than 270 coastal US communities – including many that seldom or never experience tidal flooding today – will be chronically inundated given moderate sea level rise. By the end of the century, that increases to 490 communities, including 40 percent of all East and Gulf Coast oceanfront communities.

At the onset of our project, the aim was to pinpoint particular policy and funding solutions to encourage federally supported, locally implemented climate-induced relocations that would feed into work already being supported by the White House.

We had hoped to provide our findings to an interagency working group on community-led managed retreat and voluntary relocation that President Obama established to develop a framework and action plan for managed retreat.

I wish that this intended goal was still possible. Unfortunately, it is not.

It is clear that the Trump Administration is not interested in protecting the American citizens in Shishmaref, or in any other coastal town across our country, from the effects of climate change we can no longer avoid. His proposed budget plan eliminates key programs for coastal adaptation research and capacity building like the National Sea Grant College Program; zeros out the budget for the Denali Commission, the independent federal agency mandated to facilitate climate-induced relocation in Alaska; and cuts dozens of EPA programs, including infrastructure assistance to Alaska Native villages.

While these actions can be demoralizing, we cannot give up on pressuring this Administration to act on climate adaptation and relocation. President Trump may not believe in protecting American citizens from the impacts of a warming world. But there are hundreds of civil servants and scientists who are still dedicated to helping those in need.

Civil servants like Joel Clement, former director of the Office of Policy Analysis at the U.S. Interior Department. Joel has worked for seven years to help endangered communities in Alaska like Shishmaref prepare for and adapt to climate change.

But last week, Mr. Clement was reassigned to an ill-fitted role in the Office of Natural Resources Revenue as retaliation for speaking out publicly about the dangers that climate change poses to Alaska Native communities. As he says in a recent op-ed in the Washington Post, “During the months preceding my reassignment, I raised the issue with White House officials, senior Interior officials and the international community, most recently at a U.N. conference in June.”

Federal scientists like Joel need our support and advocacy now more than ever before.

We must stand up for science and hold the Trump Administration accountable for silencing civil servants and preventing them from doing their jobs. That means calling on our elected officials to join together to support empowered communities like Shishmaref as they rise above the tides of climate change.

Shishmaref, as seen from just south of the sea wall. Photo: Eli Keene

Solutions for Alaska

Republican Senator Lisa Murkowski of Alaska recently spoke in Juneau, the state’s capital, about the need for climate change action “because we see it here in this state and it is real and I think we’ve got an obligation to help address it.” And last year, Senator Murkowski supported President Obama’s request for $400 million “to cover the unique circumstances confronting vulnerable Alaskan communities, including relocation expenses for Alaska Native villages threatened by rising seas, coastal erosion, and storm surges” in his final budget request to Congress.

Senator Murkowski’s proposed Offshore Production and Energizing National Security Act of 2017, or the OPENS Alaska Act of 2017, primarily aims to increase offshore oil production. But the bill would also direct 12.5 percent of the revenues from offshore development to a newly established Tribal Resilience Program to promote resilient communities through investments in energy systems and critical infrastructure to combat erosion, improve health and safety, and foster resilient communities.

Alaska Native communities on the frontlines of climate change need Senator Murkowski to be much more of a leader on this issue. They need her to educate her senate colleagues on the impacts they are already facing, and champion a strong, well-funded national climate relocation framework.

We also need to begin a conversation about non-governmental solutions to supplement federal and state action on climate-induced relocation in America. We must call on all sectors – private, non-profit, volunteer, philanthropic – to join together to help empowered communities like Shishmaref rise above the tides of climate change.

For example, the Rockefeller Foundation’s 100 Resilient Cities initiative, launched in 2013, aims to help cities around the world become more resilient to the physical, social, and economic challenges that are a growing part of the 21st century.

Another example is Community Engineering Corps, a partnership between Engineers without Borders, the American Society of Civil Engineers, and the American Water Works Association to bring underserved communities and volunteer engineers together to advance local infrastructure solutions in the United States.

NGOs with legal expertise or pro bono divisions of large law firms could provide partnerships to help towns navigate the legal challenges of retreating inland. And places like the National Trust for Historic Preservation and the Society for American Archeology could help to ensure that cultural heritage, historic sites, and local traditions are included in the relocation road map, effectively protecting them, or documenting them with dignity when saving them is not possible.

Edwin Weyiouanna, an award-winning carver from Shishmaref, works on a piece of moose antler. Photo: Eli Keene

Cultural diversity for climate resilience

Ultimately, when I think of the future of coastal communities as seas rise, I don’t think of DC; I think of Cliff and the fresh yeasty smell of sourdough flapjacks for breakfast.

Maybe it’s strange that sourdough is the first thing that comes to mind when I think of the impacts of climate change on America.

At first glance, the biggest losses from climate change are the ones we can see: the house fallen into the ocean, the flooded streets after the hurricane, the disappearing edges of America on the maps of our country. These tangible images are what we recall when we think of what stands to be lost as the seas rise.

But there are some things that can’t be rebuilt – the place where you learned to plant tomatoes with your grandfather that’s now too salty to grow vegetables. The historic buildings that have stood for centuries, now under water. The identity of your town as a seaside community, and the close-knit bonds within it that let you ask your neighbor to water your plants when you go on vacation.

The unique taste of sourdough that’s been living on Shishmaref for 96 years.

Residents learn to make traditional handicrafts from seal skin at a free workshop. Photo: Eli Keene

This – these local cultures and heritage – this is what the hundreds of people I’ve interviewed spoke about when asked what they are afraid of losing to encroaching seas. And it’s what I think about when President Trump slashes all funding support for protecting American communities from climate change.

The mass loss of history and cultural diversity may seem less important than the billions of dollars of infrastructure damage climate change will cause. But losing our cultures and histories isn’t just about losing part of who we are. It’s also about losing part of our ability to adapt to a warmer world.

Just as the biodiversity of plants and animals improves the resilience of ecosystems, cultural diversity offers a resilient knowledge base for adapting to and counteracting the effects of climate change.

Learning from traditions and history has always been an important part of envisioning a better future. Using cultural practices that have been passed down from generation to generation to adapt to a changing climate is no different.

Cliff’s sourdough, passed down from his parents, may not help in Shishmaref’s adaptation, but the lessons they passed down about reading the safety of ice conditions will.

Coastal communities across America already have the vision and multigenerational knowledge to adapt to the effects of climate change. What they do not have is time to waste on an inactive government. They need the financial support and technical tools to implement their vision. Those of us in privileged positions need to pressure the Trump Administration and Congress to take the issue of relocation seriously before it’s too late.

Victoria Herrmann is the principal investigator for America’s Eroding Edges, a research and storytelling project on the impacts of climate change on coastal communities’ livelihoods and cultures. She is also the President & Managing Director of The Arctic Institute and a Gates Scholar at the Scott Polar Research Institute at Cambridge University.

Electric Cars Are Critical to a Clean Future

UCS Blog - The Equation (text only) -

Electric vehicles (EVs) are an important part of how we will reduce climate-changing emissions, air pollution, and petroleum consumption. Are they the only way we will cut pollution from personal transportation? Of course not. EVs are critical, but we’ll also need to be smart about using urban design, transit, and shared mobility to reduce the amount of driving from all vehicles. However, a recent U.S. News & World Report article puts EVs in a false competition with these other strategies, while also repeating myths about the environmental impacts of EVs.

EVs reduce emissions now

On average, EVs on the road today produce less global warming emissions than the average new gasoline car.

The emissions do depend on where in the U.S. the EV is used, because electric power generation comes from different sources depending on the region. Because many of the EVs have been sold in regions with cleaner power (like California), the EVs being used today are, on average, responsible for fewer emissions than any gasoline-powered car.

Based on sales through 2016, the average EV in use is responsible for global warming emissions equal to those of a 73 MPG gasoline car.
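As a back-of-the-envelope illustration of how an MPG-equivalent figure like this can be derived (a simplified sketch, not UCS’s actual methodology; the grid intensity and EV efficiency numbers below are hypothetical):

```python
# Illustrative sketch: convert grid emissions intensity into an
# MPG-equivalent for an EV. The gasoline figure is EPA's standard
# value for CO2 from burning one gallon of gasoline.
GASOLINE_G_CO2_PER_GALLON = 8887

def mpg_equivalent(grid_g_co2_per_kwh, ev_kwh_per_mile):
    # grams of CO2 attributable to each mile driven on electricity
    g_per_mile = grid_g_co2_per_kwh * ev_kwh_per_mile
    # a gasoline car emitting the same per mile would get this MPG
    return GASOLINE_G_CO2_PER_GALLON / g_per_mile

# Hypothetical inputs: a fairly clean regional grid (~350 g CO2/kWh)
# and a typical EV efficiency of ~0.30 kWh per mile.
print(round(mpg_equivalent(350, 0.30)))  # ~85 MPG-equivalent
```

Note that this sketch counts only power-plant emissions per kWh; a full well-to-wheels analysis like UCS’s also accounts for upstream fuel-cycle emissions on both the gasoline and electricity sides.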

EVs are still responsible for fewer global warming emissions, even when you account for the additional energy and materials needed to manufacture their batteries. We found that these extra emissions are offset quickly by savings during use, on average within 6 to 18 months of driving.
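To see why the offset arrives within months rather than years, here is a rough, hypothetical calculation; the battery-manufacturing footprint, per-mile emissions, and annual mileage below are illustrative assumptions, not figures from the UCS study:

```python
# Hypothetical illustration: months of driving needed before an EV's
# per-mile emissions savings pay back the extra emissions from
# manufacturing its battery.
def payback_months(battery_mfg_kg_co2, gas_g_per_mile, ev_g_per_mile,
                   miles_per_year=11500):
    saved_g_per_mile = gas_g_per_mile - ev_g_per_mile
    miles_to_break_even = battery_mfg_kg_co2 * 1000 / saved_g_per_mile
    return 12 * miles_to_break_even / miles_per_year

# Assumed inputs: ~1,500 kg CO2 to make a mid-size battery, a 30 MPG
# gasoline car (~296 g CO2/mile), and an EV at ~105 g/mile on a
# relatively clean grid.
print(round(payback_months(1500, 296, 105)))  # ~8 months
```

On a dirtier grid the per-mile savings shrink and the payback stretches toward the upper end of the 6-to-18-month range.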

There are also other concerns mentioned in passing in the U.S. News article, such as the impact of mining for battery raw materials. But the negative impacts from raw material extraction are largely due to lax regulations and can be addressed through better policy and corporate responsibility.

For components like cobalt and rare earth metals, all high-tech consumer product companies need to ensure that they have environmentally responsible supply chains that also protect the rights and health of those impacted by mining. This is as true for Apple and Samsung as it is for EV manufacturers.

There have been positive developments from battery suppliers and technology companies, but they can and should do more to ensure responsible battery production.

At the same time we also need to consider the negative impacts of gasoline production, from human rights abuses to massive environmental disasters during oil extraction, to the unavoidable air pollution damage from refining and burning gasoline in our cars. All our personal transportation fuels – gasoline, diesel, biofuels, or electricity – can be cleaner if fuel producers are held accountable to reduce their pollution.

Moving to EVs faster will help to reduce emissions even more

Another attack on EVs in the U.S. News article is that EVs only make up a small fraction of the vehicles on the country’s roads today. This is true, but is not a reason to turn back. The first mass-market EVs only went on sale at the end of 2010. From those two models (Chevrolet Volt and Nissan LEAF), the market has now grown to some 30 EV models available today. However, many of these EVs are not sold nationwide and are not marketed effectively.

In one notable case, Fiat Chrysler has decided not to even let customers outside of California know that its new minivan comes in a plug-in version.

Still, EV sales are increasing and hitting new milestones, especially in places like California that have strong regulations and incentive programs, and where manufacturers have put much more effort into selling EVs than in the rest of the U.S.

In the first quarter of 2017, EVs were nearly 5 percent of all new car sales in California, and for some manufacturers the share was much higher. For example, plug-in cars made up over 15 percent of General Motors’ Chevrolet-brand sales in the first three months of 2017.

Having more options for new car buyers to pick a plug-in car will only help make the market grow. And it’s important for the market to grow as quickly as possible. Because cars often stay on the road more than a decade, it’s critical to speed up the transition from petroleum to electricity.

The future is electric, but also needs shared transportation

The future of driving is electric. It’s not just our opinion at UCS: both car companies and governments realize that EVs are the future. The CEOs of Ford and VW have gone on record with predictions of high-volume EV sales. And France, Norway, and India are among the countries that have set ambitious goals to transition to EVs.

But EVs alone aren’t enough to meet our climate goals. It’s important to also reduce the impact from transportation by reducing the number of miles we drive, even from electric cars. Shared transportation, whether via transit, carpools, or new ridesharing services, will also be important to make significant reductions in pollution. But this is in no way in competition with EVs. Instead, EVs are complementary to many of these shared transportation options.

Seize the Day: RGGI Leadership More Important Than Ever

UCS Blog - The Equation (text only) -

A pioneering program to reduce power plant emissions in the Northeast is poised to enter a new phase. Here’s why the nine states of the Regional Greenhouse Gas Initiative need to make as bold a step forward as possible—and how they can make it happen.

The RGGI states in the Northeast and Mid-Atlantic—Connecticut, Delaware, Maine, Maryland, Massachusetts, New Hampshire, New York, Rhode Island, and Vermont—were right to lead the nation in addressing carbon pollution from power plants when they launched the program in 2009. And they were right to strengthen RGGI when they conducted the first program review in 2012.

RGGI has yielded clear results, as shown in analyses by both the program itself and outside groups.

Now the states are nearing the finish line of the second RGGI program review. And the need for stronger action by this important collection of states is even clearer.

Why it’s important

This is a time of incredible momentum in clean energy, with cities, states, utilities, companies, individuals, and more embracing so much of what’s possible. Technologies, policies, and actions are leading us to new heights on energy efficiency, renewable energy, and clean cars.

This is also a time of incredible need. With the Trump administration abdicating on climate leadership in pulling out of the Paris climate agreement, and working to kill the first-ever federal regulations on power plant pollution, the Clean Power Plan, regional leadership on climate and energy is more important than ever.

Technologies mean RGGI states can make much more progress, and quickly. (Credit: J. Rogers)

What RGGI state leaders must do

So this program review represents an incredible opportunity to strengthen RGGI. The Union of Concerned Scientists and its allies are calling for strengthening in several key areas:

A stronger target – RGGI’s defining characteristic is its declining regional cap on power plant carbon emissions. The region has a history of defining the cap much higher than circumstances—emissions and trajectories—warrant. And the emissions targets as currently set will not get the states the long-term emissions reductions that many have set via legislation, and that many of the region’s governors have committed to collectively.

In a recent letter to the RGGI governors, our coalition called on them to have the cap levels:

…reflect actual emissions levels and trends at its start, including emissions reductions that have outpaced earlier projections, and align with state and regional [greenhouse gas] targets in 2030.

What that means is something considerably stronger than the 2.5% annual tightening of the cap that has been in place since the program’s launch—at least 3.5% seems warranted—and an extension of the cap through at least 2030. The RGGI states, their utilities, innovators, and customers have proven their ability to cut emissions cost-effectively, and the program review is a chance to harness the power of that collective action.

Stronger complementary components – The program review is also an important chance for the RGGI states to strengthen other pieces of the program to help it live up to its true potential. That includes dealing with the surplus of emission allowances now in the hands of power plant owners; making sure that the protection against too-high prices is truly used for emergencies, not little price bumps; taking advantage of low allowance prices to move ahead more quickly; and making sure allowance prices don’t go too low. (See here for more details about each of these opportunities.)

Potential expansion – The successful RGGI framework could be expanded to other states (New Jersey, you’re welcome back anytime), or to other sectors such as transportation.

Stronger commitment to environmental justice – Last, but far from least, the program review offers an important opportunity not just to strengthen the program, but to ensure that RGGI’s benefits reach those who need them most—through consultation and through smart allocation of allowance revenues, for example. As our coalition letter says:

…the RGGI states must ensure that communities on the frontlines of the impacts of pollution and climate change have a say in how RGGI is implemented and how funds are distributed to ensure broad and equal opportunities to experience RGGI benefits. Strengthening the RGGI program in communities that bear the biggest burden of pollution is critical.
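The stakes of the cap-tightening rates discussed above (2.5 percent versus at least 3.5 percent per year) can be sketched with a quick compound-decline calculation; the index base of 100 and the 13-year horizon (e.g. 2017 through 2030) are illustrative assumptions, not actual RGGI cap figures:

```python
# Rough sketch of why the annual tightening rate matters: the cap
# after a fixed percentage decline compounds year over year.
def cap_in_year(start_cap, annual_decline, years):
    return start_cap * (1 - annual_decline) ** years

start = 100.0  # index the starting cap to 100
for rate in (0.025, 0.035):
    final = cap_in_year(start, rate, 13)
    print(f"{rate:.1%} annual decline -> cap at {final:.1f}% of today")
```

Over 13 years, a 2.5 percent annual decline leaves the cap at roughly 72 percent of its starting level, while 3.5 percent brings it to roughly 63 percent; the extra percentage point compounds into a meaningfully deeper cut by 2030.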

Bolder, stronger, further, fairer

The decision makers in the nine RGGI states have a golden opportunity to show real leadership on climate and energy, to put in place the stronger policies and targets that this time in history demands, and to seize all the rewards that come from boldness in climate action.

It’s up to you, RGGI leaders. Seize the day.

Remembering Herb Needleman—The Hero Who Got Lead Out of Gasoline

UCS Blog - The Equation (text only) -

Dr. Herb Needleman, a Pittsburgh pediatrician whose pioneering research into the toxic effects of lead on children led to the removal of lead from gasoline and other products, died last week at the age of 89. He was a tireless advocate for children’s health in the face of persistent attacks on his work and integrity from the lead industry. A decade ago, he showed up in my life in a pretty unexpected way.

In 2006, UCS brought scientists to Washington, DC to talk to legislators about the manipulation and suppression of science, and the consequences that has for public health and the environment. We recruited scientists from our Sound Science Initiative, the precursor to the UCS Science Network, and from the list of experts who had signed the scientist statement on scientific integrity that called on the Bush administration to restore scientific integrity to federal policymaking.

Dr. Needleman’s research transformed our understanding of the impact of lead on children’s developing brains, and led to the removal of lead from gasoline. Photo: NIEHS

Dr. Needleman was on that list of signers. Yet we clearly hadn’t done our research. We had no idea who he was. I only knew him as a pediatrician with a low voice and large glasses.

There was a lot behind those glasses. Others have written extensively about his experiences, including in The Lead Wars and this extensive interview, better than I ever could. But briefly, Dr. Needleman conducted extremely novel research in which he collected and measured lead levels in children’s primary teeth, demonstrating a correlation between intellectual development and exposure to lead, even at low levels.

A 1979 study he published in the New England Journal of Medicine transformed the field, set the stage for restrictions on lead in gasoline, and put a target on his back.

He should have been celebrated. He was not. “He was attacked by the lead industry, hounded by columnists, snooped after by hired investigators, had his files endlessly combed over by high priced consultants, and was indifferently supported by many of his colleagues at his university,” remembered Richard Jackson, former director of the CDC’s National Center for Environmental Health.  

For years, the lead industry and affiliated individuals did their best to tarnish his reputation and take him down through unfounded allegations of scientific misconduct. At times, he could not even depend on his university to play fair. Via The Lead Wars:

The attacks on Needleman’s research and on his scientific integrity culminated in 1991 when Claire Ernhart and Sandra Scarr filed charges with the Office of Scientific Integrity at the National Institutes of Health, alleging that Needleman had engaged in scientific misconduct…They demanded, and received, another inquiry, which led Needleman’s own university, the University of Pittsburgh, to begin another investigation of his research. It was a ‘horrible’ period in his life, Needleman recalls. The university refused to allow him to bring in outside experts, though it called on others who had previous professional relationships with his accusers. The university initially refused to open the hearings to the public; it took a petition campaign from scientists around the country to persuade the campus officials otherwise. He was shunned by colleagues who worried about being associated with him. But in 1992, despite some methodological criticisms, Needleman was vindicated of wrongdoing by the university and, similarly, three years later by the Office of Research Integrity of the Department of Health and Human Services.

Dr. Needleman, ever determined and courageous, emerged from these battles with his professional reputation intact. And eventually, he ended up in a UCS conference room. Here was a man who had testified before Congress, been attacked repeatedly in multiple venues, and seen the way that science is politicized at the expense of children. A titan. He could have trained us. Yet he came to Washington DC and patiently listened to our basic explanations of how the Bush administration was manipulating and suppressing science, and how we needed to educate Congress about the importance of providing oversight.

Dr. Needleman at a 1991 congressional hearing. Photo: C-SPAN

It turned out that I was assigned to accompany Dr. Needleman and other scientists to meet with lawmakers. In each meeting, when it was his turn, he spoke softly but firmly about how as a doctor he relied on the government to provide guidance about the safety and efficacy of drugs. He talked about the importance of protecting children from harmful contaminants, and about how he had witnessed the consequences of politics getting in the way of protecting children from lead poisoning in his earlier years.

Not once did he discuss his pivotal role in keeping millions of children—including me—safe. He didn’t want to take over the show. He wanted the story to be about public health and the environment.

It was months later when I came across his name while doing some research. The more I read, the more foolish I felt for not recognizing the legend among us. Yet I also felt honored to have had the opportunity to cross paths. And I felt thankful that such an accomplished man was still committed to fighting for what was right.

I often refer to Dr. Needleman in talks to help make the point that political and industry pressure on scientists is not new, and that many have pushed alternative facts for years to further narrow agendas at the public’s expense. I hope that people will continue to study his life and remember that we can persevere in the face of extremely powerful interests to make the lives of others better.

Six Months into the Trump Administration: Science and Public Health Under Siege

UCS Blog - The Equation (text only) -

My colleagues at the Union of Concerned Scientists (UCS) have released a report on how science—and public health—have been sidelined during the first six months of the Trump administration. The report documents a deliberate and familiar set of strategies that undermines the role of science, facts, and evidence in public policy and decision-making.

From a public health perspective, the short- and long-term impacts are truly frightening. The Trump administration—aided and abetted by a willing Congress—is actively pushing an ideological, anti-science agenda that will profoundly affect the health, safety, and security of children, families, and communities today, tomorrow, and for decades to come.

They claim their approach is pro-business, but on closer look, that isn’t true. It harms the many good business people who want to play by the rules and make a profit without harming the public or their workers. How? By giving an unfair advantage to unscrupulous businesses that will put profit ahead of public and worker safety and health.

Control, Alt, Delay: Public health protections on the chopping block

Mercury, lead, arsenic, ozone, beryllium, silica, chlorpyrifos. These substances all have several things in common:

  1. They have all been found to contaminate our air, water, soil, and/or food, as well as some of our workplaces and community environments.
  2. Robust and often long-standing science has proven that exposure to them can cause serious health effects, including death.
  3. Government agencies charged with protecting our health and safety have established rules and standards to prevent or minimize our exposure to them. (Note: these and other public health safeguards are increasingly denigrated as unnecessary regulations by the Trump administration and some in Congress.)
  4. Exposure standards established years ago have been found to be insufficiently protective.
  5. The Trump administration has taken steps to weaken, delay, and subvert recent science-based safeguards that enhance public protection from these toxic substances.

Make no mistake. There is an all-out assault on the agencies charged with using independent, unconflicted science to protect our nation’s public health—and on the critical resources and infrastructure they need to do just that.

The proposed draconian cuts to budgets, staffing levels, and programs at agencies like the EPA, CDC, FEMA, NOAA, USDA, and OSHA speak for themselves. (And don’t even get me started on how current congressional efforts to reform health care will impact the health of our most vulnerable populations.)

But the real issue isn’t about protecting agency budgets or staff levels, essential as they are. It’s about protecting all of us from known (and emerging and future) threats to our health, safety, and well-being. What follows is just a snapshot of this administration’s siege on public health.

Children and women first? Not so much

Informed by a wealth of scientific evidence, and putting the health of children first, the EPA banned the indoor use of the pesticide chlorpyrifos back in 2000.

Chlorpyrifos, a potent neurotoxin, is known to affect brain development and cause developmental delays in children exposed in their homes, through their diets, and through their mothers in utero. It has also been shown to sicken farmworkers who apply it, and to contaminate drinking water supplies in farming communities.

In 2015, the EPA announced that it would revoke all tolerances for its use on or in food—essentially banning its use—noting that it was unable to find a safe level of exposure. On March 29, 2017, less than three months into the Trump administration and 20 days after meeting with the CEO of Dow Chemical, EPA Administrator Scott Pruitt rejected the advice of his agency’s own chemical safety experts and reversed this decision—ironically noting that “By reversing the previous administration’s steps to ban one of the most widely used pesticides in the world, we are returning to using sound science in decision-making—rather than predetermined results.”

The American Academy of Pediatrics denounced the decision. In its June 27, 2017, letter to Mr. Pruitt, the medical association called the risk of chlorpyrifos to infant and children’s health and development “unambiguous” and urged the EPA to listen to its own scientists and go forward with the proposed rule to end its uses on food.

The next time EPA is required to review the safety of the chemical is five years out—2022 to be exact. Until then, children will be eating peaches, pears, broccoli, and other foods grown with chlorpyrifos, as will pregnant women who are unknowingly putting their babies at risk.  Farmworkers and their families will also continue to be exposed—but hey, Dow Chemical, and the smaller manufacturers, will have “regulatory certainty.”

A West Virginia coal miner sprays rock dust 900 feet underground. OSHA has estimated that about 2.3 million workers are exposed to respirable crystalline silica. Photo: courtesy of NIOSH

Protecting our nation’s workforce: Not in the cards

The assault on worker health is one that has particular meaning to me, having spent decades of my life specifically focused on occupational health and safety. And when I think about how long (decades) it takes to promulgate standards to protect worker health—even when their dangers have been known for eons—I am truly astounded by what the current administration and Congress have done.

Take silica. Its devastating health effects have been known for centuries. As far back as 1556, in his treatise on mining, Agricola described a pulmonary disease afflicting stone cutters and miners. Bernardino Ramazzini, known as the father of occupational medicine, wrote about respiratory symptoms and sand-like substances in the lungs of stone cutters in his seminal 1700 work De Morbis Artificum Diatriba (Diseases of Workers). The Hawks Nest Tunnel disaster of the early 1930s at Gauley Bridge in West Virginia—one of the worst industrial disasters in US history—is as harrowing a tale of occupational exposure to silica as one would ever want to read about.

Silica causes lung cancer and silicosis, a disabling, non-reversible, and sometimes fatal lung disease, which can develop or progress even after exposure has ceased. OSHA has estimated that about 2.3 million workers are exposed to respirable crystalline silica, including 2 million construction workers who drill, cut, crush, or grind silica-containing materials such as concrete and stone, and 300,000 workers in general industry operations such as brick manufacturing, foundries, and hydraulic fracturing. Like most occupational illnesses and injuries, silicosis is preventable.

In keeping with scientific evidence, in 2011 OSHA sent a proposed tightening of its 40-year-old silica standard to the Office of Management and Budget (OMB). In 2013, OMB gave OSHA the green light to actually propose the new rule, which OSHA promulgated as a final rule in March 2016, with an effective date of June 23, 2016. Industries were given different timelines to comply with most requirements—one year for construction, two years for general industry, and five years for hydraulic fracturing. In its fact sheet on the final rule, OSHA noted that “Many employers are already implementing the necessary measures to protect their workers from silica exposure. The technology for most employers to meet the new standards is widely available and affordable.”

So call me gobsmacked when in April 2017 OSHA decided to delay enforcement in the construction industry by another three months. That may not sound like a lot, but tell that to construction workers who may already have been breathing dangerous levels of silica dust for years. And call me crazy, but my confidence in OSHA sticking with even that schedule is somewhat shaken.

You’ll see why when you see how the administration has turned a two-month delay in implementing its new protective standards for workers exposed to beryllium into a proposal to “modify” (read “weaken”) protections for workers exposed to beryllium in construction and shipyards. I won’t rehash it here, as I’ve already covered it here and here.

The EPA gives states a breather on ozone. People will suffer the consequences.

The EPA has been regulating ozone as a criteria pollutant since National Ambient Air Quality Standards were established in the Clean Air Act in 1970—and with considerable success.

Levels of ground-level ozone, the main component of smog, have declined over time. Though many people still live in areas with unhealthy levels of ozone pollution, the decline is evidence that air pollution requirements are working.

Good thing, as the scientific evidence that ground-level ozone has serious health impacts is beyond dispute. It increases the frequency of asthma attacks, can cause chronic obstructive pulmonary disease (COPD), and worsens bronchitis and emphysema. It can increase risk of lung infections. And it has been associated with early deaths from cardiovascular disease. Children, the elderly, and those with respiratory and cardiovascular disease are especially vulnerable.

For some current context:

  • The American Lung Association reports that more than one-third (36 percent) of the people in the United States live in areas with unhealthy levels of ozone pollution. Approximately 116.5 million people live in 161 counties that earned an F for ozone in this year’s report.
  • CDC national and state surveillance systems estimate that 8.4% of children under the age of 18 have asthma and that 4.7% of children between 0 and 4 years of age currently have asthma. And the number of reported missed school days among children with asthma was 12.4 million in 2003, 10.4 million in 2008, and 13.8 million in 2013.

Nearly five years ago, in 2013, scientists on the agency’s Clean Air Scientific Advisory Committee recommended that the EPA tighten its ozone standard, which was 75 parts per billion (ppb) at the time. The experts recommended a range of 60-70 ppb, while noting that the upper end of that range was likely not to provide the adequate margin of safety required by the Clean Air Act; that is, a 70 ppb standard was likely not protective enough of public health.

The politics involved in lowering the standard were fraught (no surprise here), with courts eventually weighing in. After years of foot dragging, the EPA issued a final rule (with a 70 ppb standard) in late 2015. In June 2017 the EPA decided to give the states another year to comply with the long-awaited standard, citing the increased regulatory burden and increased costs to business. But it’s not as if states and businesses have not seen this coming. It’s been years in the making. Indeed, the EPA’s Clean Air Scientific Advisory Committee first recommended a range of 60-70 ppb back in 2007!

In the meantime, it’s people that will take the hit. Too many kids and others will continue to break out their inhalers, visit emergency rooms, and lose time at school and at work. The delay may work for some interests, but certainly not for public health.

And the latest: On July 18, the House of Representatives approved a bill that would delay enforcement of the EPA ozone standard until the middle of the next decade, giving companies a reprieve of roughly a decade on complying with the new ground-level ozone (a.k.a. smog) health standards. The bill also permanently alters the Clean Air Act’s timetable for updating air quality safeguards. It will likely face opposition in the Senate, but is just the latest signal that our air quality and public protections are under attack. (A similar provision has also been included in the House Interior spending bill as an ideological rider.)

Heavy metal: Nothing to sing about

The serious and permanent consequences of lead exposure on children’s brains, development, and behavior have been known for decades. Its effects cannot be corrected; they will affect kids’ lives forever. Photo credit: CDC

It doesn’t take a toxicologist to know that heavy metals—like mercury and lead, and metalloids like arsenic—are not good for your health. Their ill effects have been known for hundreds of years.

In the 1800s, mercury exposure caused mad hatter's disease in workers making felt hats. Industrial wastewater containing methylmercury was the source of two devastating outbreaks of neurological disease in Minamata, Japan, in the 1950s and 1960s. And today, it's common knowledge that eating fish and shellfish from mercury-contaminated waters poses dangerous risks to the developing fetus.

When it comes to lead, its serious and permanent effects on children's brains, development, and behavior have been known for decades; even low levels have been shown to affect IQ, ability to concentrate and pay attention, language and communication fluency, and academic achievement. And the effects of lead exposure cannot be corrected; they will affect kids' lives forever. (Lead toxicity has been known since ancient times; it was described by the Greek physician Nicander as far back as the second century B.C.)

Government regulations eliminated lead in household paint in 1978 and in gasoline in the late 1980s. The ongoing crisis in Flint brings lead exposure to the present day. The CDC estimated that more than a million US children had lead poisoning when, in keeping with the science, it lowered its definition of poisoning to 5 micrograms of lead per deciliter of blood in 2012.

Federal and state agencies provide lead screening, surveillance, and prevention programs, and continue to identify lead contamination in homes and children with unacceptable levels of lead in their blood. A quick web search will identify sources of state and even local data. See, for example, here, here, and here.

And about that metalloid (and human carcinogen), arsenic: in establishing its enforceable maximum contaminant level for arsenic in drinking water in 2001, the EPA reported that approximately 13 million people in the US lived in areas where the concentration of inorganic arsenic in the public water supply exceeded its limit of 10 micrograms per liter (µg/L).

You would think the established science about the hazards, and the current knowledge about populations at risk, would have the Trump administration actively pursuing public health protections in its desire to make America great again. Instead, we are seeing delays, repeals, and rollbacks.

In November 2016, Obama's EPA issued a final regulation under the Clean Water Act to limit the amount of these and other toxic metals that power plants can release into public waterways. The EPA noted that, due to their close proximity to these waterways and their relatively high consumption of fish, some minority and low-income communities face greater risk. According to the Waterkeeper Alliance, nearly 35% of coal plants discharge toxic pollution within five miles of a downstream community's drinking water intake, and 81% of coal plants discharge within miles of a public drinking water well.

Flying in the face of solid scientific evidence on the need to protect the public and public waterways from these toxic pollutants, and siding with the polluting industry, Trump's EPA announced in April 2017 that it would delay and reconsider this regulation. Mr. Pruitt actually said that the delay would be in the public interest. I'm still trying to get my head around that one.

In a related but somewhat earlier action, Congress beat the EPA to the punch in striking a blow related to these toxic materials. In February, using the Congressional Review Act, our elected representatives repealed the 2016 Department of the Interior Stream Protection Rule, which would have safeguarded streams and provided communities with basic information about water pollution related to mountaintop removal coal mining.

Read enough?

Sorry that this “snapshot” of a blog has turned into quite a large collection of worrisome examples. Our new report covers even more Trump administration actions that will impact public and worker health—and not in a good way. It also suggests steps we and others can take to hold the administration accountable when it prioritizes bad actors in the private sector over public health.

Most important is that we not lose hope or become inured to or exhausted by what is likely to be an ongoing assault on science, public health, and some of the fundamental elements of our democracy. It will be important to call out and speak out against such actions (and inactions) when you see them. We will do our best to track them, but we also need and value your help. So when you see something, let us know.

The Union of Concerned Scientists is in it for the long haul.

Photo: Petra Bensted/CC BY (Flickr)
