UCS Blog - The Equation (text only)

Flooded by Hurricane Harvey: New Map Shows Energy, Industrial, and Superfund Sites

A new UCS analysis shows that more than 650 energy and industrial facilities may have been exposed to Hurricane Harvey’s floodwaters.

Harvey’s unprecedented rainfall on the Texas and Louisiana coasts has exacted a huge toll on the region’s residents. In the weeks and months ahead, it is not only homes that will need to be assessed for flood damage and repaired, but also hundreds of facilities integral to the region’s economy and infrastructure.

To highlight these facilities, the Union of Concerned Scientists has developed an interactive tool showing affected sites. The tool relies on satellite data analyzed by the Dartmouth Flood Observatory to map the extent of Harvey’s floodwaters, and facility-level data from the US Energy Information Administration and the Environmental Protection Agency.

The tool includes several types of energy infrastructure (refineries, LNG import/export and petroleum product terminals, power plants, and natural gas processing plants), as well as wastewater treatment plants and three types of chemical facilities identified by the EPA (Toxic Release Inventory sites, Risk Management Plan sites, and Superfund sites).
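The core of such an analysis is a spatial overlay: facility coordinates checked against a flood-extent polygon. The snippet below is a simplified illustration, not the UCS methodology; the flood polygon and facility coordinates are hypothetical placeholders, whereas the real analysis uses Dartmouth Flood Observatory satellite data and EIA/EPA facility records at far higher fidelity.

```python
# Toy sketch of a flood-exposure overlay. A standard ray-casting
# point-in-polygon test is used so no GIS libraries are required.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: count how many polygon edges a ray extending
    to the right from (x, y) crosses; an odd count means 'inside'."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the ray's latitude
            # x-coordinate where this edge crosses that latitude
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical flood polygon (lon, lat) and facility points, for illustration only.
flood_extent = [(-95.6, 29.6), (-95.0, 29.6), (-95.0, 30.0), (-95.6, 30.0)]
facilities = {
    "Refinery A": (-95.3, 29.8),   # inside the toy polygon
    "Terminal B": (-95.8, 29.7),   # outside it
}

exposed = [name for name, (lon, lat) in facilities.items()
           if point_in_polygon(lon, lat, flood_extent)]
print(exposed)  # -> ['Refinery A']
```

In practice one would use a GIS library and account for positional uncertainty in both the flood mapping and the facility locations, as the post's "About the data" section notes.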

Chemical facilities potentially exposed to flooding

Hurricane Harvey may have exposed more than 160 of the EPA’s Toxic Release Inventory sites, 7 Superfund sites, and 30 facilities registered with the EPA’s Risk Management Program to flooding.

The Gulf Coast is home to a vast chemical industry. The EPA’s Toxic Release Inventory (TRI) program lists over 4,500 facilities in Texas and Louisiana alone that are required to report chemical releases to the environment.

Before the storm hit, many facilities shut down preemptively, releasing toxic chemicals in the process. In the wake of the storm, explosions at Arkema’s Crosby facility highlighted the risks that flooding and power failures pose to the region’s chemical facilities and, by extension, the health of the surrounding population.

In the Houston area, low-income communities and communities of color are disproportionately exposed to toxic chemicals. Our analysis shows that over 160 TRI facilities, 7 Superfund sites, and over 30 facilities registered with EPA’s Risk Management Program were potentially exposed to floodwaters. Though most of the impacts from this exposure remain unknown, the risks include compromised facilities and the release of toxins into the air and receding floodwaters.

Energy infrastructure

In the week since Hurricane Harvey reached the Texas coast, disruptions to the region’s energy infrastructure have caused gas prices to rise nationally by more than 20 percent.

Our analysis finds that more than 40 energy facilities may have been exposed to flooding, potentially contributing to the fluctuations in gas prices around the country. As of yesterday, the EIA reports that several refineries have resumed operations while others are operating at reduced capacity.

More than 40 energy facilities, including power plants and refineries, may have been exposed to Hurricane Harvey’s floodwaters.

Wastewater treatment

Wastewater treatment facilities comprise the bulk of the facilities (nearly 430) that we identify as potentially exposed to flooding. The EPA is monitoring the quality and functionality of water systems throughout the region and reported that more than half of the wastewater treatment plants in the area were fully operational as of September 3.

With floodwaters widely reported as being contaminated with toxic chemicals and potent bacteria, wastewater treatment facilities are likely contending with both facility-level flooding and a heightened need to ensure the potability of treated water.

Nearly 430 wastewater treatment facilities may have been exposed to flooding during Hurricane Harvey.

About the data

It is important to note that the satellite data showing flood extent are still being updated by the Dartmouth Flood Observatory. We will continue to get a better handle on the extent and depth of flooding as additional data, such as high water marks from the USGS, become available.

As of Tuesday, DFO Director Robert Brakenridge stated in an email that they believe the data to be fairly complete, including for the Houston area, at a spatial resolution of 10 meters. Given uncertainties in the flood mapping as well as in the exact locations of each facility, it is possible that this map over- or underestimates the number of affected facilities. It is also possible that facilities, while in the flooded area, were protected from and unaffected by floodwaters.

Make Public Engagement a Professional Priority

During graduate school, I believed my responsibility as a scientist during outreach events was to share my work with as many non-scientists as possible. I assumed that my extroverted personality, boundless enthusiasm, and booming voice guaranteed my success at public outreach. I never considered improving or diversifying my communication skills, nor did I value the unique perspective that I might bring to science.

Like so many others, it wasn’t until the November 2016 election that I considered how I, the daughter of Indian immigrants from landlocked villages and modest means, came to study oceans and climate change. From this foundation, I gradually developed and now execute two public engagement aims that often intersect:

1. Showing how the observations I make in the lab and field percolate into the communities around me.

2. Elevating the concerns facing marginalized communities, especially within science.

These efforts do not always take the same form, nor are they easy to pursue—certain issues can be especially difficult to write about—but I have seen that sharing the painful stories of minority scientists increases the scientific community’s capacity for empathy, and that communicating stories of innovation and progress in the battle against climate change imbues optimism and facilitates action.

Outside of my current position as a technician at UC Davis’ Bodega Marine Laboratory, I work with a local organization dedicated to raising awareness about climate change and a national organization committed to talking about the issues confronting self-identifying women scientists. I also serve on the digital advisory board of a regional publication that is seeking to add diverse voices to conversations about natural science.

Public engagement is a scientist’s implicit responsibility and can be beneficial for the public and scientist alike

Public engagement is often seen as a low priority for academic scientists. Many scientists do not feel compelled to take their research outside of academia. Common justifications include that developing resources for public engagement siphons time and energy from research, misrepresentation in the media could damage reputations, or institutions lack incentives for engagement. While these concerns are understandable, reserving our findings for our colleagues limits the impact of our work.

As scientists, we strive for intellectual products that improve and enhance our understanding of the world around us. Tools for effectively communicating to technical and lay audiences are not in opposition, nor are they as disparate as many may think; thoughtful, clear, and succinct communication tools are ubiquitously useful. By carefully considering audiences beyond our target journals and scientific societies, we create opportunities to develop unique collaborations that can result in the co-production of knowledge.

Effective public engagement is manifold, but requires experimentation

In this era of technology and social media, successful public engagement does not necessarily require face time (although you can use FaceTime or Skype A Scientist). Public outreach often encompasses classroom visits, laboratory open house events, and public talks/demonstrations. While personal interactions are inarguably priceless, these activities are generally eschewed in favor of research due to their high time commitment. This is where digital media can intervene.

During the era of MySpace, Friendster, and LiveJournal, the concept of ‘blogging’ emerged—an opportunity for anyone with an opinion and a keyboard to share it. While these ancestral social media sites have faded, blogging has been transformed into an opportunity to use our voices (and fingers!) to reach new audiences. Websites like Medium and WordPress make blogging accessible, and many website building/hosting services seamlessly integrate blogging into their platforms. The time commitment is dictated by the blogger and the topics they choose to communicate, and many academics, myself included, will admit to initiating and then abandoning a blog for this very reason.

Conversely, Facebook, Twitter, and Instagram—among many, many others—provide approachable, yet professional interfaces for casual and concise communication. While a short orientation may be required to acquaint yourself with these platforms, their rewards are bountiful. Through Twitter alone, my professional network has expanded geographically as well as across disciplines and industries (a Twitter interaction instigated this very blog post!). While I maintain a blog series with pie-in-the-sky long-term goals, I find that ephemeral, short-term social media interactions can sometimes be more professionally productive per unit of effort and therefore serve as an excellent gateway into public engagement.

Identify what motivates you to speak up and connect with your community

The November 2016 election was my catalyst for public engagement, but has not been my sole motivator going forward. Specifically, blogging has been an incredible learning experience for me, providing insight on the complexity of people, and the pressure that academia puts on those who don’t conform to its rigid framework.

Public engagement is not a part of my formal job description, but it is something that I make time for outside of my 40-hour work week. As scientists, we are driven by questions and certainly find our own work compelling. But we must unravel these complex questions and stories and find the thread that links us with our communities.


Priya Shukla is an ocean and climate scientist with the Bodega Ocean Acidification Research (BOAR) group based at UC Davis’ Bodega Marine Laboratory. She received her undergraduate degree in Environmental Science and Management at UC Davis and earned her Master’s in Ecology from San Diego State University. Priya uses science communication to bridge issues concerning social justice, rapid environmental change, and the scientific community. 

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Why Congress Should Put the “Nutrition” Back in Nutrition Assistance

Photo: USDA

Despite messages of economic populism, the Trump administration and its Congressional enablers have not been kind to the millions of Americans who struggle to make ends meet. From attacks on affordable health insurance and a living wage to tax cuts for the wealthy and worker protection rollbacks, they’ve made clear where their allegiance lies.

Now, the nation’s leading food assistance program for low-income individuals and families is on the chopping block. As with so many other policy proposals, that would not just be cruel but also short-sighted, new research suggests.

The Supplemental Nutrition Assistance Program (SNAP) is an effective response to poverty and food insecurity, lifting an estimated 4.7 million people out of poverty in 2014—including 2.1 million children—and even stimulating the economy during our most recent economic downturn. Still, the White House and some House Republicans appear eager to cut benefits and enact new (but largely unnecessary) work requirements.

In response, a new study published today in the Journal of Nutrition Education and Behavior shows that rather than cutting the SNAP program, Congress would be wise to increase its investment to better promote healthy eating among recipients. That’s because the study’s authors found that current benefit levels fall short of supporting a healthy diet, including the recommended daily intake of fruits and vegetables. And that’s not just bad for SNAP recipients, but for all of us, as it leads to greater costs from preventable diet-related diseases down the line.

Updating the costs of a MyPlate diet

The authors (full disclosure: they’re UCS senior economist Kranti Mulik and former UCS health analyst Lindsey Haynes-Maslow, now an assistant professor at North Carolina State University) sought to fill an important knowledge gap, informing policy makers of the true cost of healthy eating for individuals and families today. In 2011, the US Department of Agriculture (USDA) calculated the cost of various eating plans based on its “food pyramid” (the federal dietary guidelines before 2010). The USDA has used its resulting “Thrifty Food Plan” to determine SNAP benefit levels ever since, but it’s now out of date.

The present study is an important update in two ways. First, it calculates the cost of following today’s federal Dietary Guidelines for Americans, represented visually by the USDA’s MyPlate graphic, in which half of a person’s daily “plate” consists of fruits and vegetables. And second, Mulik and Haynes-Maslow considered the cost of labor to prepare food, an important but previously overlooked consideration.

Using the most current retail price data available from the USDA, Mulik and Haynes-Maslow documented the full monthly cost of following MyPlate, creating several scenarios in which individuals and families could meet that guideline with fresh, frozen, and/or canned produce. Then, they compared the cost of the various healthful eating scenarios to current SNAP monthly benefit levels.

The upshot? The benefits don’t even come close to covering the costs.

Of course, the very name of the program—the Supplemental Nutrition Assistance Program—indicates that it isn’t meant to fully cover recipients’ monthly food budgets. The study design took that into account, assuming a “benefit reduction rate” of 20 percent—the percentage of food costs that SNAP participants pay for themselves, according to previous research.

So how much additional SNAP support would struggling families need in order to eat a consistently nutritious diet? The authors found that a hypothetical household (two adults, one child 8-11 years old, and another child 12-17 years old) would need an additional $627 per month to eat a healthy diet in accordance with MyPlate.
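The benefit-gap arithmetic behind a finding like this can be sketched in a few lines. The 20 percent benefit reduction rate comes from the study's stated design; the monthly MyPlate cost and benefit figures below are hypothetical placeholders for illustration, not the study's actual data.

```python
# Back-of-the-envelope sketch of a SNAP benefit-gap calculation.
# The benefit reduction rate is the share of food costs SNAP participants
# are assumed to pay out of pocket, so SNAP is expected to cover the rest.

BENEFIT_REDUCTION_RATE = 0.20  # share of food costs paid by the household

def monthly_shortfall(myplate_cost, snap_benefit):
    """Gap between what SNAP is expected to cover (80% of the healthy-diet
    cost) and the benefit actually received. Positive means a shortfall."""
    snap_share = myplate_cost * (1 - BENEFIT_REDUCTION_RATE)
    return snap_share - snap_benefit

# Hypothetical household: $1,200/month to follow MyPlate, $650/month in benefits.
gap = monthly_shortfall(1200.00, 650.00)
print(f"Monthly shortfall: ${gap:.2f}")  # -> Monthly shortfall: $310.00
```

The same structure, applied to the study's real price data and benefit schedules across fresh, frozen, and canned produce scenarios, yields the shortfalls the authors report.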

This is a significant shortfall. And it’s an important finding, because researchers who study SNAP already know that recipients’ monthly benefits frequently run out before the end of each month. In a UCS policy brief published earlier this year, we noted: “Data indicate that household food bills frequently exceed the USDA Thrifty Food Plan standard costs used to determine benefit amounts, which may reflect inaccurate assumptions about geographic price variation, food preparation time, households’ ability to access food outlets, and the percentage of household income spent on food.”

The “N” is for “nutrition”

SNAP is intended to do more than just feed people. While many Americans fail the healthy eating test—fewer than 1 in 10 Americans meets recommendations for fruit and vegetable intake—it can be particularly difficult for low-income households, which not only lack financial resources, but also face more barriers to accessing healthy foods. And although half of all Americans now live with a diet-related chronic disease, the burden of poor health disproportionately affects low-income populations and communities of color.

If SNAP is truly to be a “nutrition” program, it should do more to facilitate good nutrition for participants and their families.

Raising SNAP benefits would be good for us all…and voters support it

Healthier eating would deliver significant benefits for that population—less obesity and diet-related illness, and fewer lost work and school days. But it would also come with a payoff for the nation’s health broadly and for taxpayer-funded healthcare programs, including Medicare and Medicaid.

A 2013 UCS analysis found that increasing Americans’ consumption of fruits and vegetables could save more than 100,000 lives and $17 billion in health care costs from cardiovascular disease (CVD) each year. And a recent study from researchers at Tufts University and colleagues in the UK found that a 30 percent fruit and vegetable subsidy targeting SNAP recipients would avert more than 35,000 CVD deaths by 2030, and would reduce disparities in CVD rates between SNAP recipients and the general population.

Moreover, a recent survey of more than 7,000 American voters conducted by researchers at the University of Maryland found that large bipartisan majorities (78-81 percent) supported substantial increases in SNAP benefits, while 9 out of 10 (including 8 in 10 Republicans) favored providing discounts on fruits and vegetables bought with SNAP benefits. (Respondents also agreed with proposals to restrict the use of SNAP benefits to purchase sugary foods and beverages.)

So while the White House and members of Congress seek to balance budgets on the backs of the most vulnerable among us, their constituents support policies that make it easier for low-income Americans to eat a healthy diet. With the next five-year Farm Bill putting the question of SNAP funding back on the table, Congress and the White House should do just that.



Superfund Sites and the Floods of Hurricane Harvey: Foreseeable or an “Act of God”?

Photo: Patrick Bloodgood/US Army

Superfund sites contain some of the most dangerous chemicals known to humankind. It has been confirmed that Superfund sites in the Houston area were submerged by the floodwaters of Hurricane Harvey. Does this mean these hazardous chemicals were swept off of Superfund sites and into neighboring communities where people live, play, and work? If so, who will be responsible for cleaning up such a disaster?

There are approximately a dozen Superfund sites in the Houston area that could have been compromised by floodwaters. The Associated Press surveyed seven of these sites and found signs of inundation, and the Environmental Protection Agency (EPA), using aerial images, has reported that 13 of 41 Superfund sites in the region were flooded. Nancy Loeb, director of the Environmental Advocacy Center at Northwestern University’s Pritzker School of Law, cautioned that “If the water picks up contaminated sediment from [Superfund] sites, that may get deposited in areas where people frequent—residential properties, parks, ballfields—that were never contaminated before. We can’t say for sure it will happen, but it’s certainly a possibility.” Residents of the area are rightfully concerned about being exposed to these dangerous toxins potentially leaking from Superfund sites.

Superfund sites in Houston are vulnerable to flooding

Extreme temperatures, sea level rise, thawing permafrost, more frequent heavy precipitation events, increased flood risks, more frequent and intense wildfires, and more intense hurricanes—all of these impacts are expected under climate change, and the EPA’s Office of Land and Emergency Management (OLEM; formerly the Office of Solid Waste and Emergency Response) considers them threats to its programs, as stated in its climate change adaptation implementation plan. The Superfund program, which ensures that sites littered with hazardous (often cancer-causing) chemicals get cleaned up, may be particularly vulnerable to these climate change impacts.

Photo: SenseiAlan/CC BY 2.0 (Flickr)

In a national-level vulnerability analysis, the EPA found that Superfund sites are particularly vulnerable to climate change related flooding and chronic inundation, impacts that would likely result in the “loss of remedy functionality and effectiveness indefinitely” with the possibility of hazardous chemicals being released from the site. We know that Superfund sites have flooded and overflowed previously, during Hurricane Katrina and Post-Tropical Cyclone Sandy. In 2011, the floodwaters of Hurricane Irene led to the release of benzene (a highly carcinogenic substance) beyond the protective barriers of the American Cyanamid Superfund site in New Jersey. In response, the EPA took action to reinforce the site’s infrastructure to withstand similar flood heights in the future.

Of course, impacts related to climate change are not solely to blame for the devastating flooding seen in Houston. The Houston area is relatively flat and only about 50 feet above sea level, with about a 4-foot difference between the highest and lowest elevations downtown, which means that when rain falls, the water takes a long time to drain out of the city. Houston’s urban sprawl is also partly to blame—the replacement of wetlands with impermeable pavement leaves less land to soak up water after a rainfall event. Additionally, much of Houston’s infrastructure is built only to withstand the flood heights associated with a 100-year flood, and the rains of Hurricane Harvey produced flood heights associated with a 1,000-year flood. And then there is Houston’s location on the Gulf of Mexico, which puts it in the path of slow-moving storms that can dump enormous amounts of rain on the city. All of these factors make Houston a flood-prone area.

Who’s responsible if extreme events, made more likely by climate change, result in the release of hazardous chemicals from Superfund sites that harm public health?  

The threats of climate change could alter the liability of responsible parties under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), the federal hazardous waste law under which the Superfund program is regulated. CERCLA provides three defenses to strict liability for releases of hazardous substances. The potentially responsible party (PRP) must prove that the release was “caused solely” by (1) an act of God, (2) an act of war, or (3) an act of a third party.

Environmental attorneys expect that the “act of God” defense will be used more often as natural disasters become more common and severe with climate change. Under CERCLA, an “act of God” is defined as, “an unanticipated grave natural disaster or other natural phenomenon of an exceptional, inevitable, and irresistible character, the effects of which could not have been prevented or avoided by the exercise of due care or foresight.”

Were the catastrophic floods of Hurricane Harvey an “act of God?”

Some types of extreme events are more likely to occur with climate change, including heavier precipitation events (see Fig 1-08) that result in flooding. Additionally, it is expected that hurricanes will become more intense as a result of climate change. As my colleague Brenda Ekwurzel points out, global hurricane models that compare the past three decades with further climate change (RCP 4.5) toward the end of the century show an increase of average hurricane intensity, precipitation rates, and the number and occurrence of days with intense category 4 and 5 storms. However, hurricanes that develop over the U.S. North Atlantic Ocean may not necessarily fall in line with this global trend. We also know that sea levels are rising, increasing coastal flood risks.

We will continue to understand more as extreme weather scientists investigate the relative contribution of climate change to the impacts of Hurricane Harvey (via attribution science), but it is important to note that we already know that extreme floods are more likely to occur under global climate change scenarios. In other words, extreme flood events are foreseeable (i.e., not an “act of God”). This is the reason why Executive Order 13690, which had bipartisan support, created more stringent federal flood risk management standards that took into consideration the link between climate change and floods to increase the resilience of federal infrastructure. President Trump rescinded this order only weeks before Hurricane Harvey hit.

What would the courts do if hazardous chemicals were found to escape a Superfund site, harming the people of Houston, in the devastating wake of Harvey?

Prior cases may give us an indication. A 2014 decision by the U.S. Court of Appeals for the Second Circuit, in which the “act of war” defense relieved World Trade Center owners and lessees of Superfund liability for toxic dust that infiltrated a building blocks away after the 9/11 terrorist attacks, could be cited for the proposition that hazardous chemical releases from Superfund sites during extreme events should be considered “acts of God.” That is because the court, in dicta, likened the attacks to a tornado, an extreme event that would qualify as an “act of God.” “It would be absurd to impose CERCLA liability on the owners of property that is demolished and dispersed by a tornado,” the court said. “A tornado, which scatters dust and all else, is the ‘sole cause’ of the environmental damage left in its wake.”

Professor Michael Gerrard, director of Columbia Law School’s Center for Climate Change Law, said that comparisons between terrorist attacks, tornados, and climate change would be dubious: “A tornado or terrorist attack in a particular location is not foreseeable; not so with sea level rise and the associated coastal flooding.” Gerrard said that “whether or not the act of God or third-party defenses are available will depend in part on whether adequate precautions were taken against foreseeable risks.” So far, the “act of God” defense has been disfavored by courts when toxins have leaked from sites during extreme weather events—these decisions may bode well for future decisions on predictable climate change impacts to Superfund sites.

Regardless of who is or is not responsible for such harm, we hope that the EPA is working to make Superfund sites more resilient to the impacts of climate change, especially as extreme floods become more common in the future.

Are you in a flood danger zone from Irma or José? Find out with this map and keep yourself and your loved ones safe

Irma, September 5. Photo: US Naval Research Laboratory

We have barely recovered from the scare of the destruction Hurricane Harvey caused in Houston, and Irma is already upon us, with José seemingly close behind. Concerned about the path of these two dangerous storms, I set out to create a map you can use to assess the risk that the place where you live will flood.

Enter your address or town and the map will show the area where you live and whether it falls within a zone classified by FEMA (the federal emergency management agency) as having a one percent annual risk of a flood event. The definition is a bit complex, but here is a video that explains it. I also included a classification of the social and economic vulnerability of Puerto Rico’s population based on an analysis by the Centers for Disease Control and Prevention (the Social Vulnerability Index). Areas shaded in darker green indicate zones where the characteristics of the population (income, household size, elderly and/or disabled residents, etc.) make people more vulnerable to environmental hazards like hurricanes. Of course, having a home outside the flood zones does not mean you should let your guard down. These analyses are based on historical data, and because of climate change, rainfall from recent hurricanes may far exceed that of earlier events. This is the most immediate lesson we can learn from Harvey, and these storms must be taken very seriously.

Access the map: http://arcg.is/2f0gyGz

In addition, in collaboration with the Center for Interdisciplinary Geospatial Information at Delta State University, we have created a series of maps to assist with search and rescue efforts in disaster situations.

This map shows the type of information needed to pinpoint the location of a person in need of rescue or aid.

These maps were created in part by my mentor and colleague Talbot Brooks, who was part of a team that in 2005 identified the need for maps with common cartographic scales and coordinate systems; the maps available at the time of Hurricane Katrina that year made it difficult to coordinate rescue efforts across multiple agencies and government bodies. Anyone with a mobile device can visit https://usngapp.org/ to obtain their location and communicate it to a 911 operator, for example.

Given the imminent danger posed by Irma—and possibly José—we offer the following recommendations to Governor Rosselló and reiterate our commitment to collaborating with the government of the Commonwealth of Puerto Rico to create tools that will be useful during and after hurricanes:

  • Activate and/or establish contacts with FEMA or Department of Homeland Security personnel who can use the USNG maps we are developing. These maps helped save many lives during Katrina.
  • Make a formal request for assistance to the “Stand-by Task Force” to deploy the Ushahidi geospatial tool to facilitate crowdsourced rescue operations in disaster situations.

Connecting the Dots on Climate Science: The Importance of a Complete Science Narrative

A 2014 session of the Intergovernmental Panel on Climate Change (IPCC)—a crucial "dot" in a connected climate science narrative. Photo: IPCC (Flickr)

In Walter M. Miller’s classic apocalyptic novel, A Canticle for Leibowitz, an atomic holocaust leaves the world in a modern version of the Dark Ages. In this post-apocalyptic world, books are burnt and cultural information destroyed by anti-intellectual mobs. The monks of a small knowledge-hoarding religious institution try to preserve, understand, and control what information remains.

A few pre-apocalyptic scraps of paper are unearthed, and the writing on them is transformed into holy artifacts. One of these is The Sacred Shopping List, a handwritten memo by a long-dead engineer named Leibowitz that reads: “Pound pastrami, can kraut, six bagels—bring home for Emma.” With so little information available, The Sacred Shopping List becomes a pillar of the monks’ religious narrative. The general population simply follows the outlandish narrative the monks concoct from the holy artifacts.

A Canticle for Leibowitz depicts a classic problem that arises when people try to collect information to develop a narrative. The monks connected only the dots they could find, not having a clue that they had collected an incomplete and misleading set of dots. In science terms, the sample size was both small and biased, leading to their outlandish narrative.

In our own world, a functional science narrative requires a complete and well-chosen set of dots that scientists can connect into a sensible whole. Over the last few weeks, the Trump administration has eliminated several important dots (sources of information) about science, some openly and others more discreetly. Strung together, these losses are significant: they may drive us back to developing our science narratives from our own Sacred Shopping List.

Climate science under threat

In 1989, President Reagan established the Federal Advisory Panel for the Sustained National Climate Assessment to help translate findings from the National Climate Assessment into concrete guidance for public- and private-sector officials making decisions about how to deal with climate change. This important National Oceanic Atmospheric Administration (NOAA) advisory group, a critical dot in this science narrative, was eliminated a few weeks back when the current administration chose not to renew the charter.

NOAA recently completed the first draft of the Fourth National Climate Assessment report, intended as a special science section of the National Climate Assessment, which Congress mandates every four years. NOAA does not control how the information in the report will be used, and the loss of the Federal Advisory Panel leaves a vacuum in developing the guidance that should flow from the report. The draft is under review by the current administration, which must approve it before it can be published. It is critical that this important source of information inform the science narrative about climate change.

The administration has also proposed a 2018 federal budget that zeroes out the United States’ nearly $2 million contribution to the Intergovernmental Panel on Climate Change (IPCC). The IPCC is crucial to coordinating efforts of several thousand scientists, industry experts, nonprofit researchers, and government representatives from across the globe who review reports that provide climate analysis for decisions ranging from the Paris climate agreement to the US military’s national security threat assessments. Eliminating funding for the IPCC would leave US scientists out of important scientific discussions and inhibit our country’s—and the world’s—ability to respond to climate threats.

Our congressional leaders will play a key role in determining whether IPCC funding will continue in the 2018 fiscal year. We need our lawmakers to uphold the United States’ climate leadership and commit to supporting funding for the IPCC. This is one dot we can’t afford to lose.

A path forward

One bright spot comes from our military, which continues to acknowledge the role of science in developing infrastructure that is resilient to the increasing national security risks from climate change. This is where the rubber meets the road. The US military is already dealing with threats from growing numbers of refugees fleeing affected areas and with the numerous coastal military installations that will be impacted by climate change. It is using a sound set of dots to make its decisions.

We all benefit from these critical “dots” and need them to be connected into a sensible science narrative. Many scientists, myself included, strongly support continued funding for the IPCC, bringing back the Federal Advisory Panel, and protecting federal climate science and research. It is important to use the knowledge of experts to evaluate and compare information, and for those experts to be part of the plan for how the information is integrated into policy. The science narrative cannot rely on The Sacred Shopping List.


Keith Daum is an independent researcher who specializes in climate and environmental issues. In 2014, he retired from the Idaho National Laboratory after 24 years there. Previously, he served as a research scientist with RTI International. He holds a PhD in Chemistry from the University of Idaho and a BS/MS in Environmental Sciences from Texas Christian University.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

5 Things Congress Can Do to Help Communities Devastated by Hurricane Harvey

Photo: Texas National Guard/CC BY (Flickr)

As a proud Houstonian with family living in Houston, I was heartbroken to see my hometown devastated by Hurricane Harvey and the apocalyptic flooding that will surely bring other serious health and safety problems long after the water recedes. The true cost of this disaster won’t be known for some time, but in the midst of the devastation we’ve seen Americans at their best: coming together, taking care of each other, loving each other, doing right by their fellow brothers and sisters; all of us Americans, but more importantly all of us human. Now it’s time for Congress to step up and follow that example.

People need help, and they don’t care about political or fiscal ideological purity. They need money and other support services so they can begin to rebuild their lives and prevent future storms from laying waste to the city again. So what can Congress do to help Houston and other communities devastated by Hurricane Harvey when it returns from August recess?

  1. Pass Near-term Disaster Aid

The first thing they can do is pass a clean short-term disaster relief package (one without ideological amendments or special interest spending) while making sure to continue to work on more long-term funding and solutions long after this storm has dropped out of the headlines. And Congress should work to target aid to Texas and Louisiana’s most economically vulnerable populations.

At a time like this, the worst thing Congress can do is slow the process of disaster relief by quibbling over fiscal policy. It took several months to pass the $50.7 billion Hurricane Sandy aid package because of disagreements over certain types of spending in the bill. Most of the Texas congressional delegation put rigid political and fiscal ideology before their fellow Americans in need and shamefully voted against disaster aid for Sandy victims. This time, we need Congress to pass an immediate bipartisan short-term relief package to help Houston until the damage from this disaster can be accurately quantified and a longer-term package thoughtfully put together.

The truth is much of the Texas congressional delegation doesn’t deserve the support of the American taxpayer because of their votes on Sandy aid… but the people of Houston do.

  2. Extend and Reform Flood Insurance

Congress needs to finally do the tough work of tackling the vital but fiscally-challenged National Flood Insurance Program (NFIP), which provides affordable, government-backed flood insurance to roughly 5 million property owners. At present, the NFIP has elements that actually work as a disincentive to smart development/building practices or preparedness efforts because in many cases flood risk maps do not reflect true risks, and the cost of this insurance is artificially low, subsidized by the US taxpayer. While that may seem helpful to coastal property owners and folks living in flood plains, the reality is that it leaves them ill prepared for growing flood risks.

The NFIP must be reformed and strengthened to build resilience in flood-prone areas, including incentives for investments in measures that can help reduce flood risks for individual home owners and communities. We need to phase in increases to these insurance premiums so they come into line with the growing flood risks we are seeing from coastal erosion, sea-level rise, and extreme precipitation exacerbated by climate change. Any NFIP reform must also include affordability provisions to make sure that the increased cost of insurance doesn’t unfairly burden our most economically vulnerable communities.

If flood risk maps and insurance costs more accurately reflected flood risks based on the latest scientific information, property owners and developers would likely make wiser decisions regarding where they build, how they build, and how much they invest in infrastructure that supports disaster preparedness.

To accurately account for flood risk we also need to update FEMA’s antiquated flood risk maps using the latest science, and that means Congress must pass appropriations to fully fund the National Flood Mapping Program.

  3. Strengthen Flood Risk Standards

About a year and a half ago, President Obama issued an executive order strengthening the Federal Flood Risk Management Standard (FFRMS), which mandated that federal agencies use more protective design standards when building or rebuilding in flood-prone areas. This was basically just common sense, with potentially big savings for US taxpayers, who foot the bill for federal construction (and for repairs to federal infrastructure that communities depend on).

But on August 15th of this year, just about a week before Hurricane Harvey hit, President Trump issued his own executive order repealing the strengthened FFRMS citing over-burdensome regulations. With climate change helping to increase the ferocity of storm events, as well as flooding from rising sea levels and heavy precipitation, we must start to prioritize the implementation of strong standards and funding for disaster preparedness.

Implementing science-based building standards like the FFRMS saves lives and money. Congress should take this out of the hands of the executive branch by codifying the strengthened FFRMS into law.

  4. Strengthen Chemical Safety Standards

The Arkema facility explosion, resulting from hurricane-caused power outages, is but one example of the harrowing consequences that ignoring vital updates to chemical safety regulations poses in real time. Chemical facilities like the Arkema facility in Crosby, Texas surround frontline communities and put them at an additional risk of toxic exposure. In addition, first responders, like the hospitalized deputies who entered the Arkema facility, are at particular risk when these facilities fail to coordinate with emergency management services.

After a three-year rulemaking process, the final Risk Management Plan rule was published under the Obama administration with an effective date of March 14, 2017. The rule contained basic, common-sense updates: better coordination with emergency personnel and first responders, better information sharing with local communities and planners, and internal research into safer technologies that could reduce the impact of shutdowns and disasters. On June 14, 2017, the Trump administration delayed these measures until February 19, 2019. The communities on the fence line of chemical facilities like Arkema would have benefited from this rule, and the administration should rescind its final rule delaying the implementation of this critical public health and safety protection.

If the president refuses to implement the Risk Management Plan rule, Congress should take legislative action to modernize and strengthen chemical facility safety.

  5. Fund Science, Technology, and Preparedness

If we had a time machine and could send the president back in it, do you think he might reconsider his proposed $667 million cut to FEMA state and local grants that support disaster relief? Perhaps House appropriators would also like to go back in time, seeing as they recently passed a budget that significantly cut NOAA’s satellite capacity and NASA’s earth science program—both of which are critical to our ability to forecast and prepare for disasters from storms and other natural phenomena.

The FY 2018 budget provides Congress a real opportunity to build on lessons learned from Hurricane Harvey, make wise decisions to help protect communities, and ensure taxpayer money is well spent. With the House set to take up 8 spending bills this week (including the budgets of FEMA, NASA, and NOAA), now is a critical moment. Here are some vital programs that Congress must fully fund or increase to help us prepare for and protect communities from the next extreme weather event:

  • Reject the House cut to NASA’s Earth Science Program; 11% cut
    • The Earth Science Program of the National Aeronautics and Space Administration (NASA) develops, launches, and maintains a network of satellites that collect data on Earth’s surface and atmosphere. Scientists, researchers, and individuals across the country rely on this data, and the continuity of its collection, to understand, forecast, and respond to changes in land use, pollutant emissions, atmospheric chemistry, weather and climate, and other phenomena that define life on Earth. This data and research is critical to improving predictive capacity that impacts agricultural commodities, water management, infrastructure, risk assessment for the reinsurance business, public health and safety, and national security.
  • Reject the House cut to NOAA’s National Environmental Satellite, Data, and Information Service (NESDIS); 23% cut
    • NESDIS’s Environmental Satellite Observing Systems program procures, launches, and operates satellites that provide critical data and information products to scientists, weather forecasters, and first responders. NESDIS’s satellites provide 93 percent of the data used by the National Weather Service’s models and support high-resolution, near-real-time weather and storm tracking. NESDIS also operates the Search and Rescue satellite system and monitors global sea ice conditions and other essential safety information.
  • Increase funding for FEMA Disaster Relief and Pre-Disaster Mitigation Programs (President Trump proposed cutting FEMA disaster preparedness funding by 61%)
  • Reject the House cut to NOAA’s Climate Research at their Office of Atmospheric Research (OAR); 19% cut
    • The OAR supports a network of laboratories, universities, and cooperative institutes across the country studying ocean acidification, aquaculture, severe weather, climate change, and other Earth processes. The Climate Research program studies short- and long-term climate trends and develops information and products for decision-makers and communities working to plan for and respond to climate change. We know that warmer air holds more moisture, and that warmer water makes hurricanes more likely and more intense. The link between climate change and extreme weather is profound, and we cannot prepare for one without understanding the other.

If you care about helping Texas and Louisiana recover quickly, and want to prevent the next disaster from taking a brutal toll on your community, let your members of Congress know that funding the vital work mentioned above is important to you. Drop by your representative’s office to remind them, or give them a phone call. Don’t let Congress write a check for disaster relief while ignoring the science-based programs and technologies that help us forecast and prepare for extreme weather and keep us safe.


We Must Protect the Workers Who Will Rebuild after Hurricane Harvey

Workers make repairs in the aftermath of Hurricane Harvey. Photo: US Coast Guard/Petty Officer 3rd Class Nathan Cox

Storm waters in the greater Houston area are subsiding and the scale of devastation and destruction is staggering. The personal loss, pain, and suffering of families and impacted communities are immeasurable.

As the immediate crisis of saving lives and providing emergency aid and shelter to many thousands winds down, the daunting task of recovery, cleanup, and rebuilding of homes, businesses, and essential infrastructure begins. And, with my 25-plus years of work and experience in occupational health and safety, I am all too aware of the myriad hazards, exposures, and risks workers will be facing in this long-term effort.

Safeguarding workers’ health and safety must not be an afterthought.

The work: dirty, dangerous, and risky

Post-disaster recovery, cleanup, and reconstruction operations present a panoply of risks and dangers—with workers on the front lines.

Some workers will be charged with the highly hazardous task of getting the area’s oil refineries and chemical plants back online. Start-up operations can result in uncontrolled releases and explosions that place workers and surrounding communities at grave health and safety risk. The US Chemical Safety Board has issued a safety alert urging caution and providing a checklist for evaluating systems, tanks, instrumentation, and equipment before start-up.

Other workers will be working in and around the 13 highly contaminated Superfund sites that have flooded and sustained storm damage. As of this writing, the EPA reports that 11 additional Superfund sites remain inaccessible to response personnel, so the extent of damage is unknown.

And many if not most workers in the greater Houston area will be doing jobs that, at least in the short term, only compound the well-recognized hazards, exposures, and risks they generally encounter.

Hurricanes and superstorms like Harvey, Sandy, and Katrina just pile on additional hazards: mold, mold, and more mold; water contaminated with chemicals and waste; work in and around unstable structures; and carbon monoxide poisoning from generators used in poorly ventilated areas, an all-too-common event in post-disaster work. These are all on top of the falls, cuts, burns, amputations, and machine and musculoskeletal injuries that are all too frequent in today’s workplaces. And silica, asbestos, and lead just add to the mix of dangers involved in the demolition operations that will be ongoing in Houston. (You can also read my prior commentary on workplace injury, illness, and fatality tolls.)

The Occupational Safety and Health Administration (OSHA) has established protective health and safety standards for many of these hazards, and they remain applicable even during disasters.  Employers remain responsible for complying with these protections.

In the early days of a disaster, OSHA rightly focuses on compliance assistance (outreach, information, and training for employers and workers). But it should shift to enforcement as the immediate crisis passes. We have seen, for example, the consequences of a lack of enforcement of required respiratory protection after 9/11, leading to the illness and death of workers exposed to toxic dust.  Federal agencies have resources and information about these general hazards, as well as disaster-focused resources and information for employers, workers, and the public (including here, here, and here).

While helpful, information on a website is not enough; workers, communities, and the impacted public will need resources and action on the ground. And this will surely strain the capacity and resources of agencies that must continue to meet their existing responsibilities at the same time.

The workers

As they did in the aftermath of Katrina and Sandy, day laborers will comprise a significant portion of the clean-up and reconstruction efforts in Houston. Homeowners (already stressed by their losses) and contractors alike will be looking for workers to remove debris, pump out water, remove and remediate mold damage, and demolish and renovate structures.

Houston has a large population of day laborers and low-wage workers, many undocumented, many Latino. These workers—who will likely be the mainstay of Houston’s recovery efforts—face a host of additional challenges. Fearing discrimination, family separation, and even deportation, they may be unwilling to ask for protective equipment and training or report unsafe working conditions and wage and hour violations.

Health and safety research and statistics tell us that Latino workers experience higher rates of workplace injury and illness than other segments of the US workforce and significantly higher rates of workplace fatalities. According to the Bureau of Labor Statistics 2013 Census of Fatal Occupational Injuries, Hispanic or Latino workers were the only racial/ethnic group with an increase in workplace fatalities in 2013. The 797 Hispanic or Latino worker deaths constituted the highest total since 2008 and a seven percent increase over 2012.

Of course, in addition to day laborers, many other workers will be needed for the area’s long-term cleanup and recovery efforts. They, too, will face many of the same hazards and exposures as their counterparts who are picking up day jobs on street corners and in worker centers (like the Fe y Justicia Worker Center in Houston). All of these workers will be in it for the long haul, and they deserve the protection, training, and equipment they need to help their cities and communities recover and rebuild without sacrificing their own health and safety in the process.

Lessons learned?

We’ve seen what can happen in the aftermath of a disaster if we take our eye off the ball in protecting our rescue and recovery workers. Katrina, Sandy, 9/11—all have lessons to teach. Most are applicable to what’s facing workers who are on-the-ground now responding to Harvey and who will be there for years to come.

Avoiding the additional tragedy of workplace deaths, injuries, and illnesses in the recovery and reconstruction process will require the sustained attention of state, local, and federal agencies, employers and contractors, informed workers, and a vigilant public. This is made all the more difficult by the Trump administration’s all-out assault on workplace protections and its blatant disregard for the value immigrants bring to our nation’s labor force.

Now is the time to ensure that our existing worker safety and health protections are strong and are vigorously enforced; it is not the time to roll back or weaken these worker protections—or environmental protections for communities.

Now is the time to ensure that our worker health and safety agencies have the staffing and budgets they need to fulfill their statutory mandates to protect the nation’s workforce. It is not the time to cut budgets for agencies already stretched to the limit in trying to do so.

Now is the time to honor, value, and protect the workers who rush into harm’s way in the face of disasters and those who stay with it to repair, rebuild, and revitalize the communities those disasters hit. It is certainly not the time to turn the lives of immigrant and undocumented workers upside down.

Your voice matters. Let your elected leaders know that you expect them to support strong worker safety and health protections—and that you will hold them accountable if they don’t. Now is the perfect time. Here are some tips and resources that may be helpful.

Why Is Hurricane Irma Gaining Strength So Quickly?

In a world that is increasingly defined by superlatives, let’s start with this just-released statement from the National Hurricane Center: Hurricane Irma is the strongest hurricane on record in the Atlantic basin outside of the Caribbean Sea and the Gulf of Mexico, a potentially catastrophic storm tied for fifth-strongest ever in the Atlantic. And it follows on the heels of Hurricane Harvey, which gathered strength very quickly and dumped record amounts of rain on the Texas and Louisiana coasts.

If that doesn’t give you pause, I don’t know what would.

It is not just that records are being broken, it is the intensity with which they do so

Harvey underwent a very rapid strengthening from tropical storm to category 4 hurricane (in about 48 hours) and, as opposed to the usual weakening, gained strength as it approached landfall. It had so much rain associated with it that the National Weather Service had to create new colors for its precipitation maps in order to properly depict the amounts. These extreme rain events are becoming increasingly common, with a number of 500-year rain events (i.e., those with a 0.2% probability of occurring in any given year) that should be rare happening all over the country in a matter of months. Houston alone has now seen three since 2015.
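
To put the “500-year” label in perspective, here is a quick back-of-the-envelope calculation of how likely such an event becomes over longer time horizons. It is only a sketch under simplifying assumptions: years are treated as independent, and the 0.2% annual probability is held fixed (climate change may well be pushing that probability upward).

```python
# Chance of experiencing at least one "500-year" rain event over a span of
# years, assuming independent years and a fixed 0.2% annual probability.
def chance_of_at_least_one(annual_prob: float, years: int) -> float:
    return 1 - (1 - annual_prob) ** years

p_annual = 1 / 500  # a "500-year" event: 0.2% chance in any given year

for span in (1, 30, 100):
    print(f"{span:>3} years: {chance_of_at_least_one(p_annual, span):.1%}")
```

Over a 30-year mortgage, for example, this works out to nearly a 6 percent chance of seeing at least one such event, which is far from negligible even before accounting for rising risk.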

What is behind the intensity of these hurricanes, and the increase in precipitation observed in a variety of weather events?

Sea surface temperatures – the main hurricane fuel – have been on the rise

We know from studying hurricanes that many factors cause and drive them, but their main fuel is a warm ocean. Warm surface waters produce heat and water vapor. Hurricanes feed on and intensify from both, and the amount of rain they ultimately dump can be increased by both the higher availability of water vapor from the warm water and the fact that a warmer atmosphere can hold more of that available moisture (more on this here). Therefore, a trend of both increased intensity and rainfall associated with hurricanes can be expected, and in fact recent studies are in agreement with that (see here and here).

The waters off the coast of Texas were 2.7–7.2°F above average when Harvey intensified from a category 2 to a category 4 hurricane. The sea surface temperature where Irma was located as of the morning of September 5 appears to be at least 2.7°F above average, which may have played a role in its intensification to category 5.
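
The extra moisture available to these storms can be roughed out with a standard rule of thumb from the Clausius–Clapeyron relation: the atmosphere’s saturation water-vapor capacity grows by about 7% per 1°C of warming. The sketch below applies that rate to the sea surface temperature anomalies cited above, under the simplifying assumption that the overlying air warms by a comparable amount; the exact figures vary with conditions, so treat this as illustrative only.

```python
# Rough Clausius-Clapeyron estimate: saturation water-vapor capacity grows
# about 7% per degree Celsius of warming (an approximation; the true rate
# varies with temperature).
CC_RATE = 0.07

def extra_moisture_fraction(anomaly_f: float) -> float:
    """Fractional gain in water-vapor capacity for a temperature anomaly given in degrees F."""
    anomaly_c = anomaly_f / 1.8  # a Fahrenheit anomaly is 1/1.8 as large in Celsius
    return (1 + CC_RATE) ** anomaly_c - 1

# The sea surface temperature anomalies cited for the waters that fueled Harvey:
for anomaly_f in (2.7, 7.2):
    gain = extra_moisture_fraction(anomaly_f)
    print(f"+{anomaly_f}F anomaly -> roughly {gain:.0%} more available water vapor")
```

For the low end of that range this is roughly a 10 percent boost in available moisture; for the high end, roughly 30 percent. That extra vapor is both fuel for intensification and raw material for heavier rain.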

Should we expect hurricanes to be more intense then?

Studies have detected the influence of human-caused global warming both on the near-surface amount of water vapor and on sea surface temperatures in the tropical Atlantic Ocean, among other areas such as the western Pacific and South Asia. It is worth noting that China has recently had a series of powerful typhoons, Hato and Mawar being the latest to wreak havoc in that part of the world.

Emissions already in the atmosphere have committed us to a certain amount of warming in the near future, and oceans have been absorbing about 93% of this warming. Therefore, oceans have also locked in a certain amount of warming from the increase in greenhouse gases from human activities. It follows that yes, we may be seeing more intense hurricanes, dropping a lot more rain when they do come ashore.

Preparedness is key!

Heeding hurricane warnings and the directives of authorities, and reducing risk as much as possible when a hurricane is approaching, is the first priority. But in the long run it is also important for individuals and all levels of government alike to learn from each event and from the latest science; to build and upgrade in smarter ways now, reflecting the mounting risks; and, when disaster strikes, to rebuild in a way that improves community resilience for the long term, and for the next big one. Because the next big one will come. It’s just a question of when and where.

Science Advice in Action: Highlights from an EPA Science Advisory Board Meeting

Here at the Center for Science and Democracy, we have been writing a lot lately about the importance of federal science advice and defending the value of advisory committees like the EPA’s Board of Scientific Counselors and EPA Science Advisory Board (SAB) against threats of budget cuts and reform. When I saw that the full SAB would be meeting at the end of August, I jumped at the opportunity to attend so that I could share some highlights of the proceedings and go beyond the static information that meeting transcripts convey.

The SAB serves as the EPA administrator’s primary source of science advice, and the purpose of this meeting was for the group to discuss some ongoing work in person. This included a review of a draft SAB report on economy-wide modeling of the benefits and costs of environmental regulation; a review of a draft SAB report on the framework for assessing biogenic carbon dioxide emissions from stationary sources; and a review of a draft review of EPA’s draft risk assessment on a munitions chemical, Hexahydro-1,3,5-trinitro-1,3,5-triazine, known as RDX. That’s a lot of reviews! The SAB also heard a briefing from the EPA’s Center for Environmental Assessment and the Integrated Risk Information System (IRIS) on how they are operationalizing systematic review to increase transparency, efficiency, and access to risk assessment products.

The 47 members, both in the room and on the phone, were ready to dive into conversations that, for many of them, were slightly outside their bailiwicks. Expertise in the room ranged from agricultural economics to pulmonary toxicology to civil and environmental engineering. The issues ran the gamut from the viability and longevity of economy-wide modeling, and which research questions would help improve EPA’s ability to predict the economic benefits of its regulations, to the importance of training EPA staff to conduct better and more standardized risk assessments without adding years to their workloads.

Transparency and accessibility at the forefront

A strong commitment to transparency was a theme throughout the meeting. The first order of business was to announce that a couple of members would be recusing themselves because of potential conflicts of interest. One of these members is affiliated with ExxonMobil and therefore removed himself from the discussion about biogenic carbon dioxide emissions, since Exxon is likely to be financially affected by policy decisions on the matter. With clear disclosure of and checks on financial conflicts of interest, and a balanced composition, federal advisory committees like the SAB are able to operate harmoniously with members from a range of affiliations, including industry, the nonprofit sector, state government, and academia.

Accessibility is also a major consideration for the SAB, and its members seem genuinely interested in making their reports easier for general readers to digest and in improving the transparency of their processes. Not only was the SAB interested in the way its own reports are used by the public; its commitment to accessibility was also clear in its recommendations to the agency.

Members of the public did not need to register; there was simply a sign-in sheet before entering the conference room, and plenty of room to sit and watch this meeting of great science minds. There was even coffee available for all! Public attendees came and went as the day wore on, but at most times there were about 20 people in the room, most of them interested EPA staffers eager to hear advice from these trusted counselors.

The meeting began promptly in the morning and ran through the packed agenda smoothly, thanks to the Designated Federal Officer’s leadership—he’s the EPA staff person who is responsible for ensuring that the committee is operating according to the Federal Advisory Committee Act—and the SAB chair, who leads the committee through the agenda and facilitates the meeting.

Discussions on the different agenda items were organized and thoughtful, and there was opportunity for public comment on each piece so that interested parties could give the SAB their feedback. Several members of the public had comments on the biogenic emissions report, including some critiques of the short timeline given to the public to review the document. The opportunity for the SAB to hear from members of the public and refer to their testimony throughout the proceedings is incredibly valuable and a testament to the importance of these public meetings on a regular basis to keep the advisors grounded.

A few major takeaways

The day-and-a-half meeting was chock-full of acronyms (both for agency offices and for indecipherable chemical names) and highly technical scientific jargon, and admittedly not the easiest for a member of the public to follow fully without a two-month primer on the ins and outs of biogenic emissions. However, attending allowed me to understand fully what it means to have a group of the country’s premier scientists in one room discussing scientific issues. It’s an all-star team of brilliant minds. At the end of the day, the EPA administrator can feel the utmost confidence that questions posed to this group have been carefully considered and conclusions reached through pointed discourse.

These scientists certainly do not always agree, but they talk about their disagreements and call each other out if they feel that a characterization is incorrect or off the mark in some way. They try to look at each charge question from all sides and from all disciplines, so nothing is left out of the equation. They respect one another as colleagues and as friends. At the end of the last day, there was a touching toast to celebrate the SAB chair, whose term will be coming to an end in September.

Appointments to the SAB and other federal advisory committees are taken seriously by those afforded the opportunity, and it’s no wonder. What better way for scientists to challenge themselves in a new setting than by applying their expertise to support federal policy?

What is frustrating, however, is that Administrator Pruitt has so far not been involved with the SAB besides sending them a spring 2017 regulatory agenda, which includes EPA regulatory and deregulatory actions, for the SAB to review. The SAB plans to share some of their priorities with Pruitt and invite him to the next full board meeting. Seeing as Pruitt has already been a party to several incidents of sidelining science, including his plan for a “red team/blue team” exercise on climate change, he could clearly benefit from some science advice that doesn’t come directly from special interest talking points.

After hearing from the EPA’s IRIS division director and learning about how quickly staff has been implementing National Academies of Sciences recommendations to strengthen its review process, there was a motion from SAB members to inform Pruitt of the value of IRIS. This is especially timely given that this week, the House Science, Space, and Technology Committee, led by longtime IRIS critic Representative Lamar Smith, will hold a hearing to review the “integrity” of the division.

I would hope that those who have been critical of the SAB would attend a meeting so that the notion that these scientists are “conflicted” is laid to rest. Conflicts of interest are taken care of at the outset of the meeting, and the rest of the conversation has nothing to do with employer or political party or who’s occupying the White House. The only thing that matters to these individuals is being scientifically accurate and helping the agency to meet its mission. So next time you hear something or read something that categorizes SAB members as having agendas or lacking scientific integrity, try to remember that this job is actually just about the science. Let’s keep it that way.

3 Reasons Why You Should Care About Vehicle Efficiency and Emissions Standards

Merely typing “vehicle efficiency and emissions standards” feels like I’m prompting you to click off in search of the latest cat meme or 8,000th story on President Trump. But the next battle in the war for better vehicles looms, and you can help defend against automaker efforts to roll back a program they agreed to not so long ago.

Here are the top 3 reasons why you should care about the U.S. Environmental Protection Agency (EPA) “Request for Comment on Reconsideration of the Final Determination of the Mid-Term Evaluation of Greenhouse Gas Emissions Standards for Model Year 2022–2025 Light-Duty Vehicles” (aka federal vehicle efficiency standards) and what you can do about it.

Vehicle efficiency standards save money for all Americans, but especially low- to middle-income earners

Researchers at the University of Tennessee analyzed 34 years of consumer spending data and found that not only did households from all income levels save money because of improved vehicle efficiency, but low- to middle-income households saved a greater percentage of household income compared to higher earners. Better fuel efficiency saved an average middle-income family as much as $17,000 over the study period – even after households paid more for new and used cars equipped with fuel-saving technology. Vehicle efficiency standards, the researchers concluded, are therefore a true progressive (as opposed to regressive) policy because they benefit lower earners more than higher earners.

Interested in more of these findings? Check out this UCS fact sheet.

Without fuel efficiency standards, automakers would only make gas guzzlers

Free market advocates argue that fuel efficiency standards aren’t necessary. If there is demand for fuel efficient vehicles, then automakers will create a supply to meet that demand. While that sounds good in theory, in practice it doesn’t happen.

In the absence of federal standards, fuel efficiency largely stagnated (see below) and automakers proved reluctant to offer fuel efficient options outside of small sedans.

In response to the 1973 oil embargo, Congress established fuel economy standards for new passenger cars in 1975, taking effect with model year 1978. These standards were intended to roughly double the average fuel economy of the new car fleet to 27.5 mpg by 1985. No new fuel efficiency standards passed until 2007, when Congress set a target of at least 35 miles per gallon by 2020 and required standards to be set at maximum feasible levels through 2030. The standards now at issue cover vehicle model years out to 2025. Source: EPA 2016 Fuel Economy Trends Report, Appendix D: Fuel Economy Data Stratified by Vehicle Type. Available at https://www.epa.gov/fueleconomy/download-co2-and-fuel-economy-trends-report-1975-2016

But Americans largely don’t want small sedans. We want SUVs, and fuel efficiency! Fortunately, the vehicle efficiency standards incentivize automakers to make vehicles across all classes – including SUVs, pickup trucks, and minivans – more efficient. Because the standards do not require automakers to make only small, ultra-efficient vehicles, they prompt automakers to create innovative technologies that boost the fuel-saving performance of the larger vehicles that Americans tend to prefer.

For example, the 2017 Toyota Highlander Hybrid, a full-size SUV, gets a combined 29 miles per gallon. That’s what I average in my mid-sized 2012 Subaru Outback Sport. Not too long ago, the 2001 Highlander got only a combined 18 mpg, and the 1995 4Runner (the Highlander’s predecessor) got 13 mpg. And the standards are incentivizing automakers to develop electric vehicles: the number of electric vehicle models is growing, and several auto companies are set to release fully electric SUVs in the next several years.

By providing automakers with flexible ways to comply with the standards (aka compliance pathways), the federal vehicle efficiency program has been instrumental in giving consumers more fuel efficient choices no matter what sort of vehicle they need.

Vehicle efficiency and emissions standards are the single most important federal climate policy

I’m guessing that you care, at least tangentially, about climate change. You are reading a blog from the Union of Concerned Scientists, after all. So, you should know that the standards are set to achieve the largest reduction in global warming pollution from a single federal policy (other than the Clean Power Plan, which is mired in legal trouble and threat of repeal from the current Administration).

Transportation is one of the biggest sources of global warming pollution in the U.S., having accounted for 27 percent of emissions in 2015. Cutting emissions from transportation is challenging as our nation continues to rely on personal vehicles, and driving has been incentivized by relatively low gas prices and may be further incentivized by the introduction of autonomous driving features. 2016 saw the largest increase in national vehicle miles traveled (VMT) since regulators began tracking this data in 1971, and the trend shows no sign of slowing down. More cars were sold in 2016 than ever before, adding to the 263 million registered vehicles on American roads.

Transportation is one of the biggest sources of global warming pollution in the U.S. Source: EPA Inventory of U.S. Greenhouse Gas Emissions and Sinks: 1990–2015, Table ES-6. Available at https://www.epa.gov/ghgemissions/inventory-us-greenhouse-gas-emissions-and-sinks

That’s why – along with electric vehicles, better biofuels, and better transit options – improving the fuel efficiency of vehicles is so important. When including the emissions reductions from the finalized standards for heavy-duty vehicles, the federal fuel efficiency programs will cut emissions by an estimated 550 million tons in 2030 alone. That would be a reduction of over 3 percent of today’s transportation-related emissions and would achieve more reductions over time as the vehicle fleet turns over and gradually becomes more efficient.

How you can help protect the federal vehicle efficiency and emissions standards

UCS is leading the way on telling the EPA and Department of Transportation that consumers want to stick with the current standards. Not only are the standards cost-effective and feasible to meet, the agencies’ research showed that automakers could even exceed them. Help protect standards that are saving Americans money at the pump and reducing the risks of climate change.

Head on over to the UCS Action Center for a couple of easy actions you can take.


Protecting New Jersey From Sea Level Rise: The Future of the Meadowlands

A case study part of the report When Rising Seas Hit Home

Looking out over the vast marshy wetlands that stood between them and the Manhattan skyline shimmering in the background, Hugh Carola and Michele Langa of the conservation society Hackensack Riverkeeper called the Meadowlands the salvation for several cities and towns in New Jersey during 2012’s Hurricane Sandy.

“A buffer that literally means life and death”

On a driving tour to various shoreline spots, Langa, an attorney for the group, pointed back to two homes that still showed subtle water stains from the crest of Sandy’s floodwaters. “This is where the marsh saved this area,” she said. “The violence of Sandy’s water was absorbed by the marsh and rose slowly against the houses. If they had been hit by the same surge as the outer boroughs of New York and the Jersey Shore, they would have been gone.”

More important, according to The New York Times’s interactive mapping of the death toll, only one person died in the Hackensack River region—a 69-year-old man who reportedly stumbled and drowned after leaving his inundated car—compared to the clusters of fatalities along shorelines.

“It could have been so much worse,” said Carola, the Riverkeeper’s program director. “The wetlands are a buffer that literally means life and death.”

Michele Langa and Hugh Carola of Hackensack Riverkeeper are working to protect the wetlands, which covered 21,000 acres before World War II. Only 7,000 acres remain today.

A new challenge dawns

It is obvious what has to be done to continue to protect this area: preserve the 7,000 acres of wetlands that remain of the 21,000 acres the region had before World War II and the 14,000 acres it had as late as the 1990s. Most people know the area for the giant Meadowlands stadium complex, which plays host to the New York Giants and New York Jets professional football teams; the adjacent wetlands are a powerhouse of a different sort.

The preservation efforts of groups such as Hackensack Riverkeeper have kept the remaining marsh off-limits to development, with many benefits. Despite high levels of mercury and other heavy metals left behind by industrial activity, which have some advocates wanting the area declared a Superfund site, the cleanup efforts to date have allowed a rich array of wildlife to return to the watershed. One bald eagle pair recently fledged three chicks. Owls and osprey hunt, butterflies flutter, and a rare calliope hummingbird was spotted and photographed in a yard in Secaucus.

Black Skimmers in Hackensack.

To emphasize his point, Carola drove to a boat dock where there was a flock of black skimmers, a tern-like black and white bird with a striking red and black beak. It is listed as endangered in New Jersey. “Last year we had thousands of people kayaking and riding pontoons and had 1,000 people participating in river cleanups, from pre-K to senior citizens,” Carola said. “We had garden clubs, church groups, school groups. We think people are getting the message that the flowing waters of our nation belong to everybody.”

That message is getting through just as a new challenge dawns on the region in the form of rising seas caused by climate change. The area has begun to experience more flooding at the highest tides. “I talk to senior citizens a lot and they tell me about times that streams rose slowly during rains,” Carola said. “Now, rains bring them right up to flood stage.”

If emissions continue to rise through the end of the century, sea level is projected to rise more than 6 feet by 2100. In this scenario, the same areas of northern New Jersey and New York City that were flooded by Hurricane Sandy’s storm surge would be inundated 26 times or more per year, or every other week on average. When Rising Seas Hit Home, 2017.

Those with the least resources to cope get hit the hardest

Just as concerned as Carola about high-tide flooding is Hernan Lopez, the emergency management coordinator for Carlstadt, New Jersey. With some of its borders abutting the Meadowlands stadium complex, the borough is one of the most important towns in the New York metropolitan area that few have heard of, according to Lopez. It has 6,000 residents and 40,000 daytime workers in a wide range of industries, from chemicals to computer backup, from apparel to glassmaking.

When Sandy blew through, Lopez said the town lost a lot of trucks and equipment and the ensuing water damage temporarily shuttered about 100 businesses, affecting 5,000 jobs. On a driving tour, he pointed out the side of a produce distributor. “There were fish everywhere,” Lopez said. “It stunk for weeks.”

Many of the bigger and more financially stable businesses are back, some raising their foundations three feet and generators six feet. But several mom-and-pop operations closed for good, highlighting a common theme when disaster strikes: those with the least resources to cope get hit the hardest.

On the driving tour, Lopez pointed out the site of an apparel company owned by immigrants. Their boxes of clothes had been stored outside at ground level, and the company was ruined during Sandy, when there wasn’t enough room or time to get them inside.

“I talk to senior citizens a lot and they tell me about [past] times that streams rose slowly during rains,” Carola said. “Now, rains bring them right up to flood stage.” Tidal flooding in Carlstadt.

Lopez said he notices on an anecdotal level that water lingers longer in the streets after heavy rains. A new tidal gate was christened in 2014, but Lopez said it takes more than a gate to hold back the water. He drove to a location where the backyard of several businesses is the marsh.

“One concern of mine is with the water getting higher, it makes it more important to stop illegal dumping in our marshes and waterways,” he said. “The debris clogs the drains and makes the water build up.”

Because of Sandy, the federal government awarded the Meadowlands area a $150 million grant to devise plans to reduce the risk of floods and improve storm-water drainage. There are several proposals currently being debated, including one to attempt to wall out the water. Groups such as Hackensack Riverkeeper are lobbying for solutions that focus on making the wetlands resilient rather than resistant, by improving the drainage ability of the wetlands and increasing open space and green infrastructure.

While the region’s uniformly low elevation presents a major long-term challenge in the face of sea level rise, having more open space could be critical to the future health of the wetlands. Wetlands that have the space to migrate may keep pace with sea level rise rather than be drowned by rising waters.

“We’re totally focused on natural ways to deal with the future,” Carola said. “It was nature that saved us from the worst effects of Sandy.”

Hurricane Harvey Magnifies Climate and Petrochemical Toxic Risks for Environmental Justice Communities in Houston

The ExxonMobil refinery in Baytown, TX. Photo: Roy Luck/CC BY 2.0 (Flickr)

The last few days have been difficult for people in Houston. They have been difficult for everybody, but environmental justice communities like Manchester and Galena Park in eastern Houston now face a double whammy of climate change and toxic contamination. Adding to the toll of human suffering, death, loss of livelihoods, and dislocation that Hurricane Harvey is leaving in its wake, environmental justice communities in Houston now face threats from the many petrochemical facilities that already expose them to acute and chronic air toxics emissions. Over the last few days, members of the community have been sending out reports of (worse than usual) smells and itchy throats and eyes. And there are already reports of chemical spills in the area.

In preparation for Hurricane Harvey, many petrochemical companies performed controlled shutdowns of many facilities, releasing large amounts of contaminants into the air, including benzene, toluene, carbon monoxide, butane, nitrogen oxide, ethylene, and other toxics. In other cases, tank rooftops and other structures collapsed due to the heavy rains. Many of these acute air pollution events occurred along the Pasadena Freeway, where many of the most vulnerable populations in Houston live (but many other shutdowns and malfunctions were reported in Goldsmith and Old Ocean).

The map below shows facility incidents reported to the Texas Commission on Environmental Quality’s Air Emissions Event Reporting database for August 20-29, 2017. The incidents in this map were due to planned shutdowns and other contingencies related to Hurricane Harvey. For example, a roof in the Exxon Mobil refinery in Baytown collapsed due to excess rain, and the company estimated that around 12,000 pounds of volatile organic compounds (VOCs) were emitted into the air.

If this wasn’t already a dire set of hazardous circumstances, threats such as those from the likely explosion of the Arkema plant, which manufactures and stores highly volatile organic peroxides, endanger the lives of Houston area residents. It’s already well known that low-income communities of color are more likely to live near facilities that process or store highly hazardous chemicals. It’s no wonder, then, that environmental justice advocates have correctly characterized the Hurricane Harvey disaster as the outcome of extreme weather fueled by climate change, failed chemical safety policies, and hazardous petrochemical facilities that have left Houston residents swimming in a “toxic soup.”

Which raises the question: why are there no protections for these communities? It’s not due to lack of information or knowledge about these threats. The evidence is clear that climate change is making storms stronger. It’s also clear that low-income communities of color in Houston and elsewhere are more exposed to toxics from the petrochemical industry. We know because the Union of Concerned Scientists partnered with the environmental justice community in Houston to document the disproportionate burden of toxics in their communities.

EPA scientists also know how industries that manufacture, process, or store dangerous chemicals can be more transparent in publicly sharing information on chemical inventories, and how they can improve accident prevention plans, data management, and federal coordination. But the Trump administration continues to sideline federal science by delaying the implementation of amendments to the Risk Management Program (RMP) Rule.

The Arkema facility is a case in point, as it is one of many facilities that under the now-delayed RMP amendments would have had to research safer technologies and chemicals to limit the effect of a disaster or explosion on surrounding communities and workers.  The RMP rule would have also required this facility to begin coordinating with local emergency responders to make sure they have necessary information before heading into a volatile facility.

It is long past time for communities near industrial facilities to receive the justice they deserve. Industry must be held responsible for its actions, and too often its inaction. And governments at all levels, from local to federal, must ensure that all companies follow the rules and are held accountable for the damage they do, not just when a storm occurs, but every day. No more excuses.

Want to help?

Donate directly to Texas Environmental Justice Advocacy Services (t.e.j.a.s.), an environmental justice organization in Houston on the front lines of Hurricane Harvey, toxics, and climate change.

What’s the Connection Between Climate Change and Hurricane Harvey?

First responders, neighbors, volunteers, city, county, state, national and international institutions, businesses, and people around the world are heeding the call to help save lives and provide resources for keeping people safe throughout the onslaught of Hurricane Harvey and its aftermath. Even as the storm slipped back offshore of Texas, spun like a pinwheel back over Louisiana, and now moves farther inland on a northeast trajectory, questions are already being asked:

  • Is this storm unprecedented?
  • Are there telltale signs of climate change?

Playing baseball with lead in the bat

Before jumping into noteworthy aspects of Hurricane Harvey, an analogy I heard from NPR science reporter Christopher Joyce comes to mind.

“A scientist once told me about climate and weather, said it’s kind of like playing baseball with a bat with lead in it. You know, you’re going to go out there, and you’re going to hit foul balls, and you’re going to hit grounders, and you’re going to strike out. But every once in a while, you hit that sweet spot with [sic] that leaded bat, and it’s not just going to knock the ball into the stands. It’s going to knock it out of the park.”

The analogy gets at the “fat tail” aspect of extreme events. A scorching heat wave or a torrential rain is more likely now than before, once the average background conditions shift a little (see Fig 1-08).

Hot air and rain intensity

Scientists know that warmer air holds more moisture. That moisture can be taken up by storms, resulting in extreme rainfall events. We have seen evidence of this in the increased volume of water dumped during the most intense downpours of a year in the continental US.
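The “warmer air holds more moisture” relationship can be made quantitative: saturation vapor pressure grows roughly exponentially with temperature, at about 6–7 percent per degree Celsius. As a rough illustration (using the standard Magnus approximation, which is an assumption on my part and not a formula from this article):

```python
import math

def saturation_vapor_pressure(temp_c):
    """Approximate saturation vapor pressure (hPa) over water,
    via the Magnus formula."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

# How much more water vapor can air hold per extra degree Celsius?
e_25 = saturation_vapor_pressure(25.0)   # ~31.6 hPa
e_26 = saturation_vapor_pressure(26.0)
increase = (e_26 / e_25 - 1) * 100       # ~6 percent per degree C
print(f"~{increase:.1f}% more water vapor per 1 degree C of warming")
```

That few-percent-per-degree increase in available moisture is one reason modest background warming can translate into noticeably heavier downpours.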

Hot oceans and storm power

A small shift in average conditions can make a big difference in the severity of rare extreme temperature and precipitation weather events. Source: Intergovernmental Panel on Climate Change Fifth Assessment Report, Working Group 2, Figure 1-08.

This heavy precipitation can be further fueled as climate change warms the oceans. Hurricane models that compare the past three decades with further climate change (RCP 4.5) toward the end of the century show increases in average hurricane intensity, in precipitation rates, and in the number and occurrence of days with intense category 4 and 5 storms.

How does it work? If a tropical depression forms and conditions prove favorable for it to grow into a hurricane, warmer seas can increase the power of the storm, primarily through evaporation of the hot seawater and other processes. When conditions get too hot in the tropics, the ocean tends to shed that excess heat away from the tropical surface ocean as fast as possible. It is as if the tropical ocean ‘sweats’ in the summer, and tropical depressions, storms, and hurricanes are dramatic ways to transfer that excess heat away from the tropics. We saw a similar scenario with Hurricane Harvey, which quickly gained intensity, moving from a tropical depression to a category 4 hurricane, as it passed over the unusually warm Gulf waters in the days before it made landfall.

Sea level rise increasing damages

Local sea level can influence flooding in several ways. Tropical storms and hurricanes can blow so hard that they literally pile water up onto the shore. Storm surges today are more hazardous in low-lying coastal regions than the same storm surge would have been a century ago. Due in large part to climate change, seas are on average 8 inches higher than they were in 1880. Galveston, Texas, experienced around a foot of sea level rise in just the last 50 years due to the combination of climate change and sinking coastal land from groundwater pumping, oil and gas extraction, and natural causes.
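Those two figures imply very different rates of rise. As a back-of-the-envelope comparison (treating 2017 as the endpoint of the “since 1880” figure, which is an assumption for illustration):

```python
INCH_TO_MM = 25.4

# Global average: ~8 inches of sea level rise since 1880
global_rate = 8 * INCH_TO_MM / (2017 - 1880)   # mm per year

# Galveston, TX: ~1 foot (12 inches) in just the last 50 years
galveston_rate = 12 * INCH_TO_MM / 50          # mm per year

print(f"Global average: ~{global_rate:.1f} mm/yr")
print(f"Galveston:      ~{galveston_rate:.1f} mm/yr")
```

The local rate at Galveston works out to roughly four times the long-term global average, which is what you would expect when land subsidence compounds climate-driven sea level rise.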

In addition to storm surge, intense hurricane precipitation can swell the rivers that are draining into higher seas during a storm, which can back up and cause a dangerous situation as rivers and reservoirs overtop their banks and flood adjoining land, something we observed during Hurricane Harvey.

Hurricane Harvey: an epic event

All of these factors were in play in Hurricane Harvey.

Data are being collected for investigators to tease apart the relative contributions of the various factors likely to have influenced Hurricane Harvey. Typically, extreme weather scientists investigate the weather and climate change factors for hazards. If better data are available on the infrastructure, land use decisions, historic inequities, and other factors that influence exposure and vulnerability to a weather hazard, these can be incorporated into more comprehensive investigations of impacts and risk (see AR5 WG2 Figure SPM-1).

As we move into recovery and rebuilding, it will be important to understand all the major factors so Houston can rebuild in a way that better protects lives and property.

Image: modified Wikimedia Commons version of IPCC AR5 WG2 Fig 1-08.

As Arkema Plant Burns, Six Things We Know About Petrochemical Risks in the Wake of Harvey

As Harvey continues to wreak havoc in the Southeast, one issue is starting to emerge as a growing threat to public health and safety: Houston’s vast oil, gas, and chemical production landscape. We’ve already seen accidental releases of chemicals at facilities owned by ExxonMobil, Chevron, and others. Now we are seeing explosions at Arkema’s Crosby facility 20 miles northeast of Houston, due to power failures and flooding. And there remains a threat of additional explosions.

There is no reason to believe the Crosby facility is the only one at risk of chemical disasters right now. The coast of southeast Texas and Louisiana has a whole lot of petrochemical production—infrastructure that was exactly in the path of Hurricane Harvey and continues to be hit by its remnants. I’ve studied (and been worried about) chemical safety, sea level rise, and storm surge risk to oil and gas infrastructure in the Gulf for several years, and many of those fears are now playing out. Here are some things we know about petrochemical production in the Gulf, its storm risks, who’s impacted, and who’s responsible.

1. Arkema’s Crosby Plant explosion risk shows the danger of President Trump’s chemical disaster policy.

Back in June the Trump administration delayed the EPA Risk Management Plan rule until February 2019. Finalized in January 2017, the new EPA RMP rule is designed to enhance chemical risk disclosure from companies, improve access to risk information for emergency responders and the public, and push companies to consider safer alternatives and preventative measures. Yet, the rule is now delayed more than a year, thus delaying the time when companies will be preparing for and disclosing details about their chemical risks and safety measures.

Today’s explosions at Arkema are exactly the reason that chemical safety policies are so important. We are witnessing in real time the confusion that results from limited access to chemical safety information in an emergency situation. As the situation at the Crosby facility (Tier 2 under the RMP) continues to unfold, the media, decisionmakers, law enforcement, and the public are scrambling to figure out what chemicals are on site and what public safety risks remain for nearby communities and first responders. The facility’s current risk disclosure is limited in detail. This shows that the improved RMP rule is sorely needed. The rule’s needless delay is putting public safety at risk.

2. Oil refineries are routinely damaged in major storms, and climate change makes that worse.

You know how people tell you to buy gas when there’s a storm in the Gulf? The advice isn’t unfounded. A significant portion of US oil refining capacity is in the Gulf of Mexico, and refining operations are shut down days in advance of storms. At least 22% of US refining capacity is offline right now thanks to Harvey. Such shutdowns disrupt major supply chains and affect gas prices across the country for weeks or months afterward. Refineries are especially vulnerable to storm impacts, with 120 oil and gas facilities within ten feet of the high tide line. Oil and other chemical spillage is now expected during major storms. When Superstorm Sandy struck in 2012, for example, Phillips 66’s Bayway refinery in Linden, NJ, spilled 7,800 gallons of oil.

Under a changing climate, this vulnerability worsens for a few reasons. The Gulf Coast faces rates of sea level rise that are among the highest in the world, partly because segments of land in the region are subsiding. This means storms ride in on higher seas, able to worsen storm surge levels. Further, thanks to warmer sea surface temperatures, climate change is expected to increase Atlantic hurricane intensity, meaning the storms that do form could be more damaging, with higher winds and greater rainfall totals. Yet, we’ve seen the Trump administration repeatedly roll back, walk back and revoke policies in place to help us mitigate and manage these types of risks.

3. Refinery spills and leaks harm surrounding communities.

After Murphy Oil’s Meraux refinery spilled 25,000 barrels of oil during Hurricane Katrina, more than a square mile of neighborhood was contaminated and Murphy Oil had to pay $330 million in settlements. Photo: UCS/Jean Sideris

What makes the above points more concerning is the fact that people live close by. Like really close. Thanks to a lack of zoning laws and environmental racism, people in Houston live at an eyebrow-raising proximity to oil and gas infrastructure.

Take the Manchester Community, for example, sandwiched between the major roadways, a rail yard, and oil and gas infrastructure. A report from the Union of Concerned Scientists and Texas Environmental Justice Advocacy Services (T.e.j.a.s) last year found that low-income communities and communities of color face disproportionate impacts in Houston when it comes to chemical safety risk and exposure to harmful industrial emissions. The report found large disparities between Houston communities in terms of overall toxicity levels from chemical exposures, showing that toxicity levels from exposures in Manchester are more than three times higher than in more affluent and white Houston communities.

Low-income communities and communities of color are routinely exposed to adverse impacts from refinery emissions, and in storms they are also likely to bear the brunt of the impacts, with less money and attention spent on their recovery. We need to make sure that communities most impacted by this tragedy get a fair share of the recovery efforts. If you’d like to donate in a way that supports these communities, a list of organizations is here. And if you’d like to support T.e.j.a.s. and the Manchester community, you can do so here.

On top of this, we know that chemical spills are likely to be a more lasting and problematic element to the recovery than the water alone. Cleaning toxic chemicals out of houses is no easy or cheap task. After Katrina, for example, damaged tanks at Murphy Oil’s Meraux refinery spilled 25,000 barrels of oil, covering over a square mile of neighborhood and contaminating 1,700 homes. Murphy Oil paid $330 million to settle 6,200 claims, buy contaminated property, and perform cleanups. One graffiti sign in a nearby neighborhood read “Damaged by Katrina, Ruined by Murphy Oil, USA.”

4. Refineries are regulated, but that hasn’t been enough.

Where there are refineries, there’s exposure to toxic chemicals. Refineries are of course subject to regulations designed to protect people from harmful levels of pollution; however, disasters like Harvey show the inadequacy of those regulations. Refineries often have spikes in emissions that can expose nearby communities to unsafe levels of pollution, and that happens on a normal day. During a storm event with flooding or wind impacts, these chemical releases are more likely. First, many refineries shut down in advance of approaching storms. This sounds positive: emissions will stop, right? In fact, this can mean (and has meant, in the case of Harvey) a lot of flaring that releases chemicals into the air.

On top of this, the state has temporarily shut down its air monitoring, leaving it to companies to self-report to nearby communities, a situation that likely means communities have limited access to air quality information. Ramping up refineries after a shutdown also comes with risks of explosions and unsafe emissions. The Chemical Safety Board, which monitors and advises industry on chemical safety issues, has urged companies to exercise caution and follow protocol closely when starting up refining operations after the storm, given the higher accident risks during startup. The Chemical Safety Board, by the way, was on the chopping block in the President’s recent budget. Without it, an investigation into what happened at Arkema’s Crosby plant and how to prevent it in the future will never happen.

5. Companies should be held responsible for any spills that happened.

Public companies are required to report and prepare for any material risks their businesses face, including extreme weather and climate change. Companies don’t always do a good job of disclosing this information, however. As a result, the public knows little about what risks companies face or what they are doing to mitigate them.

Indeed, investors, communities, and public interest groups have long been demanding that companies do a better job of disclosing such risks. In 2015, a group of investors and the Union of Concerned Scientists asked Exxon Mobil, Chevron, Marathon Petroleum, Phillips 66, and Valero to improve their disclosure of climate and storm risks. Some responded, but since then we’ve only seen modest improvements in company disclosure of such risks at their refineries.

In Exxon Mobil’s 2016 risk disclosures, for example, the company wrote that its facilities are “designed, constructed, and operated to withstand a variety of extreme climatic and other conditions, with safety factors built in to cover a number of engineering uncertainties, including those associated with wave, wind, and current intensity, marine ice flow patterns, permafrost stability, storm surge magnitude, temperature extremes, extreme rain fall events, and earthquakes.” Yet we learn few other details about what conditions the company is planning for and what preparations are in place to mitigate those risks. The irony, of course, is that these companies are dealing with the consequences of climate change even as they continue to contribute to it through the emissions their products generate.

6. None of this is new.

In many ways, Harvey is unprecedented. Its levels of rain boggle the minds of even the most imaginative meteorologists. But we had all the information we needed to prepare for this event. The hurricane was forecast far in advance. We knew that climate change stands to make such events worse. We knew how to prepare communities and infrastructure for major storms. We knew why we shouldn’t build infrastructure in floodplains, and why infrastructure should be designed to withstand wind and water risks.

Yet we live in a world where our president revokes policies that ensure our infrastructure is storm ready, where climate mitigation efforts have stagnated, and where disaster relief efforts often don’t reach those who need them most.

We must do better. We must remember the tragedy that is unfolding before us right now and use it to advance policies that ensure that future events like this are less likely and less damaging when they do happen. Our nation’s future depends on it.

Global Solutions Start at Home

Electric vehicle charging stations line the perimeter of San Francisco's City Hall. Photo: Bigstock.

“Think globally, act locally.”

I first heard this phrase as a child who had just learned about Earth Day at school. To my 11-year-old self, it felt empowering; I could help the environment by recycling and conserving water. While the idea of taking action to solve pressing problems continued to inspire me into adulthood, I’ve only recently come to fully appreciate the importance of local action.

Keep Hayward Clean & Green Task Force. Photo credit: City of Hayward

After I had spent several years in Washington, D.C., first as a legislative adviser in Congress and then as an analyst in the White House, life brought me to California, where I now work as a technologist. Missing my connection to government, I successfully applied for an appointment to a standing task force in my city, Hayward, California. During this time, part of my technology work has included developing data-enabled solutions for distributed renewables such as rooftop solar and on-site batteries. While I left the federal government believing I was leaving behind the ability to significantly impact policy issues that mattered to me, the intersection of my professional and community work has shown me the importance of local government engagement.

Reflecting on my local engagement, two themes have emerged:

Local governments play a critical role in science-related policy issues, including those with global implications.

We often think of the most pressing science-related policy issues, such as climate change policy, as being national (or global) in nature. While many important policy decisions are made at the national level, local government can also play a significant role both as a testbed for new policy ideas and as the implementation arm for high-level policies.

An example of this is vehicle electrification, a solution advocated by the Union of Concerned Scientists (UCS) as part of a broader environmental and climate change mitigation strategy. At the national level, the Obama Administration set goals for expanding the number of electric vehicles (EVs) on the road. At the same time, states like California have been pioneering efforts to reduce emissions and encourage vehicle electrification. Municipal and regional governments provide the critical “last mile” for a comprehensive policy strategy. At this level, government policies can be as diverse as switching municipal fleets to EVs, ensuring the availability of charging stations in public garages, incentivizing or requiring EV-charging access in building codes, or implementing special permitting fees for EV chargers. To influence these crucial “last mile” policies, you must look to your state houses and city halls instead of to Washington.

There are few resources to help people, particularly engineering & science professionals, who want to get involved in their local communities…but we are trying to change that.

Once you appreciate the importance of local engagement, you may find yourself wondering where to start. While scientific professional societies have provided a conduit for informing engineers & scientists about national level policy issues for decades, fewer resources exist for helping people understand the issues and get involved in local (city, state, or regional) government. While a limited number of state-level fellowships provide the opportunity for a small number of engineers & scientists to work in state government, there are even fewer municipal programs.

You can, instead, create your own engagement opportunities. Visit your city’s website to read your general plan or learn about initiatives, attend city council meetings, request meetings with your city representatives—who are usually happy to meet their constituents—and apply for a board or commission. The latter is a particularly impactful, though too often overlooked, way to engage. Through my own task force involvement, I have had the opportunity to meet with leaders in my community and learn about issues ranging from gang prevention to compliance with Environmental Protection Agency regulations.

Despite its possibilities, local involvement seems to be the exception for people who are interested in science-related policy. Most of the scientists I know are unaware of how they can become involved locally and don’t realize they can have an impact. For this reason, I founded Engineers & Scientists Acting Locally (ESAL).

ESAL is a non-partisan, non-advocacy organization dedicated to helping engineers & scientists increase their engagement in their city, state, and regional governments and communities. We are currently assessing interests and engagement levels of engineers and scientists. If you are a scientist or engineer, please share your interests and experiences with us through this survey.

The work of organizations like UCS helps engineers, scientists, and members of the broader public understand the critical role that science and technology plays across policy issues. This awareness has made technically informed discussions an integral part of policy formulation at the federal level. Local governments also grapple with important science-related issues. By getting involved as an engaged citizen, advocate, and volunteer in your local community, you can help shape local policies that align with global solutions.


Arti Garg is the Founder and Chair of Engineers & Scientists Acting Locally (ESAL). She is a data scientist who specializes in industrial and internet of things (IoT) applications. Previously, she worked in the White House Budget Office overseeing a $5 billion portfolio of research and development investments at the Department of Energy. She also served as an American Physical Society-sponsored science policy fellow with the House Foreign Affairs Committee. She was appointed to the Keep Hayward Clean and Green Task Force in 2015. She holds a PhD in Physics from Harvard University and an MS in Aerospace Engineering from Stanford University.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Pruitt’s EPA Undermines Cellulosic Biofuels and Transparency in Government

As the New York Times recently reported, EPA Administrator Pruitt has been conducting much of his work to undermine the EPA’s mission in secret. The recent proposed rule implementing the Renewable Fuels Standard (RFS) provides yet another example of the Pruitt EPA making decisions that should be based on evidence and substituting its political preferences instead. The RFS is a complex and controversial regulation currently in a rulemaking process that will put the Pruitt EPA’s first official stamp on biofuels policy. Pruitt has made no secret of his distaste for the regulation or of his inclination to reduce burdens of any type on the oil industry, and his first efforts at RFS implementation show how he puts that distaste into effect.

One of the striking elements of this year’s proposal was that the administration proposed that the standards for cellulosic biofuels in 2018 would be lower than those for 2017.  The cellulosic biofuel standards are a small part of the overall proposal, but the success of cellulosic biofuels is key to delivering better biofuels, as I described in a chapter of my recent report on Fueling a Clean Transportation Future.

Moreover, a declining standard is an odd reversal, given the clear intent of the RFS to promote the growth of cellulosic biofuels. Puzzled at how this conclusion was justified, my colleague Alyssa Tsuchiya and I looked through the drafts and other documents that EPA posted in the docket to figure it out.

Digging through the docket

What we found is that the EPA’s original proposal, transmitted to OMB on May 10th, called for growth of the cellulosic standard: 384 million gallons, based on the same methodology that had been in place for the previous several years. This was still the target in the June 14th draft. But a new version dated June 23rd, less than two weeks before Administrator Pruitt signed the final proposal, suddenly reversed course, with a new methodology that would set the cellulosic standard for 2018 at 228 million gallons, a reduction compared to 2017. This might seem like a small change against an overall renewable fuels standard of more than 19 billion gallons, but it’s very consequential to companies that have invested in cellulosic biofuel production on the understanding that policy decisions would reflect the evidence of industry growth.

We found arguments in earlier drafts that show that EPA staff knew perfectly well that the new methodology would be inaccurate and underrepresent cellulosic production.  But someone intervening in the OMB review process decided to lower the targets anyway.  Key passages of the argument defending the existing methodology were deleted, without seeking public comment on the points they raised.  I found no record of who gave the order to make this change, but responsibility lies with Administrator Pruitt, who signed the proposal. The key changes involve the assessment of how much cellulosic ethanol new facilities were likely to produce in 2018, and how much biogas would be used for transportation in 2018.

Cellulosic ethanol

The slow startup of the cellulosic ethanol industry has been the subject of much discussion and press over recent years, but one part of cellulosic scale-up has been an unexpected success.  The conversion of corn kernel fiber into ethanol at existing corn ethanol facilities has been successfully implemented at Quad Counties Corn Processors, which reached a million gallons of cumulative production in 2015, and five million in 2016.  Now this and related technologies have been licensed and are being adopted at additional facilities across the country, for example at two Pacific Ethanol facilities in Stockton and Madera California.

However, last-minute edits to the proposed RFS rule delete references to the success of this technology and instead cherry-pick pessimistic data about the startup of unrelated cellulosic biofuel technologies as the basis for assuming that facilities adopting the corn kernel fiber technology will almost entirely fail to deliver meaningful production (specifically, that they will produce at the first percentile of a distribution of probable production volumes).

From page 30 of the redlined June 23rd draft of the proposal, the following passage has been deleted.

Additionally, when reviewing the cellulosic biofuel production data from the final three months of 2015 and all of 2016 we find that facilities that convert corn kernel fiber to cellulosic ethanol at existing ethanol production facilities have generally over performed relative to our production estimates, while large stand-alone cellulosic biofuel production facilities have generally under performed. In 2018 we anticipate that the majority of the liquid cellulosic biofuel production will be from facilities converting corn kernel fiber to cellulosic ethanol at existing ethanol production facilities. We therefore believe it is prudent to continue to use our existing projection methodology rather than to adopt a new methodology that would result in lower production estimates as doing so could result in inappropriately low production projections for a commercially successful technology (corn kernel fiber conversion) based on historic scale-up difficulties at facilities using a largely unrelated technology. We believe that it is likely that, on a relative basis, the accuracy of our projection for 2018, using the same general methodology as in previous years, will increase as the overall production of cellulosic biofuel increases, and the proportion cellulosic biofuel expected to be produced using technologies that are currently being used to successfully produce cellulosic biofuel on a commercial scale increases. [Footnote 51]

 [Footnote 51] 89 percent of all expected cellulosic biofuel production in 2018 is expected to come from CNG/LNG derived from biogas and corn kernel fiber conversion technologies. Both these technologies have been successfully used to produce consistent volumes of cellulosic biofuel since 2014.  

In place of this defense of the existing methodology is a new argument: since new facilities starting up in 2016 substantially underperformed the expectations EPA set for them, it is logical to assume that new facilities starting up in 2018 will underperform by the same amount. The proposal assumes that new facilities with the potential to produce more than 100 million gallons in 2018 will instead produce 1 million gallons. This would be a questionable assumption for any part of the cellulosic biofuel industry, especially given ongoing industry learning between 2016 and 2018, but it is especially egregious as an assumption for the corn kernel fiber technology, which has built a very successful track record.

In the 2016 standard, which is cited as the basis for this decision, more than 80% of the liquid cellulosic biofuel from new facilities was projected to come from three large stand-alone facilities converting corn stover to ethanol, while in the 2018 proposal more than 75% of the projected cellulosic production from new facilities comes from corn kernel fiber technology. This technology is being introduced into existing ethanol facilities that have a track record of production, so the technical hurdles are much lower. It makes no sense to discount their potential based on the struggles of three large stand-alone facilities using unrelated technologies, as another deleted passage from page 31 of the redlined June 23rd draft explains.

We do not believe it would be reasonable to establish a methodology where the success or failure of a small group of companies, and in some cases a single company, would have a dramatic impact on the methodology used to project volumes from other companies the following year, especially where the methodology overall has been demonstrably successful.


Biogas

The other rapidly growing part of the non-food-based cellulosic biofuel category is biogas: methane captured from landfills and other sources and used to replace fossil natural gas as a transportation fuel in heavy-duty vehicles. For more on biogas, also called biomethane or renewable natural gas, see our recent fact sheet on The Promises and Limits of Biomethane as a Transportation Fuel.

The June 14th proposal projected that biogas would provide 341 million gallons in 2018, based on solid performance (93% of capacity) from consistent existing producers with capacity of 265 million gallons equivalent, and on new producers operating at about 50% of their 189 million gallons equivalent of production capacity. In place of this assessment, a new methodology is proposed that ignores detailed consideration of new production capacity and instead uses growth between the first five months of 2016 and the first five months of 2017 to predict a rate of growth for the industry as a whole. This yields a projection of 9.3% annual growth and a proposal of 221 million gallons equivalent, 35% lower than the earlier draft’s proposal based on the existing methodology.
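The gap between the two methodologies can be checked with back-of-the-envelope arithmetic using the volumes quoted above. This is a minimal sketch, not EPA's actual model; the 2017 base volume for the growth-rate method is back-calculated from the 221 million gallon figure and is an assumption.

```python
# All volumes in million ethanol-equivalent gallons, taken from the
# draft proposals discussed above.

# Capacity-based method (June 14th draft): count existing and new
# capacity explicitly, at different utilization rates.
existing_capacity = 265       # consistent existing producers
new_capacity = 189            # producers coming online
existing_utilization = 0.93
new_utilization = 0.50

capacity_based = (existing_capacity * existing_utilization
                  + new_capacity * new_utilization)
print(round(capacity_based))  # prints 341, matching the June 14th draft

# Growth-rate method (June 23rd draft): extrapolate one year-over-year
# growth rate, ignoring new capacity. The base volume is an assumption
# back-calculated from the published 221 million gallon proposal.
growth_rate = 0.093
base_2017 = 221 / (1 + growth_rate)
growth_based = base_2017 * (1 + growth_rate)
print(round(growth_based))    # prints 221

# How far below the capacity-based projection the new method lands:
shortfall_pct = (capacity_based - growth_based) / capacity_based * 100
print(round(shortfall_pct))   # prints 35
```

The design choice is the whole story here: the capacity-based method counts new capacity explicitly, while the growth-rate method lets a single year-over-year ratio stand in for it, which is why the two diverge by roughly a third.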

Based on the existing methodology, EPA staff found that new production capacity would increase potential production by 40%, but the new proposal effectively assumes that this new capacity will not be required, and that actual usage will be well below what existing facilities could manage.

Assuming investment in cellulosic fuels will fail

Consistent with the New York Times story, I found no record of who instructed EPA staff to make these changes, or of their motivation. But given the clear antipathy of Administrator Pruitt and the oil industry toward the Renewable Fuels Standard, it is not surprising to see the Pruitt EPA act to undermine the component of the renewable fuels industry most important to future growth. It would be nice to think that time will prove this pessimistic perspective wrong, but there is a very real risk that the pessimism embodied in the administration of the RFS becomes a self-fulfilling prophecy.

The RFS is often described as requiring oil companies to buy ethanol, but the actual mechanics of the policy involve tradable credits, called Renewable Identification Numbers.  Unrealistically pessimistic estimates of next year’s production ensure that credit buyers (oil companies) will have the upper hand in negotiations with credit sellers (cellulosic biofuel companies).

Artificially depressed prices for these credits will make further investment in this sector less appealing.  Moreover, the Administrator announced in the proposal his intention to use the reset provisions of the RFS to revise all the future standards, through at least 2022, and potentially beyond.  I have argued for several years that updated targets that are ambitious but realistic are sorely needed.  But if the administrator puts his thumb on the scale in favor of the oil industry, the reset could undermine the support the RFS provides for cellulosic biofuels for years to come, in clear contradiction of the stated goals of the RFS.

Important changes in policy should be explained to the public

The RFS is a complex and controversial policy that is challenging for EPA to administer under the best of circumstances. The stakes are high for the oil industry, the biofuel industry, and the environment. In this context, it is more important than ever to explain the rationale for changes in direction. Instead, Pruitt’s EPA is operating without proper transparency, or even the simple expectation that you need to show your work.

In a recent radio interview in Iowa, Administrator Pruitt said that the RFS rule should be based on objective criteria, not “blue-sky thinking.” But the proposal ignores objective criteria and relies on distorted logic to forecast thunderstorms, despite the clear evidence of progress on cellulosic fuels.

The proposal ignores the proven track record and investment in corn kernel fiber ethanol and biogas, and relies instead on slippery math to lower the targets for cellulosic biofuels.  Energy Dominance, as the Trump Administration defines it, ignores the clear progress of renewable fuels and instead doubles down on the failed fossil fuels of the last century.  Administrator Pruitt often talks about following the law, but the proposal just throws out inconvenient evidence and substitutes the answer the administrator prefers.  Administrator Pruitt’s job is to implement the law as Congress wrote it, regardless of his preferences or those of his friends in the oil industry.

Halting a National Academy of Sciences Study Is Unacceptable

I have been a participant in National Academy studies as a scientist, a recipient of their advice as a federal agency manager at NOAA, and involved in setting up studies as a board member. Last week was the first time that I have ever seen a study in progress halted.

On August 18, the Department of Interior halted a study under the auspices of the National Academies of Sciences, Engineering and Medicine, Division of Earth and Life Studies, entitled, “Potential Human Health Effects of Surface Coal Mining Operations in Central Appalachia.” The study was originally requested by states in Appalachia concerned about the health of their citizens. The agreed terms of reference for the study are here.

The National Academies of Sciences, Engineering, and Medicine, first established by President Lincoln, are the nation’s premier institutions for advising our government and our people on the state of scientific evidence concerning key issues facing the nation. Members are elected to the Academies by their peers based on their deep experience and accomplishments across a wide range of fields in science, engineering, and medicine.

Academy studies are commissioned to bring together, review and evaluate the scientific evidence on critical, and often controversial issues. The panel of scientists who perform a study are drawn from the broader scientific community. Each study is overseen by one or more academy members as well as professional staff to ensure it is of the highest quality. And the first step in every study I am aware of is an open discussion of possible conflicts of interest on the study panel to ensure transparency and independence.

While reviewing its grants, the Department of Interior decided to halt the study due to the “changing budget situation.” Given that the budget for the current fiscal year is set under the continuing resolution agreed in May, and the situation hasn’t changed, that seems an odd reason to quash this important work. It is also noteworthy that one of the first rules overturned by Congress and the Trump Administration using the Congressional Review Act was the so-called Stream Protection Rule enacted under President Obama. The rule was overturned despite clear scientific evidence that disposing of mine waste in streams alters their chemical and biological characteristics. But while agencies must base rules on scientific analysis, Congress is not required to do so. Its decision was purely political, made at the request of the mining industry.

By halting the National Academy study, the Department seems to be saying that it is not worth even looking at possible public health impacts of surface mining operations. A representative of an industry group, the Virginia Coal and Energy Alliance, was quoted as saying, “We feel our health problems are the result of heredity and our poor personal habits and choices, not the industry….” He may well feel that way, but science is about evidence, not feeling.

The whole point of a National Academy study is to better understand the scientific evidence concerning critical issues such as public health impacts of a given type of activity. NAS Study reports are designed to advise policymakers as to what the science says about controversial issues. That advice is often used to make better science-based policies to protect public health and safety.

The marginal costs are small for such an in-depth review of the science. Study panel members are not compensated, and they bring enormous expertise to questions like this. Halting a study charged with identifying the direct and indirect public health impacts of surface mining implies that we don’t want to know the answer, but people’s lives and quality of life are literally at stake. Members of the Appalachian communities in Kentucky agree and say they want the results. “Science isn’t going to hurt us. What we don’t know very well could,” said Dee Davis, president of the Center for Rural Strategies in Whitesburg, KY.

Scientists and community members in Tennessee, Kentucky, Virginia, and West Virginia can reach out to their members of Congress and urge them to tell Secretary Zinke to reinstate the study. But this is an issue not just for Appalachia and its residents, but for all of us. We cannot silently sit by while critical studies of public health are shelved because the answer might be inconvenient to one or more political interests. Secretary of Interior Zinke should immediately release the funds and let the study proceed.

Warehouses As an Environmental Justice Issue

Photo: Atomic Hot Links/CC BY-NC-ND 2.0 (Flickr)

When we think of locally undesirable land uses, we often think of large power plants puffing single plumes of pollution. But the many plumes from trucks traveling to and from warehouses can have equally large impacts on health. Forty percent of US imports enter through the ports of Los Angeles and Long Beach. Trucks make frequent trips to deliver those goods to warehouses, and then to move them from those facilities on to customers. In the era of e-commerce, high demand for express deliveries further fuels the massive expansion of the warehousing industry.

As an Angeleno commuter, I am struck by the number of giant warehousing facilities that have sprung up in the suburbs along Interstate 10 on my drive to work. But what do these facilities bring to our communities besides consumer goods?

The significant expansion of the warehousing industry

Figure 1 Percentage changes compared to the Year 2003 in the number of establishments in selected industry sectors (Data sources: County Business Pattern 2003-2015)

Over the last decade or so, the warehousing industry has expanded substantially, especially compared to other industry sectors. In the Los Angeles metropolitan area, the number of warehouses and storage facilities increased by 21% between 2003 and 2015 (see Figure 1). During the same period, the construction sector grew by just 9%, wholesale and retail remained essentially flat, and the manufacturing sector plunged by 23%. While these traditional sectors stagnated, the warehousing industry prospered.
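The Figure 1 percentages are simple relative changes in establishment counts against the 2003 baseline. A sketch of the calculation is below; the counts are hypothetical stand-ins (County Business Patterns reports the real ones), and only the resulting percentages mirror those cited above.

```python
def pct_change(base: int, current: int) -> int:
    """Percent change in establishment counts relative to the 2003 baseline."""
    return round((current - base) / base * 100)

# Hypothetical establishment counts, chosen only so the percent changes
# match the figures reported in the post (21%, 9%, -23%).
counts_2003 = {"warehousing": 1000, "construction": 1000, "manufacturing": 1000}
counts_2015 = {"warehousing": 1210, "construction": 1090, "manufacturing": 770}

for sector in counts_2003:
    print(sector, pct_change(counts_2003[sector], counts_2015[sector]))
# warehousing 21
# construction 9
# manufacturing -23
```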

Figure 2 Number of establishments in warehousing and storage industry in the largest eight metropolitan areas in the U.S. (Data sources: County Business Pattern 2003-2015)

Expansion of the logistics industry isn’t limited to Los Angeles. Among the largest eight metropolitan areas in the US, the number of warehousing establishments increased by at least 20% in six of them: Los Angeles, Chicago, Dallas, Houston, Philadelphia and Miami (see Figure 2). The growth rate in Houston reached as high as 40%. The spatial expansion of warehouses is especially dramatic in metropolitan areas with abundant cheap suburban land. Warehousing developers favor this type of land as it offers many conveniences for warehousing development: low rent, large parcels, weak regulations, and good regional connections.

What impacts can warehouses have on communities?

The increasing number of warehousing facilities not only consumes large tracts of land but also brings substantial environmental externalities. Freight trucks generate air pollution, noise, pavement damage, and traffic safety threats as they move into and out of warehouses.

According to studies in public health and traffic engineering, a truck creates significantly greater environmental impacts than a passenger vehicle. Exposure of local residents, especially children and the elderly, to truck-related emissions such as NOx and particulate matter can cause health problems including asthma and respiratory allergies.

A street view in the City of Carson where trucks (right) occupy all road lanes next to a residential neighborhood (left) (Photo: Quan Yuan)

Roads filled with semi-trucks are a familiar sight in areas and neighborhoods with warehouses, a sign of the impact that frequent truck movements can have on local communities. More and more residents are becoming aware of the externalities associated with warehousing activities, and some have organized to fight the siting of new warehousing projects. For instance, the World Logistics Center, a major warehousing project under review in the City of Moreno Valley, is opposed by local resident groups, environmental advocates, and public agencies including the South Coast Air Quality Management District. This huge project, with floor space totaling around 40 million square feet, raises concerns about the environmental risks associated with substantial truck movement.

Do some neighborhoods receive more warehousing facilities than others?

Figure 3 Spatial distribution of warehouses and two selected types of neighborhoods in the Los Angeles region (Data sources: Costar, Inc.; American Community Survey 2010)

Given that warehousing facilities are regarded as locally undesirable, an important question arises: are they disproportionately distributed? Unfortunately, the answer is yes. My recent analysis of warehousing locations in Los Angeles revealed that low-income and medium-income minority neighborhoods contain the vast majority of warehouses and distribution centers (see Figure 3). Apart from traditional industrial clusters in East LA and the Gateway Cities, suburban neighborhoods in the Inland Empire are rising hotspots for warehousing development. Econometric model results confirm the spatial patterns: minority neighborhoods receive significantly more warehouses than white neighborhoods, even after controlling for household income, land rent, and many other variables. The empirical evidence implies a classic environmental justice problem.

But why? Warehousing developers search for locations with low land rent, a low-wage labor pool, weak political power, and favorable public policies. Economic, sociopolitical, and institutional factors all play a role in these dynamics. When local authorities are indifferent to warehousing development, minority residents may be unable to resist this spatial inequity—the unequal spatial distribution of warehouses.

This environmental justice problem is drawing the attention of the public, academia, and policymakers. Land use regulations, environmental standards, vehicle fleet upgrades, and mitigation techniques (such as using plants as buffers) are all potential options for alleviating the problem. As warehouse development continues to grow, let's take this environmental justice issue seriously and come up with feasible solutions that stop burdening our minority communities with air pollution.

Quan Yuan is a Ph.D. candidate in Planning and Policy Development at the Sol Price School of Public Policy, University of Southern California. His research interests mainly lie in urban transportation planning, freight, parking, and environmental sustainability.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Comments Needed Now! The Trump Administration Might Revoke Vital Beryllium Protections

Quick recap: I have previously written (here, here) about OSHA’s efforts to delay implementation of its final, protective standard for workers exposed to beryllium. The delays follow decades of work, a lengthy rule-making process, and solid scientific evidence. Despite this, in June and in response to pressure from the construction and shipyard industries, OSHA decided to (once again) solicit stakeholder comments on whether its final beryllium rule should extend protections to workers in these two industries. (Like there hadn’t already been ample time and opportunity—but I won’t go there.)

SOS! The comment period closes on Monday, Aug 28th.

As many workers and families know all too well, beryllium is a very dangerous material. It’s a carcinogen and the cause of chronic beryllium disease, a devastating illness. There’s no real rescue from this slow, incurable, and often fatal lung disease.

Please take a minute to send in a comment and let OSHA know that weakening this protection for construction and shipyard workers IS NOT OK. They should not do it. These workers need and deserve this protection.

Comments must be received by Monday, August 28, at 11:59 pm. Here is information on how to submit your comment.

You must include the line below that lists the agency name and the docket number. (You can put it in the RE: line.)

I am submitting this comment on OSHA Docket No. OSHA-H005C-2006-0870, Occupational Exposure to Beryllium and Beryllium Compounds in Construction and Shipyard Sectors.

To submit your comment, go here and click on the “Comment Now” box in the upper right corner. You can also fax the OSHA Docket Office: 202-693-1648

Here is some possible language, though I encourage you to add and make it your own.

Docket Office
Occupational Safety and Health Administration
Docket No. OSHA-H005C-2006-0870
Room N-3653
U.S. Department of Labor
200 Constitution Avenue NW
Washington, DC 20210

I strongly oppose every provision of OSHA’s new proposal that revokes the ancillary provisions for the construction and shipyard sectors that OSHA had already adopted on January 9, 2017. The Agency must not revoke any of the provisions promulgated in the final rules on January 9, and it must ensure that the full standards are implemented as published. Adopting any of the provisions in this new proposal would lead to more death and disease among exposed shipyard and construction workers. I strongly oppose this proposal and its aim of creating a two-tiered system of protection for workers exposed to beryllium. OSHA must move forward and implement the rules as promulgated.


Your Name Here

Sign and date

Public comment is a critical component of our democracy. Please take a moment to weigh in on this one. And remind OSHA that their FIRST (and statutory) priority is the protection of workers’ health and safety, not the protection and preference of the industry in question.