Combined UCS Blogs

The Senate Should Oppose the New Low-Yield Trident Warhead

UCS Blog - All Things Nuclear (text only) -

This week, the Senate Armed Services Committee will take its turn to mark up the FY 2019 National Defense Authorization Act (NDAA). This also gives it an opportunity to weigh in on the Trump administration’s proposal for a new, lower-yield warhead for the Trident D5 submarine-launched ballistic missile (SLBM), funding for which is included in the bill.

The new warhead, designated the W76-2, would reportedly have a yield of 6.5 kilotons and would replace some of the W76 warheads currently on the Trident missiles, which have a yield of 100 kilotons.

The NDAA as it is now written would authorize $88 million in spending for the new warhead: $65 million from the Department of Energy’s National Nuclear Security Administration’s budget and $23 million in Department of Defense funds. The House Armed Services Committee earlier this month voted along party lines to reject an amendment that would have eliminated funding for the program from its version of the bill.

Despite the administration’s rhetoric about the need to strengthen deterrence, there is no good reason to develop a new warhead. As the head of the US Strategic Command, General Hyten, said himself in Congressional testimony earlier this year, “I have everything I need today to deter Russia from doing anything against the United States of America.” Worse, as many experts have pointed out, the new warhead could cause confusion for Russia and potentially increase the chances of miscalculation leading to an escalating nuclear exchange. Former Secretary of Defense William Perry has called such low-yield weapons “a gateway to a nuclear catastrophe.”

Opposition to this new program may be stronger in the Senate than in the House. It is certainly ripe for debate, given the dangers it presents and the questionable rationale the administration has put forward for it. To help make the case, more than twenty NGOs sent a letter to Senate Majority Leader Mitch McConnell that lays out the arguments against a new lower-yield Trident warhead.

It is unlikely that the Senate, in its current configuration, will stop the program, but it is important at the very least to ask the relevant questions about why we need such a weapon (we don’t) and how it would really affect US security (by decreasing it).

If You Can’t Censor It, Bury It: DOI Tries to Make a Stark New Study on Rising Seas Invisible

UCS Blog - The Equation (text only) -

Cape Lookout National Seashore, North Carolina. Photo: NPS

A new National Park Service (NPS) report is unequivocal that human-caused climate change has significantly increased the rate of sea level rise that is putting coastal sites at risk. But the study is difficult to find on the web and the report’s lead author, Maria Caffrey of the University of Colorado, says she had to fight to keep many scientific statements about climate change in the final version.

The report, Sea Level Rise and Storm Surge Projections for the National Park Service, was published late on Friday, May 18, with no official announcement or accompanying press release – indeed, with no easy way to find it unless you know where to look (hint: it’s here…tell your friends). The report has been several years in the making, and was delayed for several weeks after a draft showing edits that removed mentions of human-driven climate change emerged and was reported in The Reveal. In the wake of these revelations, Department of the Interior (DOI) Secretary Ryan Zinke was questioned about the changes by House Democrats Chellie Pingree (Maine) and Betty McCollum (Minnesota) in a House Appropriations subcommittee soon after the controversy broke in April. Asked by Pingree about the report, Zinke responded: “If it’s a scientific report, I’m not going to change a comma.”

Since then, the references to human-caused climate change and climate attribution that had been proposed for deletion have been restored. What we now have in the public domain, at last, is a hugely important and detailed analysis of how projected future sea levels and storm surges may impact 118 US national parks. The findings are quite dramatic.

Dozens of US parks at risk from flooding and inundation

The report identifies dozens of famous and iconic sites as especially vulnerable, including Virginia’s Historic Jamestowne and Assateague Island, Big Thicket National Preserve in Texas, the Florida Everglades, and Jean Lafitte National Historic Park in New Orleans. Several of the sites at risk were also identified by the Union of Concerned Scientists (UCS) in its 2014 report “Landmarks at Risk”, which built on previous NPS climate impacts research. Nationally, the new analysis shows that the highest average rate of sea level change by 2100 is projected for the National Capital Region, which puts sites on the Potomac River and in and around the National Mall at risk.

Simulation of flooding from a category three hurricane striking Theodore Roosevelt Island, Washington DC. Credit: NPS

The highest total sea level rise by the end of the century is expected along the coastline of the Outer Banks, threatening Wright Brothers National Memorial, Fort Raleigh, and Cape Hatteras, and the broader Southeast Region is expected to see the highest storm surges in the future. National parks on Caribbean and Pacific islands are at risk too, including in Puerto Rico and the US territories of Guam, American Samoa, and the US Virgin Islands.

Parks must plan for worse storms & floods

Using Intergovernmental Panel on Climate Change (IPCC) sea level rise scenarios and National Oceanic and Atmospheric Administration (NOAA) data, the report also looks at how increased rates of sea level rise will interact with increasing hurricane intensity to worsen storm surges. When Hurricane Sandy hit the East Coast in 2012, storm surge caused widespread flooding throughout the region. That storm surge rode in on seas about 12 inches higher than in the pre-industrial period, due primarily to warming oceans and melting land ice. Further analysis found that sea level rise added $2 billion to the damages from Hurricane Sandy in New York City. According to the NPS, Sandy caused in excess of $370 million in damage to national parks. The costs of 2017’s Hurricanes Harvey, Irma, and Maria to America’s parks have not yet been fully tallied, but will be large.

The authors of the new report recommend that, because hurricanes are likely to intensify, park managers base their planning on the impacts of storms at least one category higher than any storm that has previously hit their particular park unit. According to the report, “When this change in storm intensity (and therefore, storm surge) is combined with sea level rise, we expect to see increased coastal flooding, the permanent loss of land across much of the United States coastline, and in some locations, a much shorter return interval of flooding”. A suite of detailed storm surge maps for 54 sites has been posted on the NPS Coastal Adaptation page on Flickr.

Flood projection for a category 3 hurricane at high tide, Boston Harbor Islands, Massachusetts. Credit: NPS

A win for science and scientific integrity. This time.

The new NPS sea level rise analysis and storm surge maps represent a huge leap forward in terms of the tools that park managers, especially in some of the more remote locations, have available to them to assess the vulnerability of sites, and prioritize planning for resilience. It builds on a growing body of policy- and management-relevant climate science that the NPS’s Climate Change Response Program has been developing over the last decade. This work continues to keep the US at the cutting edge of international efforts to understand and manage climate impacts on cultural and natural heritage, and protected areas. It’s a pity that the DOI seems to be doing everything it can to make this report invisible, and that some of the climate scientists involved had to fight so hard to maintain the scientific integrity of their work.

After the study was published, report author Maria Caffrey told journalist Elizabeth Shogren that the fight will have been “worth it if we can uphold the truth and ensure that scientific integrity of other scientists won’t be challenged so easily in the future…”. For the sake of our treasured national parks, and the dedicated staff who look after them, let’s all say Amen to that.

Did EPA Consult With The Chemical Industry While Working To Suppress A Scientific Study On PFAS?

UCS Blog - The Equation (text only) -

Today, members of the House Committee on Energy and Commerce sent a letter to EPA requesting more information about a meeting with an industry trade group, the American Chemistry Council (ACC), attended by Richard Yamada, the Deputy Assistant Administrator for the Office of Research and Development.

The letter and subsequent reporting (paywalled) are based on additional documents obtained by the Union of Concerned Scientists through a Freedom of Information Act request last month. EPA subsequently took down those documents, in an action similar to what happened with some of our other public records requests.

POLITICO reports:

Top House Democrats are raising concerns about a meeting between one of EPA Administrator Scott Pruitt’s top aides and representatives of the chemicals industry one day after a White House official raised alarm about a study of contaminants that has been stalled for months.

The American Chemistry Council represents companies that could face more expensive cleanup requirements if the HHS study were finalized, and the trade group appears to have had the ear of a top EPA official when it was being discussed internally, the House Democrats said.

A meeting titled “ACC Cross-Agency PFAS Effort” appears on the Jan. 31 calendar for Richard Yamada, EPA’s deputy assistant administrator for research and development. The calendar was obtained by the Union of Concerned Scientists under the Freedom of Information Act and cited by the Democrats in their letter to Pruitt Monday. One day earlier, Yamada and other EPA officials had received an email from the White House seeking to delay publication of the health study poised for release by HHS that would have increased warnings about certain PFAS chemicals.

A former staffer for the anti-science chairman of the House Committee on Science, Space, and Technology, Yamada attended a meeting with the ACC to discuss EPA’s cross-agency efforts to address PFAS. As we chronicled in 2015, the ACC has a history of obstructing stronger science-based public health protections from harmful chemicals and has frequently used tobacco industry tactics to pressure policymakers. An ACC spokesman confirmed the meeting to POLITICO but said that the suppressed PFAS study (also discovered by a UCS public records request) was not discussed.

The meeting, which occurred on January 31, was held the day after the now infamous “public relations nightmare” email was sent by an unnamed White House staffer.

The letter from members of the House Energy and Commerce Committee is the latest in a string of oversight letters related to the potential suppression by the White House and EPA of a key health assessment that is being conducted by the Agency for Toxic Substances and Disease Registry. Late last week, Representatives Brendan F. Boyle and Brian K. Fitzpatrick led another bipartisan letter demanding the release of the ATSDR study on the human health effects of PFAS chemicals.

Tomorrow, EPA is convening a national summit to discuss PFAS and the issues that states and communities are facing around the country. Unsurprisingly, one of the scheduled speakers is Jessica Bowman, an ACC attorney, who is talking first thing in the morning. Before a story in The Intercept, EPA had failed to invite any community organizations or members to attend; after the reporting, however, EPA invited Andrea Amico, founder of Testing for Pease.

It remains unclear whether the press will be able to attend, and according to the summit website, it appears as though the public can only view parts of the meeting online. Hopefully, though, the agency will use tomorrow’s meeting as an opportunity to commit vital resources and concrete next steps to help remove these toxic chemicals from our environment.

Shareholders Not Playing Games at Big Oil Annual General Meetings

UCS Blog - The Equation (text only) -

Major fossil fuel producers are holding their annual general meetings (AGMs) this month amid mounting pressure from investors, increasing risks of legal liability for climate damages, and heightened scrutiny of their lobbying and public policy advocacy. BP and Royal Dutch Shell host their AGMs this week; ExxonMobil and Chevron will follow next week.

If shareholder meetings were classic game shows, and investors were keeping score, fossil fuel companies would be coming up short.

Investors demand Truth or Consequences about a 2°C world

Investors with a combined total of more than $10 trillion in assets under management are demanding that major oil and gas companies demonstrate support for the Paris Climate Agreement. In an open letter published in the Financial Times, fund managers including Aberdeen Standard Investments, BNP Paribas Asset Management, Fidelity International, and HSBC Global Asset Management Ltd. urged companies to be more transparent about how they are planning for a world in which global temperature increase is kept well below 2 degrees Celsius (2°C).

Specifically, the signatories called on oil and gas companies to make tangible commitments to reduce their carbon emissions substantially, consider the impact of emissions from the use of their products, and clarify how their investments are consistent with a 2°C world. (Check out this new report by Carbon Tracker and my recent blog highlighting these and other unanswered questions in ExxonMobil’s and Chevron’s reports on their plans for a 2°C world).

At tomorrow’s AGM, Shell faces a shareholder resolution by the Dutch organization Follow This calling on the company to set and publish targets that are aligned with the Paris Climate Agreement’s well below 2°C goal. Similar resolutions have received small but growing support from Shell shareholders over the past two years.

The United Kingdom responsible investment charity ShareAction notes that Shell’s announced ambition to reduce its carbon footprint is not a target, is not aligned with the goals of the Paris Climate Agreement, and would allow the company’s absolute emissions to increase. ShareAction therefore encourages investors who publicly support the Paris Climate Agreement to vote in favor of the Follow This resolution.

Although BP shareholders did not vote on any climate-related proposals this year, global warming was nonetheless a hot topic at the company’s AGM. As I observed in a Twitter thread about BP’s annual report, the company is wisely investing in renewables like wind and solar—but it is also upping natural gas production, and its strategy does not align with the Paris Climate Agreement’s well below 2°C goal or meet the expectations set out in the investor open letter.

(Legal) Jeopardy: Climate liability lawsuits gain momentum

One thing BP failed to mention in its annual report was the rising tide of climate liability litigation. (King County, Washington, filed the most recent lawsuit against BP and other major fossil fuel producers, seeking to hold them accountable for “knowingly contributing to climate disruptions” and putting residents “at greater risk of floods, landslides, ocean acidification, sea level rise, and other impacts.”)

BP’s omission is particularly glaring in light of the recommendations issued last year by the Task Force on Climate-Related Financial Disclosures (TCFD) calling for consistent, comparable, and timely disclosures of climate-related risks and opportunities in public financial filings. In stark contrast, several of BP’s peers—including Shell, ConocoPhillips, and Peabody Energy—explicitly mentioned lawsuits filed by US municipalities as a shareholder risk. Peabody Energy did so despite a court ruling (appealed by the plaintiffs) that the company is shielded from liability by its bankruptcy filing. Shell may be sued in the Netherlands if it fails to align its business model with the Paris Climate Agreement.

This week, a federal judge in California will hear oral arguments on the fossil fuel companies’ motion to dismiss the lawsuit brought by San Francisco and Oakland over sea level rise driven by climate change. Communities across the country and around the world that are struggling with the enormous costs of climate damages and adaptation will be closely watching the ruling in this case.

Let’s [Not] Make a Deal with industry groups on climate change

There have been significant advances in fossil fuel company transparency about climate lobbying since last year’s AGMs. ConocoPhillips has expanded its disclosures of lobbying and other public policy advocacy following dialogues led by Walden Asset Management, and in response to shareholder resolutions that won substantial support in recent years—as well as recommendations in UCS’s Climate Accountability Scorecard. Valuable updates to the company’s website include an explanation of board and senior management oversight of lobbying, details on lobbying priorities and grassroots lobbying, and easily accessible information on lobbying expenditures.

This move by ConocoPhillips followed a report by BHP Billiton Limited on the material differences between the company’s positions on climate and energy policy and the advocacy positions on climate and energy policy taken by industry associations to which it belongs. BHP took action based on its review, severing ties with the World Coal Association. ConocoPhillips, BP, and Shell have all left the American Legislative Exchange Council (ALEC) in recent years, with Shell explicitly citing ALEC’s stance on climate science as the reason for its departure.

However, inconsistency between major oil and gas companies’ stated positions on climate change and those taken by their trade and industry groups remains a serious concern for investors. BP, Chevron, ConocoPhillips, ExxonMobil, and Shell all maintain leadership positions in trade associations and other industry groups that spread disinformation on climate change.

  • All five companies are represented on the board of the American Petroleum Institute (API)—notorious for its 1998 memo outlining a roadmap for climate deception, claiming that “victory will be achieved when…average citizens ‘understand’ (recognize) uncertainties in climate science.” API was warned about global warming as early as 1959.

  • BP, ConocoPhillips, ExxonMobil, and Shell are represented on the board of the National Association of Manufacturers (NAM), and Chevron is a NAM member. The NAM website is virtually silent on climate change, downplaying the issue even as NAM’s Manufacturers’ Accountability Project (MAP) attacks climate scientists and communities aiming to hold fossil fuel companies accountable for climate damages attributable to their business.

Following last week’s ConocoPhillips AGM, I received a vague response to a question I posed about what the company is doing to ensure that the public policy positions of NAM, API, and the US Chamber of Commerce are aligned with its own and environmentally responsible. To be a leader on transparency, ConocoPhillips ought to provide examples of what the company considers a “reasonable compromise” with industry groups on climate change—and explain how such dealmaking squares with its own climate policy priorities.

Shareholders are raising similar questions with BP and Shell about their leadership roles in NAM, API, and the Western States Petroleum Association (WSPA).

And next week, Chevron and ExxonMobil shareholders will vote on proposals for annual reporting on direct and indirect lobbying and grassroots lobbying communications. The filers of the ExxonMobil resolution, led by the United Steelworkers of America, are urging shareholders to vote yes. The proponents highlight a “Trade Association Blind Spot,” pointing out that ExxonMobil does not disclose its trade association memberships, trade association payments, or the portions of those payments used for lobbying. Arguing in favor of the resolution, the 26 co-sponsors “remain concerned that inadequate state lobbying and trade association disclosure by ExxonMobil presents significant risks to the corporation’s reputation.”

I always hated The Gong Show, but it would be tremendously satisfying to see the public and investors gong Big Oil CEOs off the stage if they continue their pathetic performance on climate change—failure to plan for a carbon-constrained world, failure to disclose climate risk, failure to renounce climate deception.

©corlaffra/Shutterstock.com

Closing North Korea’s Nuclear Test Site

UCS Blog - All Things Nuclear (text only) -

Of the surprising announcements North Korea has made in recent weeks, one of the most surprising was its statement that it would not only end nuclear tests but shut down its nuclear test site with international observers watching.

What should we make of this?

Pyongyang said it would allow journalists from the United States, Russia, Britain, and South Korea to watch the destruction of the tunnels at Punggye-ri sometime in the coming week (May 23-25). These tunnels dug into the mountain are where North Korea conducts its nuclear tests. US intelligence says that North Korea is already dismantling the test site, and satellite photos of the site (here and here) confirm that a number of facilities at the site have already been torn down.

Punggye-ri Test Site (Source: Google Earth)

If North Korean leader Kim Jong-un is serious about limiting and perhaps eventually eliminating his nuclear and missile capabilities in return for economic engagement with the outside world, the question is how he demonstrates that seriousness. Publicly shutting down his test site is a meaningful step in the right direction and an interesting way to try to send that message.

It’s true that shutting down the Punggye-ri test site does not prevent North Korea from ever testing again. If negotiations fail or circumstances change in the future, it could decide to tunnel at a different site and build the infrastructure needed to test. But it’s a meaningful and pretty dramatic action nonetheless.

For one thing, while part of the current test site is no longer usable because some tunnels collapsed after previous tests, experts agree that a couple of tunnels at the site remain usable. They also agree that disabling the facilities would take time to reverse—perhaps months or longer.

This reminds me of North Korea’s decision in 2008 to disable its nuclear reactor at Yongbyon by blowing up the cooling tower and letting foreign reporters film the event. This was at a time when negotiations with the United States seemed to be moving ahead. A few years later, after negotiations had stalled, Pyongyang built a new cooling system and was able to restart the reactor. But disabling the reactor was still a meaningful action, since it kept the reactor from operating for several years.

What’s next?

North Korea’s statements last week raised the possibility that Kim was walking back his various offers. Yet Kim’s criticism was focused on statements by John Bolton and others about the need for the North to denuclearize as an early step of negotiations. This is an approach Pyongyang has consistently rejected, calling instead for a step-by-step process that helps build the trust needed for additional steps.

President Trump’s subsequent statement disavowing this so-called “Libyan model” of disarmament seemed intended to help repair the situation, but his later statement that appeared to threaten destruction of North Korea if talks failed could have exactly the opposite effect and lead Kim to cancel or delay the talks. In the meantime, China has urged Pyongyang to continue with the talks.

So whether or not the summit will proceed as planned remains uncertain. An important indicator will be whether North Korea goes ahead with destroying tunnels at its test site this week.

Wait—Offshore Wind Offers HOW Much Power? Use This Calculator…

UCS Blog - The Equation (text only) -

Credit: Derrick Z. Jackson

Almost every week is bringing news about another step forward somewhere in the country for America’s newest renewable energy, offshore wind. Increasingly, the news is about advances for specific projects off our shores.

But when we hear about an offshore wind project of a certain size—X hundred megawatts—what does that mean? What does it mean in terms of our electricity needs, for example, or our need to cut pollution, or our potential to do more?

A simple new calculator from the Union of Concerned Scientists can help you size up each offshore wind project.

What Would an Offshore Wind Project Mean?

[Embedded interactive calculator (Tableau visualization)]

The inputs

Here’s the deal: When you hear about a proposed offshore wind farm, the project size is likely to be expressed in terms of megawatts—its nominal capacity/power output, based on the rating of each wind turbine at a given wind speed.

Credit: J. Rogers

How many turbines that proposed project will involve depends on the capacity of each individual turbine (also expressed in megawatts). That math isn’t complicated.

How much electricity an offshore wind project’ll generate is a little more complicated, depending mostly on where the turbines will be—what kind of wind resource the turbines will have access to. That’ll vary by state, and even within a given coastal area.

But with a few simplifying assumptions and estimates, you can get ballpark figures for what the project will mean in terms of the energy generation/production, the benefits it will provide such as avoided carbon emissions, and the area it will occupy.
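Here is a minimal sketch of that ballpark math. The numbers are placeholders rather than any particular project, and the capacity factor is an assumed value; the calculator itself uses state-specific figures, listed in “The technical stuff” at the bottom of this post.

# Ballpark math behind an offshore wind project (illustrative placeholder values)
project_mw = 400        # nominal project capacity, in megawatts
turbine_mw = 8          # rating of each individual turbine, in megawatts
capacity_factor = 0.45  # assumed share of nominal output delivered over a year

turbines = project_mw / turbine_mw                 # 50 turbines
annual_mwh = project_mw * 8760 * capacity_factor   # roughly 1.6 million megawatt-hours per year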

The outputs

Put in those few inputs (state, project size, turbine size), and the new tool can tell you something about what we’re likely to get out of the project you’re assessing.

Energy equivalent – The electricity expected from a project can be thought of in terms of the number of household equivalents it could power. Not actual households, since it takes a mix to make sure we’ve got power ‘round the clock, but how the energy produced matches up with the amount of electricity a typical household uses.

Average household electricity use varies by region and state, based on things like the climate and state energy efficiency efforts. So a given amount will go further in some places than in others.

Pollution reduction – And then there are the air quality benefits of projects. Megawatt-hours of offshore wind generation will displace megawatt-hours of generation from land-based power plants in the region. What an offshore wind electron displaces depends on what’s “on the margin” at a given moment—usually the next-cheapest power source that doesn’t get turned on because offshore wind is doing its thing instead.

If those displaced sources are coal, oil, or natural gas power plants, which will often be the case, the offshore wind power will help us avoid the pollution that those plants would otherwise emit. Avoiding that pollution brings important health and environmental benefits.

This simple calculator focuses on carbon dioxide. And it puts the result in terms of number of car equivalents—what that CO2 pollution reduction would be like in terms of the carbon pollution that comes from a typical car in a typical year, given the US auto fleet and American driving habits.

Leases and lessees – some done, more to come (Source: BOEM 2018)

Lease area potential – In general, the areas most ready for offshore wind projects are in the existing federal leases on the US Outer Continental Shelf off our nation’s East Coast. The federal government, using robust public stakeholder processes (as in Massachusetts), identified various offshore wind lease areas. It auctioned off the leases, and a range of companies won the rights to put up turbines in those areas. There are more than a dozen such leases so far, from North Carolina up to Massachusetts. (And more are on the way.)

Given that, you can think about a project in terms of how much of that state’s existing lease area it’s likely to take up, and how much room it leaves for more offshore wind power.

Using the calculator

To ground all this in (projected) reality, here’s an example for you to try: Let’s imagine a 400 megawatt wind farm off Massachusetts (and at this point in the process that doesn’t require much imagination), and imagine 8-megawatt wind turbines. So:

  1. Click on Massachusetts on the calculator’s map.
  2. Use the sliders or right-left arrows to get to 400 megawatts for the project size.
  3. Pick 8 megawatts for the turbine size.
  4. Check out the results.
    • For number of turbines, you get 50.
    • For number of households whose total energy consumption would match what the project would produce, you’d get something like 230,000.
    • The avoided CO2 pollution would be equivalent to taking some 90,000 cars off the road.
  5. Check out how much—or how little—of the existing Massachusetts lease areas a project like that would use up: 6%.

At the bottom of this post are details about the calculator and calculations.

The scale of things to come: Offshore wind blades, and a sample of the people behind it all (Credit: Derrick Z. Jackson).

More results

Other results from new offshore wind are equally important, but harder to quantify simply at this early stage in the technology’s history in this country. Those include employment and ecosystem effects.

Jobs – A big reason for offshore wind power’s popularity right now is its tremendous potential for job creation, in manufacturing, project development, installation, maintenance, finance. Think welders, pipefitters, electricians, boat crews, and a whole lot more.

And the vision is not just jobs, but careers, as single projects pave the way for multiple tranches that then lead to a whole US offshore wind industry, one big enough to sustain not just projects but all the soup-to-nuts pieces that go along with that when the scale is big enough.

In Europe, the offshore wind industry is 75,000 workers strong. Estimates for US jobs draw on assumptions about how big the American market will get, and how quickly, and what that means for how many of the jobs end up here, instead of overseas. A 2015 US Department of Energy study found that going to 22,000 megawatts by 2030 could mean 80,000 American jobs by that year. A study for various Northeastern states looked at 4,000 to 8,000 megawatts of offshore wind development in the region, and projected full-time equivalent jobs in a given year of up to 36,000.

Proceed, but with caution (Credit: Derrick Z. Jackson).

Ecosystems – The results of an offshore wind farm in terms of our offshore ecosystems depend on the care taken in planning, siting, installing, operating—and, eventually, decommissioning—of the project. Offshore wind’s potential to cut carbon pollution can help reduce the impacts of climate change—including important ones for our oceans and marine ecosystems. But additional activity and infrastructure in the marine environment can have direct impacts that need careful consideration.

One concern is marine mammals, and particularly, on the Eastern seaboard, the critically endangered North Atlantic right whale. Project developers have to be careful to not add to the right whale’s troubles.

For fish, once a project is in place, the bases for the offshore wind towers can be problematic for some species, and a boon for others, as they can act as artificial reefs and change the environment.

Where jobs and fish come together is in the fishing fleet. Results, positive and negative, will depend on things like any limitations on boat travel in the project area during construction, and any boost to fish stocks from the project once it’s installed. While commercial fishers may view projects differently from how recreational ones do, at least some fishers are finding the US’s first offshore wind farm, off Rhode Island’s Block Island, to be a plus (and there’s this upbeat from the University of Delaware and the American Wind Energy Association).

Results in terms of jobs, careers, and our marine environment will be important to keep an eye on.

Technology and people (Credit: Derrick Z. Jackson)

Calculate on

In the meantime, there’s plenty we can know about with greater certainty. With the help of this simple calculator, the next time you hear of an X megawatt offshore wind project destined for a shore near you, you can let it be more than a single number. Look at what it means in terms of energy to be generated, pollution to be avoided, and lease area implicated.

To be clear: an offshore wind calculator is no substitute for the detailed wind monitoring, engineering calcs, environmental assessments, and much more that go into project proposals, investment decisions, and approval processes.

But this one just might help give some more depth for contemplating project announcements as the offshore wind industry takes off in the country. Because, beautiful as offshore wind farms seem to many of us, they’re a lot more than just a bunch of graceful kinetic sculptures.

The technical stuff

  • States – The calculator includes the eight (as of this writing) states for which the US government’s Bureau of Ocean Energy Management (BOEM) has auctioned off leases. South Carolina is working toward joining that club. Projects can also happen in state waters, as with the Icebreaker project planned for Lake Erie waters off Cleveland. The West Coast also has terrific resources, and even the Gulf Coast may get into the act at some point.

    The power on the seas (Source: NREL/Musial et al. 2016)

  • Capacity factors – To calculate electricity production, the calculator uses midpoint capacity factors from the different zones in NREL’s latest offshore wind resource potential assessment (Musial et al. 2016): Delaware (42.5%), Maryland (42.5%), Massachusetts (47.5%), New Jersey (45%), New York (45%), North Carolina (42.5%), Rhode Island (47.5%), and Virginia (42.5%).
  • Household equivalents – The calculator uses the latest figures from the US Energy Information Administration on average monthly electricity use by residential customers in the chosen state. Figures are rounded to the nearest thousand.
  • Avoided CO2 emissions – The calculator uses the average CO2 emission rate for each region, as calculated by the US EPA, and the car pollution figure from EPA’s own equivalencies calculator. Figures are rounded to the nearest thousand.
  • Project areas – Project footprint calculations are based on NREL’s assumption of 3 megawatts per square kilometer (Musial et al. 2016).
  • Lease areas – The lease area calculations for each state are based on the figures from BOEM here. For the two leases in the shared Rhode Island/Massachusetts offshore wind area, the calculator credits those amounts fully to each state; that is, it considers them to be Rhode Island’s when considering a Rhode Island project, and Massachusetts’s when looking at Massachusetts.
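To see how the pieces above fit together, here is a rough sketch, in Python, of the arithmetic behind the Massachusetts example from earlier in the post (400 megawatts, 8-megawatt turbines). The capacity factor and project-footprint values come from the NREL figures listed above; the household-use, emission-rate, car-equivalence, and lease-area numbers are round assumptions for illustration only, not the calculator’s exact inputs.

# Rough sketch of the calculator arithmetic for the Massachusetts example.
# Parameter values marked "assumed" are illustrative round numbers.

PROJECT_MW = 400
TURBINE_MW = 8
CAPACITY_FACTOR = 0.475        # NREL midpoint for Massachusetts (Musial et al. 2016)
MW_PER_SQ_KM = 3               # NREL project-footprint assumption
HOUSEHOLD_KWH_PER_MONTH = 600  # assumed average Massachusetts household electricity use
CO2_LBS_PER_MWH = 530          # assumed regional average CO2 emission rate
CO2_TONS_PER_CAR_YEAR = 4.6    # assumed car-equivalence figure, metric tons per year
LEASE_AREA_SQ_KM = 2200        # assumed total Massachusetts lease area

turbines = PROJECT_MW / TURBINE_MW                                # 50 turbines
annual_mwh = PROJECT_MW * 8760 * CAPACITY_FACTOR                  # ~1.66 million MWh per year

households = annual_mwh / (HOUSEHOLD_KWH_PER_MONTH * 12 / 1000)   # ~230,000 household equivalents
co2_tons = annual_mwh * CO2_LBS_PER_MWH / 2204.6                  # ~400,000 metric tons CO2 avoided
cars = co2_tons / CO2_TONS_PER_CAR_YEAR                           # ~90,000 car equivalents
lease_share = (PROJECT_MW / MW_PER_SQ_KM) / LEASE_AREA_SQ_KM      # ~6% of the lease area

print(f"{turbines:.0f} turbines, {annual_mwh:,.0f} MWh/year, "
      f"~{households:,.0f} households, ~{cars:,.0f} cars, "
      f"{lease_share:.0%} of lease area")

With these assumptions, the sketch lands close to the tool’s results for the example above: 50 turbines, roughly 230,000 households, roughly 90,000 cars, and about 6 percent of the lease area.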
Photo by Derrick Z. Jackson

Now Is the Time To Halt the EPA’s Restrictions on Science

UCS Blog - The Equation (text only) -

If you have been following the news, I am sure you know by now that the EPA is proposing to restrict the science it will consider when developing new or revised health and safety protections. It may seem like a Washington game, but this proposed rule has huge implications for all of us.

For scientists, it means that much of your work may be dismissed out of hand from informing policy because you must adhere to research ethics policies that restrict the release of private data. Or because you can’t and shouldn’t sacrifice intellectual property rights at the whim of the EPA. For industry, it creates greater uncertainty around the always thorny issues concerning confidential business information. And, most importantly, for all of us, the proposal means that policies that protect our health and safety will not be based on the best available science because of inappropriate political interference.

So what can YOU do to fight back? Well, for all the political manipulation that we have been documenting at the EPA, the agency must still adhere to the law when making or changing regulations. That means the EPA must make a proposal public, accept public comments from all who wish to submit them, evaluate and respond to those comments, and then decide on the final version of the rule. And the agency’s actions are subject to challenge in federal court.

That means YOU can submit a comment into the public record that the EPA is obligated to consider. And now is the time! For this proposal, the comment period is only 30 days—and it’s already more than half over. It closes at the end of May (though requests have been made to extend it, so far with no response from the EPA).

How do I make a comment?

The proposed rule is complicated and somewhat confusing. It is misnamed as an action to “strengthen transparency” in the rulemaking process, but it does no such thing. To have an impact, however, your comment needs to be specific and detailed, not just a broad objection to the rule.

To help you better understand the proposed rule, we have produced a guide for commenters. The guide highlights topics for which the EPA is specifically requesting input and some of the issues you may want to consider in making your comment. It also gives you the links for submitting a comment and some suggestions for how to have the most impact.

I want to encourage scientists to include in their comments examples of specific, important scientific studies and evidence that are likely to be excluded if this rule is implemented. For example, the rule proposal says that studies will only be considered if all raw data, computer code, models, and other material in the study are fully publicly available.

On its face, that precludes using studies where personal confidential information is part of the “raw” data. Most Institutional Review Boards require researchers to maintain confidentiality for human subjects data. Are there studies you have been involved in, or rely on in your research, that would be excluded a priori because of this restriction?

One of the reasons it is important to cite specific studies in the record is that the public record will be important in any future legal action. Also, our political leaders are usually not fully familiar with the scientific process. They need specific examples to inform their own views. Scientists: how will your work be impacted? How will community members be affected if certain public health and safety protections are not enacted based on good science?

A week of collective action

A coalition of groups including 500 Women Scientists, EarthJustice, and the Public Comment Project are joining forces to mobilize as many public comments as possible during the week of May 20-26.  This coordinated action—the National Week of Public Comments on EPA’s “Restricting Science” Policy—is part of the overall effort of Science Rising, which is working to defend science and its crucial role in public policy and our democracy more broadly. You can participate by sending in your comment and letting us know that you did.

This is still our government, our democracy, and our voices need to be heard.

A Response to Roberts and Payne

UCS Blog - All Things Nuclear (text only) -

A recent letter by Bradley Roberts and Keith Payne responds to a Japanese press account of a blog post that discussed Japanese Vice Foreign Minister Takeo Akiba’s 25 February 2009 presentation to a US congressional commission on US nuclear weapons policy. Reports of Mr. Akiba’s presentation created some controversy in the Japanese Diet, since he may have made statements that contradict the spirit, if not the letter, of a long-standing Diet resolution. That resolution, adopted decades ago and reaffirmed many times since, prohibits any transportation of US nuclear weapons into Japanese territory.

The 1969 US-Japan agreement granting the United States “standby retention and activation” of nuclear weapons storage sites on US military bases in Okinawa.

Roberts and Payne mistakenly claim the document on which the post was based does not exist, despite the fact that it was published on the website of a non-governmental Japanese arms control expert more than a month before their letter appeared in the Japan Times.

The document exists.

Roberts and Payne also claimed that because the Japanese participants were “off-the-record” no records were kept. This too is incorrect. There may be no transcript of Mr. Akiba’s presentation, but an April 10 reply by the cabinet to questions from Rep. Seiji Osaka confirmed that the Foreign Ministry kept records on the proceedings of the US commission where representatives of the ministry were present. The same reply was repeated in a document issued on April 13 by the Security Treaty Division of the North American Bureau of the Ministry of Foreign Affairs. The United States Institute of Peace (USIP) also archived documents that describe the discussions between the commissioners and the Japanese officials.

Records were kept.

Meetings are often held “off the record” to allow public officials to express their personal opinions. Rep. Osaka asked the Abe government whether the Foreign Ministry officials who participated in the proceedings of the US commission were acting in a personal or an official capacity. The April 10 reply by the cabinet confirmed that all of the Japanese officials who participated in the proceedings were acting in an official capacity under the direction of Foreign Minister Nakasone.

The three-page document Akiba presented to the US commission is therefore an official record of the Japanese government’s views on the role of US nuclear weapons in the defense of Japan. So are any oral statements Akiba and the other Japanese officials gave to the commission.

Some of those oral statements were recorded in hand-written notes on the margins of the document. Those notes contain an abbreviated rendition of a conversation between Akiba and James Schlesinger in which the Japanese minister gives a favorable response to Schlesinger’s question about building nuclear weapons storage facilities in Okinawa. Roberts and Payne recall the conversation. They note that Akiba “clearly set out the three non-nuclear principles,” which the Japanese official does in the hand-written notes on his conversation with Schlesinger. Yet Roberts and Payne neglected to mention Mr. Akiba also said that “some quarters talk about revising the third principle,” which would be necessary if the United States were to bring nuclear weapons into Japan or prepare to store them in Okinawa.

The language in the hand-written notes makes it difficult to assess whether Mr. Akiba is among those who want to revise the third principle. But his favorable response to Schlesinger’s proposal to construct nuclear weapons storage sites in Okinawa deserves more careful scrutiny.

Notes are often incomplete and sometimes inaccurate. Memories, especially of a conversation that took place nine years ago, can be faulty. One way to help clarify this matter is for the United States Institute of Peace (USIP) to release the Foreign Ministry from its promise of confidentiality and encourage the ministry to respond to Diet requests for access to its records. USIP should also grant the Diet access to all materials on the proceedings of the commission it may hold in its archives. Greater transparency, from both sides, is the best way to set the record straight.

Here’s Why Seas Are Rising. Somebody Remind the Wall Street Journal.

UCS Blog - The Equation (text only) -

Ice sheets on land in Greenland and Antarctica are melting, adding water to the world's oceans. Photo: NASA

On May 15, the Wall Street Journal published a commentary by Fred Singer which argued that rising sea levels are unrelated to global warming, that they won’t be much of a problem, and that there’s little we can do about them. Singer, whose history of disingenuous attacks on science on behalf of the tobacco, fossil fuel and other industries goes back nearly 50 years, is wrong on all counts.

Singer acknowledges that “sea levels are in fact rising at an accelerating rate,” but then argues that “the cause of the trend is a puzzle.” Perhaps Singer is puzzled as to the causes, but science is crystal clear about this. Worse, we know that without strong policy to limit CO2 emissions, the rising water will continue to accelerate, inundating all the coastal cities of the world.

Fundamentally, there are three reasons why the ocean is rising at an accelerating rate

  1. Adding heat to things causes them to change temperature (1st Law of Thermodynamics)
  2. Seawater volume increases with temperature (thermal expansion)
  3. Adding a volume of water to the oceans from melting land ice causes them to increase in height (conservation of water)

All three of these principles (conservation of energy and mass, and the thermal expansion of water) are bedrock principles of physics which have been established for centuries and can easily be verified by direct observation.

The effect of CO2 on the absorption of radiation has been understood for 160 years.

The effect of rising CO2 on the energy budget of the Earth is directly measured in the laboratory, from towers, from balloons and aircraft, and from satellites. We measure precisely how much extra heat is absorbed globally by CO2 because of burning carbon, all the time. Adding heat to the world causes it to warm up, for precisely the same reason that adding heat to a pot of water on the stove causes the temperature of the water to increase.

When water warms up, it expands

The precise increase in seawater volume with temperature is easy to measure and extremely well known. Nearly all the resulting change in heat content (more than 90%) is in the oceans, where temperatures are measured at all depths by thousands of autonomous instruments floating at different depths. Oceanographers know the three-dimensional temperature and density of the oceans worldwide to amazing precision from these floating sensors. Since 1992, we have also tracked rising sea levels everywhere on Earth by measuring the height of the ocean from space using satellite altimeters. The expansion of the warming seas measured by the floats is completely consistent with the rising surface of the water measured by the altimeters.
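To see how thermal expansion alone adds up, here is a rough back-of-envelope illustration using round, assumed numbers rather than measured values: take a thermal expansion coefficient for seawater of about 2 × 10⁻⁴ per °C, and suppose the top 700 meters of the ocean warms by 0.2°C. The rise in the height of that water column is then roughly

Δh ≈ α × H × ΔT ≈ (2 × 10⁻⁴ per °C) × (700 m) × (0.2°C) ≈ 0.03 m,

or about 3 centimeters of sea level rise from expansion alone, without a single drop of water added to the ocean. The real calculation integrates measured temperature and salinity changes over the full depth of the ocean, but the principle is the same.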

As the world warms, ice sheets on land in Greenland and Antarctica are melting, adding water to the oceans.

Just as we directly measure the effect of CO2 on heat and the effect of that heat on ocean temperatures and sea level, we also have satellite measurements of the volume and mass of the great ice sheets. The height of the ice is measured by radar and the mass is measured by the gravitational pull of the ice itself. These data show precisely how much water from the ice sheets in both Greenland and Antarctica is added to the oceans each year. The total rise in sea level is completely consistent with the additions from land ice and ocean expansion, all of which are precisely measured all over the Earth and to the bottom of the oceans.

Sea levels are rising faster and faster because every bit of coal, oil, and gas we burn adds to the CO2 in the atmosphere, absorbing more of the Earth’s radiant heat, and contributing more to the thermal expansion of seawater and the loss of land ice. This is not a mystery. It’s extremely well understood and documented by millions of direct measurements.

Without strong policy, coastal cities will be inundated and abandoned

The oceans will continue to rise faster and faster unless the world implements very strong policy to quickly reduce and eventually eliminate the burning of fossil fuels. Depending on how quickly these policies are put in place, seas will rise between one and eight feet by 2100, according to a 2017 report from the federal government, released under the Trump Administration. Without strong policy, coastal cities in the US and around the world will be inundated and abandoned.

Rising oceans are but one devastating consequence of inexorable global warming caused by burning fossil fuels. Luckily, it’s not too late to prevent the damage to the world and our economy. Nearly all the world’s nations have agreed to limit warming by cutting emissions. Maybe somebody should tell Fred Singer.

NASA

Five Things You Should Know About EPA’s Proposed Giant Step Backward on the Safety of Chemical Facilities

UCS Blog - The Equation (text only) -

Members of the Kentucky National Guard receive a brief on extracting the mock injured and wounded during the early stages of their external evaluation at Muscatatuck Urban Training Center in Butlerville, Ind., May 23. The purpose of the exercises and evaluation is to prepare the Kentucky Guard’s chemical, biological, radiological, and nuclear (CBRN) teams to respond to such attacks and disasters. Photo: Spc. David Bolton, Public Affairs Specialist, 133rd Mobile Public Affairs Detachment, Kentucky Army National Guard/CC BY 2.0 (Flickr)

As one of his first acts in office, EPA Administrator Scott Pruitt decided to put on hold the implementation of new regulations to improve the safety of chemical facilities around the country. Those regulations, finalized in 2017, called for consideration of safer technologies, better information for communities and first responders that are on the front lines of accidents and other incidents, better planning for accidents and disasters, and improvements in response capabilities including coordination and practice sessions with local first responders. These changes were made to update the so-called Risk Management Plan rule, last significantly modified in 1996.

Now, the EPA has proposed a new rule, modifying the 2017 regulations without ever implementing them. The new proposal, soon to be published in the Federal Register and open for a 60-day public comment period, basically rescinds all of the new requirements, with a few minor exceptions, and takes us back to 1996 at best. The justification from Pruitt’s EPA is that industry will save $88 million if companies don’t have to do these things. Rolling back these critical protections, in the wake of a devastating hurricane season that demonstrated the need for increased planning for these chemical facilities and after 43 reported incidents at chemical facilities since the rule was initially delayed, demonstrates a lack of leadership and commitment to public health at the EPA.

The short summary is that Pruitt’s EPA has eliminated or weakened every provision of the rule that protects fenceline communities and workers. The justification is a possible $88 million saving in compliance costs, which comes at the expense of immense public health and safety benefits to communities, benefits that were not calculated in the proposal.

When the public comment period opens, the EPA will hold exactly one public hearing to receive input in addition to written comments. That hearing will be at EPA headquarters in Washington, DC, not in any of the communities affected by the risks of chemical facilities, like Houston, TX, and Wilmington, DE, and it is frankly out of reach, in terms of cost, for most grassroots or local organizations. That’s a shame. It also means that the written comments submitted to the EPA are all the more important, as the delay of the previous rule, and certainly this new proposal if it is finalized, are being challenged in court, including by the Union of Concerned Scientists.

So here are five things you should note as you consider commenting on the new EPA proposal.

  • The 2017 rule required chemical facilities to evaluate and consider safer technology and alternatives defined by the EPA itself as “a variety of risk reduction or risk management strategies that work toward making a facility and its chemical processes as safe as possible.” Seems reasonable that these should be considered by facilities everywhere to reduce risks to workers, communities and first responders. The idea is to reduce the risks with safer alternatives before an accident or disaster takes place. The preventive medicine of the chemical facility so to speak. The new proposal completely eliminates this requirement for facilities to look at preventative, safer alternatives. The justification for the rollback was the costs to industry, without any consideration of benefits to the public or to the mission of the EPA (to serve the public interest).
  • Prior to the new rules set in 2017, it was nearly impossible to get much information about what chemicals were being held at a facility in a timely and regularly updateable way. To obtain any information, you had to prove you lived in the neighborhood around the facility and go to a special EPA reading room when it was open; if it was available, you were not allowed to use a copier, computer, or scanner, and you couldn’t take anything away. The 2017 rules eased these restrictions somewhat by allowing communities to ask for information and requiring companies to be forthcoming in a timely way. The new proposal eliminates that option. It goes back to a system where the public, including first responders, have little or no information in case of a chemical disaster or emergency chemical release in their neighborhood.
  • Prior to 1996, chemical facilities could leave most of the response capability for accidents and disasters up to the local government, with the cost borne by local taxpayers, not the company. That burden was only partially shifted in 2017 with greater participation and coordination requirements put on companies to work with local government and groups. The new proposed rule takes a step back again and weakens those requirements, though there would be some requirement for joint exercises to practice responding to an accident every few years. And they propose eliminating the requirement to report on the results of those exercises to improve performance.
  • Under the 2017 rules, when an accident occurred, an incident analysis would be required along with an analysis of the causes of the incident. Now Pruitt’s EPA is eliminating that requirement to analyze and report on accidents and their causes and make that information available to the community.
  • And, in 2017 the rules required the industry to hire third-party independent auditors to evaluate compliance with the rules and to investigate problems. The EPA is now proposing to eliminate that requirement and continue to allow companies to audit themselves.

Should you submit a comment? Yes! Because this proposal makes all of us less safe. It is simply unacceptable that we cannot do a better job of preventing and responding to the thousands of chemical accidents that occur every year in this country.

Sonny Perdue’s USDA Is in Bed with Big Pork. That’s Really Bad for Everyone Else.

UCS Blog - The Equation (text only) -

North Carolina hog barns with waste lagoons. Photo courtesy Waterkeeper Alliance Inc./Flickr

In his first year running the US Department of Agriculture, Secretary Sonny Perdue has displayed a curious tendency to say things he really shouldn’t. The most recent example is his striking off-the-cuff comment about a big court judgment won by neighbors of a massive hog farm and its stinking cesspools in North Carolina. Perdue told reporters he was not familiar with the case, in which a US District Court jury leveled a landmark $50 million verdict against Murphy-Brown LLC, a subsidiary of pork giant Smithfield Foods. But that didn’t stop him from calling the jury’s decision “despicable.”

Secretary Perdue’s alignment with big corporate interests over the public interest has been clear for a while. But his knee-jerk reaction to this case, along with related pending actions at his USDA, suggests that he is willing to throw workers, farmers, rural residents, consumers, and clean air and water overboard to protect Big Pork’s bottom line.

“Nuisance” is putting it mildly

When the jury in the Murphy-Brown case (a so-called “nuisance” suit filed on behalf of a group of 10 neighbors) handed down its decision on April 26, fear surely rippled through the pork industry. Led by Iowa, North Carolina, and Minnesota, annual US pig production exceeded 110 million animals in 2014, with the total national swine herd that year valued at $9.5 billion. In 2018, the industry is forecast to produce even more pigs—an estimated 134 million. The vast majority of those animals will be raised in CAFOs (confined animal feeding operations), which generate huge quantities of concentrated manure waste. In North Carolina alone, hog and poultry CAFOs produce 15,000 Olympic-size pools’ worth of waste each year.

In that state, there’s a long line of angry CAFO neighbors awaiting their chance to demand justice for the harm these operations cause. More than 500 plaintiffs have filed 26 lawsuits alleging damage from Murphy-Brown’s operations. The company’s practice of holding liquified manure in open pits and spraying the excess on nearby fields, common in the CAFO industry for decades, leaves a reeking stench over nearby communities. Residents, most of them working class and black, complain of health problems—which researchers have shown can include nausea and respiratory problems such as asthma—along with reductions in property values and quality of life from the CAFOs that built up around them. If juries in the other North Carolina cases (and in CAFO lawsuits elsewhere, like one filed this week by Iowa residents against that state) decide in favor of plaintiffs, it could be a watershed moment for environmental justice—and may force the industry to change.

In the weeks since the North Carolina jury’s bombshell announcement, the judge in the case has bowed to a state law that caps punitive damage awards, reducing the $50 million award to a mere $3.25 million. Still not exactly small potatoes, but the reduction must have prompted sighs of relief from the board rooms of Murphy-Brown, parent company Smithfield Foods, and WH Group, the Chinese company that owns Smithfield and is the world’s largest pork company.

And there’s more for giant pork companies to smile about. In addition to state laws that have long enabled the pork industry to operate profitably at the expense of its neighbors and continue to protect it from major consequences, Big Pork appears to have the Trump administration on its side.

Perdue backs Big Pork over farmers…

Two regulatory actions initiated by the USDA in its first year under Secretary Perdue show how it has favored the big corporations that process and sell US pork at the expense of small farmers and workers in the industry. First, last fall the department announced it would withdraw the Farmer Fair Practices Rules, Obama-era rules that would have made it easier for livestock and poultry farmers to sue meat processing companies with which they have contracts and would have protected farmers from unfair and predatory corporate practices. In response, a group of farmer plaintiffs and the Organization for Competitive Markets filed suit in December, calling the rules’ cancellation arbitrary and capricious, a gift to the industry, and a failure to protect small farmers.

Speaking to reporters as part of a farm tour in Ohio last month, Perdue suggested farmers are on their own:

There are farmers there, some of which will not survive because other people do it better. That’s the American capitalistic society. The best producers thrive and provide, and the others find another industry where they can thrive.

That’s a startling statement from a guy who claims to serve the interest of farmers—Perdue calls them the USDA’s “customers,” and they still largely support the Trump administration (though their support is slipping).

…and workers and food safety, too

In a related action, the USDA in January proposed a rule it claims will “modernize” swine slaughter. In fact, the rule would reduce the number of trained government food inspectors in pork processing plants and allow plants to operate at higher speeds (something the administration has also tried in poultry plants). These changes would likely increase rates of worker injury and incidents of meat contamination, and the proposed rule faces broad opposition from food safety, labor, and animal welfare groups. More than 83,500 people wrote to the USDA about it during a public comment period that closed May 2, and dozens of members of Congress have also entered the fray. In a letter to Secretary Perdue, 63 members of the House of Representatives (including several from leading pork states) cited the danger the hog slaughter rule poses to workers and urged the secretary to withdraw it.

As various lawsuits wind their way through the courts and the swine slaughter rule proceeds through the regulatory process, we’ll see whether Secretary Perdue’s USDA backs down or continues to back Big Pork. Meanwhile, the perception of the Trump administration’s coziness with the industry is peaking in a weird way: in the online video game Bacon Defender, players navigate an animated high speed pork plant—complete with falling poop emojis and oddly Trump-like voice effects—armed only with a mustard-shooting hot dog. “Even a novice Bacon Defender player quickly learns that at higher speeds feces can contaminate your food more easily,” say the game’s creators.

I wish I had a sad poop emoji for that.

 

High Energy Arc Faults and the Nuclear Plant Fire Protection IOU

UCS Blog - All Things Nuclear (text only) -

Last year, we posted a commentary and an update about a high energy arc fault (HEAF) event that occurred at the Turkey Point nuclear plant in Florida. The update included color photographs obtained from the Nuclear Regulatory Commission (NRC) via a Freedom of Information Act request showing the damage wrought by the explosion and ensuing fire. Neither the HEAF event nor its extensive damage surprised the NRC—they had been researching this fire hazard for several years. But while the NRC has long known about this fire hazard, its resolution remains unknown. Meanwhile, Americans are protected from this hazard by an IOU. The sooner this IOU is closed out, the sooner Americans in jeopardy will be really and truly protected.

What is a HEAF?

The Nuclear Energy Agency (NEA), which has coordinated international HEAF research efforts for several years, defines HEAF this way: “An arc is a very intense abnormal discharge of electrons between two electrodes that are carrying an electrical current. Since arcing is not usually a desirable occurrence, it is described as an arcing fault.”

Nuclear power plants generate electricity and use electricity to power in-plant equipment. The electricity flows through cables or metal bars, called buses. An arc occurs when electricity jumps off the intended pathway to a nearby metal cabinet or tray.

Electricity is provided at different voltages or energy levels for different needs. Home and office receptacles provide 120-volt current. Nuclear power plants commonly have higher voltage electrical circuits carrying 480-volt, 4,160-volt, and 6,900-volt current for motors of different sizes. And while main generators at nuclear plants typically produce electricity at 22,000 volts, onsite transformers step up the voltage to 345,000 volts or higher for more efficient flow along the transmission lines of the offsite power grid.

How is the Risk from HEAF Events Managed?

Consistent with the overall defense-in-depth approach to nuclear safety, HEAF events are managed by measures intended to prevent their occurrence backed by additional measures intended to minimize consequences should they occur.

Preventative measures include restrictions on handling of electrical cables during installation. Limits on how much cables can be bent and twisted, and on forces applied when cables are pulled through wall penetrations seek to keep cable insulation intact as a barrier against arcs. Other preventative measures seek to limit the duration of the arc through detection of the fault and automatic opening of a breaker to stop the flow of electrical current through the cables (essentially turning the arc off).

Mitigative measures include establishing zones of influence (ZOI) around energized equipment that control the amount of damage resulting from a HEAF event. Figure 1 illustrates this concept using an electrical cabinet as the example. Electrical cabinets are metal boxes containing breakers, relays, and other electrical control devices. Current fire protection regulatory requirements impose a 3-foot ZOI around electrical cabinets and an 18-inch ZOI above them. Anything within the cabinet and associated ZOI is assumed to be damaged by the energy released during a HEAF event. Sufficient equipment must be located outside the affected cabinet and its ZOI to survive the event and adequately cool the reactor core to prevent meltdown.

Fig. 1 (Source: Nuclear Regulatory Commission)
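To make that boundary concrete, here is a minimal sketch, in Python, of the ZOI assumption as described above. It is not NRC software; the function and variable names, and the simplified geometry, are my own illustrative choices.

    # Minimal sketch of the zone-of-influence (ZOI) assumption described above:
    # anything within 3 feet of an electrical cabinet, or within 18 inches
    # directly above it, is assumed damaged by a HEAF event.
    # Names and geometry are illustrative assumptions, not NRC code.

    HORIZONTAL_ZOI_FT = 3.0   # assumed 3-foot ZOI around the cabinet
    VERTICAL_ZOI_FT = 1.5     # assumed 18-inch ZOI above the cabinet

    def assumed_damaged(horizontal_distance_ft: float,
                        height_above_cabinet_ft: float) -> bool:
        """Return True if a nearby component is written off in a HEAF event."""
        if height_above_cabinet_ft > 0:
            return (height_above_cabinet_ft <= VERTICAL_ZOI_FT
                    and horizontal_distance_ft <= HORIZONTAL_ZOI_FT)
        return horizontal_distance_ft <= HORIZONTAL_ZOI_FT

    # A relay rack 2 feet away at cabinet height is written off; one 5 feet
    # away is credited with surviving and helping cool the reactor core.
    print(assumed_damaged(2.0, 0.0))  # True
    print(assumed_damaged(5.0, 0.0))  # False

The sketch only expresses the fixed geometric boundary drawn by current requirements; as the events and tests discussed below show, real HEAF events can reach well past it.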

Even with these preventative and mitigative measures, NEA recognized the hazard that HEAF events pose when it wrote in a May 2017 report: “The electrical disturbance initiating the HEAF often causes loss of essential electrical power and the physical damage and products of combustion provide significant challenges to the operators and fire brigade members handling the emergency. It is clear that HEAFs present one of the most risk significant and challenging fire scenarios that a [nuclear power plant] will face.”

What is the Problem with HEAF Risk Management?

Actual HEAF events have shown that the preventative and mitigative measures intended to manage the hazard have shortcomings and weaknesses. For example, arcs have sometimes remained energized far longer than assumed, enabling the errant electricity to wreak more havoc.

Additionally, HEAF events have damaged components far outside the assumed zones of influence, such as in the Turkey Point event from March 2017. In other words, the HEAF hazard is larger than its defenses.

How is the HEAF Risk Management Problem Being Resolved?

On March 11, 2011, an earthquake offshore of Japan and the tsunami it spawned led to the meltdown of three reactors at the Fukushima Daiichi nuclear plant. That earthquake also caused a HEAF event at the Onagawa nuclear plant in Japan. The ground motion from the earthquake prevented an electrical circuit breaker from opening to limit the duration of the arc. The HEAF event damaged equipment and started a fire (Fig. 2). Because the fire brigade could not enter the room due to heat and smoke, the fire blazed for seven hours until it had consumed all available fuel. As an NRC fire protection engineer commented in April 2018, “If Fukushima wasn’t occurring, this is probably what would have been in the news headlines.” Onagawa was bad. Fukushima was just worse.

Fig. 2 (Source: Nuclear Regulatory Commission)

Research initiated in Japan following the Onagawa HEAF event sought to define the factors affecting the severity of the events. Because the problem was not confined to nuclear power plants in Japan, other countries collaborated with the Japanese researchers in pursuit of a better understanding of, and better protection for, HEAF events.

The NRC participated in a series of 26 tests conducted between 2014 and 2016 using different types of electrical panels, bus bar materials, arc durations, electrical current voltages, and other factors. The results from the tests enabled the NRC to take two steps.

First, the NRC entered HEAF events into the agency’s generic issues program in August 2017. In a related second step, the NRC formally made the owners of all operating US nuclear power plants aware of this testing program and its results via an information notice also issued in August 2017. The NRC has additionally shared its HEAF information with plant owners during the past three Regulatory Information Conferences and several other public meetings and workshops.

The NRC plans a second series of tests to more fully define the conditions that contribute to the severity of HEAF events.

How Are HEAF Events Tested?

Test 23 during the Phase I program subjected a 480-volt electrical cabinet with aluminum bus material to an arc lasting 7.196 seconds. Figure 3 shows the electrical cabinet with its panel doors opened prior to the test. A pointer on the left side of the picture shows the location where the arc was intentionally caused.

Fig. 3 (Source: Nuclear Energy Agency)

To induce an arc for the test, a wire was wrapped around all three phases of the 480-volt alternating current connectors within one of the cabinet’s panels as shown in Figure 4. On the right edge of the picture is a handswitch used to connect or disconnect electrical power flowing into the cabinet via these buses to in-plant electrical loads.

Fig. 4 (Source: Nuclear Energy Agency)

Instrumentation racks and cameras were positioned around the cabinet being tested. The racks included instruments measuring the temperature and pressure radiating from the cabinet during the HEAF event. High-speed, high definition cameras recorded the progression of the event while infrared cameras captured its thermal signature. A ventilation hood positioned over the cabinet connected to a duct with an exhaust fan conducted smoke away from the area to help the cameras see what was happening. More importantly, the ventilation duct contained instruments measuring the heat energy and byproducts released during the event.

Fig. 5 (Source: Nuclear Regulatory Commission)

What Are the HEAF Test Results?

For a DVD containing reports on the HEAF testing conducted between 2014 and 2016 as well as videos from the 26 tests conducted during that period, send an email with your name and address to RES_DRA_FRB@nrc.gov. Much of the information in this commentary comes from materials on the DVD the NRC mailed me in response to my request.

Test 4 in the Phase I Program subjected a 480-volt electrical cabinet with aluminum bus material to an arc lasting only 0.009 seconds (i.e., 9 milliseconds). The short-duration arc had minimal consequences, entirely missed if one blinks at the wrong time while watching the video. This HEAF event did not damage components within the electrical cabinet, let alone any components outside the 3-foot zone of influence around it.

Test 3 in the Phase I Program subjected a 480-volt electrical cabinet with copper bus material to an arc lasting 8.138 seconds. The longer duration arc produced greater consequences than in Test 4. But the video shows that the consequences are largely confined to the cabinet and zone of influence.

Test 23 in the Phase I Program subjected a 480-volt electrical cabinet with aluminum bus material to an arc lasting 7.196 seconds. The voltage level and arc duration for Test 23 were essentially identical to that for Test 3, but the consequences were significantly different. Aluminum behaved differently than copper during the HEAF event, essentially fueling the explosion and ensuing fire. As a result, the damage within the cabinet, zone of influence, and even beyond the 3-foot zone of influence was much greater. For example, some of the instruments on the rack positioned just outside the 3-foot zone of influence were vaporized.

Until debris from the event obscured the lens of a camera positioned many feet outside the 3-foot zone of influence, a side view of the Test 23 HEAF event showed it was a bigger and badder event than the HEAF event in Test 3 and the HEAF event in Test 4.

Figure 6 shows the electrical cabinet with its panel doors open after Test 23. The cabinet clearly looks different from its pre-test appearance (see Figure 4). But this view does not tell the entire story.

Fig. 6 (Source: Nuclear Energy Agency)

Figure 7 shows the left side of the electrical cabinet after Test 23. The rear upper left corner of the cabinet is missing. It was burned and/or blown away by the HEAF event. The cabinet is made of metal, not wood, plastic, or ice. The missing cabinet corner is compelling testimony to the energy released during HEAF events.

Fig. 7 (Source: Nuclear Energy Agency)

Tests 3, 4 and 23 all featured electrical cabinets supplied with 480-volt power.

Tests 4 and 23 each featured aluminum bus material. Test 4 had negligible consequences while Test 23 had significant consequences, attesting to the role played by arc duration. The arc lasted 0.009 seconds in Test 4 while it lasted 7.196 seconds in Test 23.

Tests 3 and 23 featured arcs lasting approximately 8 seconds. Test 23 caused substantially greater damage within the electrical cabinet and beyond the 3-foot zone of influence due to the presence of aluminum rather than copper materials.

How Vulnerable Are US Nuclear Plants to HEAF Events?

The Phase I series of tests revealed that HEAF events depend on the voltage level, the conducting material (i.e., copper, iron, or aluminum), and the arc duration. The higher the voltage, the greater the amount of aluminum, and the longer the arc duration, the greater the consequences from HEAF events.

The NRC received results in 2017 from an industry survey of US nuclear plants. The survey showed that the plants have electrical circuits with voltage levels of 480, 4160, 6900, and 22000 volts. The survey also showed that while some plants did not have electrical circuits with components plated with aluminum, many did.

As to arc durations, actual HEAF events at US plants have involved arcs lasting longer than the roughly 8 seconds used in Tests 3 and 23. The May 2000 event at Diablo Canyon lasted 11 seconds. The March 2010 event at HB Robinson lasted 8 to 12 seconds. And the June 2011 event at Fort Calhoun lasted 42 seconds and likely would have lasted even longer had operators not intervened by manually opening an electrical breaker to end the event.

So, many US nuclear plants have all the ingredients necessary for really nasty HEAF events.

What Might the Fixes Entail?

The testing program results to date suggest a tiered approach to resolving the HEAF hazard. Once the key factors (i.e., combinations of voltage levels, materials, and arc durations) are definitively established, they can be used to screen out configurations within the plant where a HEAF event cannot compromise safety margins. For example, a high voltage electrical cabinet with aluminum bus material and suspect arc duration limiters might need no remedies if it is located sufficiently far away from safety components that its HEAF vaporization carries only economic rather than safety implications. Similarly, configurations with voltage levels and materials that remain bounded by current assumptions like the 3-foot zone of influence would require no remedies.

When a configuration cannot be screened out, the remedy might vary. In some cases, it might involve providing more reliable, quick-acting fault detection and isolation systems that limit the duration of the arc. In other cases, replacing aluminum buses with copper or iron buses might be a suitable remedy. And the fix might be simply installing a protective wall between an electrical cabinet and safety equipment.
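To make the tiered logic above concrete, here is a minimal, purely illustrative Python sketch of how such a screen might be organized. The screening factors (voltage, bus material, arc duration, distance to safety equipment) come from the discussion above, but every threshold, name, and remedy category in the code is a hypothetical placeholder rather than an NRC or NEA value.

    # Illustrative sketch of the tiered screening approach described above.
    # All thresholds below are hypothetical placeholders.

    def screen_configuration(voltage_v: int, bus_material: str,
                             max_arc_duration_s: float,
                             distance_to_safety_equipment_ft: float) -> str:
        """Return a rough disposition for one cabinet configuration."""
        # Step 1: configurations bounded by current assumptions need no remedy
        # (e.g., low-voltage copper buses, which stayed within the ZOI in Test 3).
        if voltage_v <= 480 and bus_material == "copper":  # placeholder screen
            return "screened out: bounded by the existing 3-foot ZOI assumption"

        # Step 2: even a severe HEAF matters for safety only if safety
        # equipment sits close enough to be damaged.
        if distance_to_safety_equipment_ft > 30.0:  # placeholder distance
            return "screened out: economic rather than safety implications"

        # Step 3: otherwise pick a remedy matched to the driving factor.
        if max_arc_duration_s > 2.0:  # placeholder duration limit
            return "remedy: faster fault detection and breaker isolation"
        if bus_material == "aluminum":
            return "remedy: replace aluminum bus, or add a protective barrier"
        return "remedy: install a protective wall near the safety equipment"

    # Example: a 4,160-volt cabinet with aluminum buses, an 8-second worst-case
    # arc, and safety equipment 12 feet away cannot be screened out.
    print(screen_configuration(4160, "aluminum", 8.0, 12.0))

Any real screening criteria would have to come out of the completed testing program, which is precisely why the further testing described next matters.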

Further HEAF testing will expand knowledge of the problem, thus more fully informing the decisions about effective solutions.

UCS Perspective

It has been known for many years now that HEAF events could cause wider damage than currently assumed in designing and applying fire protection measures. As a result, a fire could damage primary safety systems and their backups—the very outcome the fire protection regulatory requirements are intended to prevent.

This is normally the time and spot where I chastise the NRC for dragging its feet in resolving this known safety hazard. But while years have passed since the HEAF hazard flag was first raised, the NRC’s feet have been busy. Although it was known that HEAF events could cause greater damage than previously realized, it was not known what factors played what roles in determining the severity of HEAF events and the damage they inflict. The NRC joined regulatory counterparts worldwide in efforts designed to fill in these information gaps. That knowledge was vitally needed to ensure that a real fix, rather than an ineffective band-aid, was applied.

That research took time to plan and conduct. And further research is needed to fully define the problem to find its proper solution. In the meantime, the NRC has been very forthcoming with plant owners and the public about its concerns and associated learnings to date.

While the NRC’s efforts to better understand HEAF events may be justified, it’s worth remembering that the agency’s intentions and plans are little more than IOUs to the millions of Americans living close to vulnerable nuclear plants. IOUs provide zero protection. The NRC needs to wrap up its studies ASAP and turn the IOUs into genuine protection.

Bipartisan Outrage as EPA, White House Try to Cover Up Chemical Health Assessment

UCS Blog - The Equation (text only) -

Photo: US Air Force/Senior Airman Julianne Showalter

Citing a potential “public relations nightmare,” the Trump administration successfully stopped the publication of a study measuring the health effects of a group of hazardous chemicals found in drinking water and household products throughout the United States. Many of the contaminated sites are on military bases across the country and affect military families directly. Multiple Republicans and Democrats have expressed concern about the censorship and have called for the report to be released, and Trump administration officials are scrambling to contain the political fallout. 

The two email chains (here and here) show exchanges among the White House, the Environmental Protection Agency (EPA), and the Department of Defense (DoD) as they attempted to strong-arm the Agency for Toxic Substances and Disease Registry (ATSDR) into censoring the report. The emails were released to UCS by the EPA as part of a larger request under the Freedom of Information Act for documents related to an attempt to restrict the types of science that are used in EPA public health protection decisions (the EPA subsequently tried to bury the documents).

The White House tried to cover up a study related to the health impacts of PFAS, a group of chemicals that are often present at dangerous levels around military bases. Firefighting foam used by the military contains PFAS chemicals. Photo: United States National Guard

Politico broke the story on Monday:

Scott Pruitt’s EPA and the White House sought to block publication of a federal health study on a nationwide water-contamination crisis, after one Trump administration aide warned it would cause a “public relations nightmare,” newly disclosed emails reveal.

The intervention early this year — not previously disclosed — came as HHS’ Agency for Toxic Substances and Disease Registry was preparing to publish its assessment of a class of toxic chemicals that has contaminated water supplies near military bases, chemical plants and other sites from New York to Michigan to West Virginia.

The study would show that the chemicals endanger human health at a far lower level than EPA has previously called safe, according to the emails.

Nancy Beck, one of the EPA political appointees with ties to the chemical industry involved in the effort to prevent the study from being released, knows very well how one agency can put pressure on another. She helped the Department of Defense slow down EPA efforts to protect drinking water from perchlorate, an ingredient in rocket fuel, when she worked in the White House under President George W. Bush.

Both Republicans and Democrats have expressed concern about the cover-up and demanded the ATSDR report be released, including Senator Maggie Hassan (D-NH), Representative Mike Turner (R-OH), Representative Brian Fitzpatrick (R-PA), and several Democratic senators including Senate Minority Leader Chuck Schumer (D-NY).

West Virginia Republican Shelley Moore Capito questioned embattled EPA Administrator Scott Pruitt in a Senate hearing today about the EPA’s actions. Administrator Pruitt refused to take responsibility for slowing down the release of the study, but acknowledged that it is important for this kind of health information to be public. West Virginia has had specific problems with PFAS contamination.

This kind of congressional oversight of the administration is crucial as part of our system of government, the checks and balances the founding fathers talked about.  Executive branch actions have direct consequences for public health and the environment. We desperately need more congressional scrutiny of the ways in which science is being suppressed and sidelined in executive branch agencies.

And at least in this case, the pressure is working. According to Inside EPA (paywalled), ATSDR has subsequently begun preparations for releasing the report. Below are more details about this developing story.

A Michigan Department of Environmental Quality employee visits a home to test well water for chemical contaminants. Photo: Michigan DEQ

What are these chemicals?

“PFAS” stands for “per- and polyfluoroalkyl substances.” “PFOS” and “PFOA,” the two most studied PFAS, stand for “perfluorooctane sulfonate” and “perfluorooctanoic acid,” respectively. PFAS are a group of man-made chemicals found in many consumer products (such as non-stick cookware and water-repellent clothing) as well as in firefighting foam used by the military. Studies on PFOA and PFOS have indicated links to cancer, thyroid disease, and immunological effects. Here’s the EPA’s current FAQ on PFAS.

What are more specific health effects?

According to ATSDR, studies have shown certain PFAS may impact fertility; increase cholesterol; elevate cancer risk; interfere with the body’s natural hormones; and negatively affect the growth, learning, and behavior of infants and older children.

What is the current EPA guidance on the issue?

In May 2016, EPA established drinking water health advisories of 70 parts per trillion for the combined concentrations of PFOS and PFOA. This number is important because in “PFAS CDC Study 2,” an employee of the White House Office of Management and Budget was worried about the fact that ATSDR’s numbers for minimal risk for some populations went as low as 12 ppt. For more, see EPA’s factsheet on PFAS.

What’s the DoD connection?

The Department of Defense emerges in many PFAS water source contamination stories because DoD’s firefighting foam contains PFOS and PFOA. The Politico story notes that in a March report to Congress, the Defense Department listed 126 facilities where tests of nearby water supplies showed the substances exceeded the current safety guidelines. These facilities have caused congressional concern, and the Government Accountability Office has studied the issue.

How has the EPA approached PFAS?

Administrator Pruitt has publicly said that he wants to make controlling PFAS a priority and has planned a leadership summit on the issue next week. The summit was planned after the Senate refused to confirm Michael Dourson, President Trump’s nominee to lead EPA’s chemical safety division. North Carolina’s two Republican senators refused to support him for PFAS-related reasons; Dourson’s previous work for the chemical industry recommended dramatically higher “safe” levels of the chemicals than the EPA had found (more here and here).

Mick Mulvaney leads the White House Office of Management and Budget (OMB). OMB has a history of interfering in or slowing down federal agency scientific assessments.

What do the two emails show?

In mid-January, an email chain with EPA political and career employees discussed a call between EPA and the Agency for Toxic Substances and Disease Registry (ATSDR) about PFAS. Both the political and career employees noted that EPA and ATSDR did not entirely agree on the science.

In a January 30 internal email chain, an unnamed White House political appointee flagged for an EPA political appointee that ATSDR’s draft Toxicological Profile for four PFAS (PFOS, PFOA, PFHX, and PFNA) had very low Minimal Risk Level numbers. The OMB employee noted that ATSDR’s release of its draft would have a “huge” response, that the impact to EPA and the Department of Defense would be “extremely painful,” and that releasing the draft would be a “potential public relations nightmare.”

The OMB message was forwarded to three EPA political appointees: chief of staff Ryan Jackson, Assistant Administrator for the Office of Research and Development Richard Yamada; and Nancy Beck. Jackson noted that the ATSDR estimate is 10 times lower than the EPA’s numbers; Beck recommended OMB interagency review; Yamada noted that ORD was going to DoD to discuss. More than three months later, ATSDR still has not released its draft Toxicological Profile, and the agency initially said there are no plans to release it.

How should legitimate scientific disagreements between EPA and ATSDR scientists be handled?

Scientists may or may not agree with the ATSDR analysis. But there’s no way to critique a peer-reviewed study that isn’t public. Further, any legitimate disagreements should be handled among scientists, not negotiated among political appointees.

The White House Office of Management and Budget (OMB) has a role to play in ensuring that agencies talk to one another. But it has also been used to try to alter science for political reasons. UCS has recommended that peer-reviewed scientific documents be shared publicly when sent to OMB for interagency review. The PFAS case is evidence for why this kind of policy is sorely needed.


Building the Right Project: An Engineer’s Perspective on Infrastructure Adaptation to Extreme Weather Events

UCS Blog - The Equation (text only) -

The view from an aerial tour of Hurricane Sandy damage to New Jersey's barrier beaches, Nov. 18, 2012. (Official White House Photo by Sonya N. Hebert)

Infrastructure Week 2018 is upon us, and it’s important that we highlight the state of our nation’s infrastructure and why it’s critical to our economy, society, security, and future. So what is the status of our infrastructure?

The National Infrastructure Report Card is issued by the American Society of Civil Engineers (ASCE) every four years. The Report Card offers a comprehensive assessment of our nation’s 16 major infrastructure categories, providing information on their conditions and needs, assigning grades, and making recommendations to raise those grades. Although the first Report Card was issued in 1998, not much has improved since then: ASCE has yet to give a grade out of the “D” range, and in 2017, America’s infrastructure earned a “D+”.

The work to change the depressing state of our infrastructure is daunting, but I try to stay calm and embrace my unique privilege as a professional engineer to be involved in the many facets of infrastructure and in how we need to better plan, design, build, operate, and maintain these critical projects and systems. As an environmental engineer who also has science and policy backgrounds, I am involved in all facets of the infrastructure lifecycle: planning, design, construction, operations, and financing. I also work at the third-largest transit agency in the United States and am responsible for the environmental, sustainability, and resiliency efforts associated with the agency’s infrastructure.

Climate change presents engineering challenges

Let’s face it, our infrastructure is crumbling and significant investments are needed to improve our grade. Regardless of your politics, we see that deterioration compounded by the increasing frequency and intensity of extreme weather events: rising ambient temperatures, more frequent high heat days, more extreme flooding and inundation, more intense storms, and longer droughts and heat waves. These conditions are now more common; forget about impacts across the world, you need not look beyond your own neighborhood. While infrastructure is traditionally designed to hold up to rare but expected extreme weather events, these events are no longer rare, and their durations and intensities are now well beyond normal expectations.

As an engineer, I am faced with a new set of design challenges that force me to rethink how infrastructure should be planned, designed, constructed, operated, and maintained for conditions that are changing substantially and in unpredictable ways. As a scientist, I am struggling to define what information I should use to ensure the infrastructure we build remains useful throughout its expected life and keeps people safe while enhancing their quality of life. And as a person and global citizen – of Filipino ancestry and having visited many parts of the world – I am humbled and amazed to see how those who have the least are able to survive the harshest environments and economic conditions. We need not go far: many who live in the poorest of our American neighborhoods have come to adapt to similar conditions and choose to survive after hurricanes, wildfires, floods, and droughts. Many of us involved in this conversation about infrastructure have differing life experiences and perceptions.

A multi-disciplinary approach to infrastructure planning

The facts about our deteriorating infrastructure and a future with more weather extremes should make us think very hard about how we as a society will continue to maintain our livelihoods and well-being. A unifying philosophy that brings us all to the table should be the realization that maintenance of the built infrastructure has long been a neglected element of society; consequently, the cost of less (or even no) action has never been greater, and the need to address this compounded problem is urgent. We should look more closely at how resilient communities do it and learn from them. To design for a resilient future that can handle more extremes, we must upend some engineering paradigms and approach solutions in inclusive, collaborative, multi-disciplinary, and multi-sectoral ways.

As the executive who oversees the implementation of environmental compliance and sustainability at a major public transportation agency, I am immersed in a transportation revolution here in Los Angeles. This revolution goes beyond pure transportation projects, but involves all the things that the transportation system touches or connects. And infrastructure, transportation in particular, touches everything – energy, water, mobility, housing – and is affected by all types of extreme events – heat, droughts, floods, wildfires, and sea level rise.  Because engineers can also be systems thinkers, I get pulled into a variety of situations where not only environmental issues need to be resolved, but other topics are common fare: policy deliberations, energy resiliency, climate change impacts, alternative financing, social equity, fresh food access, electrification, and of course engineering and science, among others. This multi-faceted approach, which also requires people skills, understanding of human behavior and finding common ground, is fundamental to advancing infrastructure solutions that will function under a future with more extremes.

Promising Developments to Integrate Climate Science into Infrastructure Standards

The Climate-Safe Infrastructure Working Group, convened under the California Natural Resources Agency, is a pioneering effort to foster this needed cross-discipline dialogue by bringing together climate scientists, engineers, architects, and other professionals to discuss how to incorporate climate change impacts into infrastructure. I was appointed as a member of this Working Group, and with my fellow members I have been deliberating how to integrate scientific data on projected climate change impacts into state infrastructure engineering, and how to develop specific recommendations for the California Legislature and the Strategic Growth Council later in 2018.

Our Working Group’s task gets to the core of making a major overhaul in the way infrastructure projects are planned, designed, constructed, and operated. We are grappling with questions of risk and liability, governance, equity, means and methods of construction, and, most importantly, the gaps in translating science into practice. How does this information get incorporated into the standards and practices of civil engineering and architecture? How can the workforce, trained to plan, design, build, operate, and maintain infrastructure in a certain way, strategically transition to incorporate modifications that account for new and changing environmental conditions, as well as integrate natural infrastructure solutions? How can financial instruments be used to minimize infrastructure risk to public health, safety, and well-being? The experience has reminded me of the community meetings I often lead or attend: seeking input, debating the solutions, and, in the end, gaining consensus on what best makes infrastructure serve all stakeholders while simultaneously promoting social cohesion and economic development. I anticipate this to be the flavor of our final report.

The disconnect between disciplines and a lack of an integrated approach across jurisdictions is recognized as a problem by many in the engineering and infrastructure fields. Agencies like mine, the Los Angeles County Metropolitan Transportation Authority (LA Metro), have navigated through this dilemma by incorporating into our design criteria and specification the requirements to build climate-safe infrastructure based on information we know now.

In addition, the American Society of Civil Engineers has been advancing new approaches to integrate climate science into infrastructure. Under Canon 1 of ASCE’s Code of Ethics, engineers have the obligation to hold “safety, health, and welfare paramount and shall strive to comply with the principles of sustainable development in the performance of their professional duties”. ASCE Policy Statements 360 and 418, about climate impacts and the role of civil engineers in sustainability, are key drivers to the execution of this obligation. ASCE’s National Committee on Sustainability (COS) is working on the continued development of a Sustainable Infrastructure Standard as well as an ASCE Policy for infrastructure owners to recognize the value and importance of building sustainable infrastructure. The COS is working hand in hand with the ASCE Committee on Adaptation to a Changing Climate in ensuring infrastructure resiliency and sustainable infrastructure principles and frameworks align with one another.

Investors are listening as well. Here at LA Metro, we are using the revenues generated from the sale of our low carbon fuel standard carbon credits to invest exclusively in strategies that reduce carbon emissions, energy conservation and resiliency, renewable energy, and similar projects. We tendered about $500 million in Climate Bond certified Green Bonds in October 2017 for others to invest in our transportation-related projects. Being invited to participate in two important symposia on how to finance and make the business case for sustainable infrastructure, in California in February 2018 and in Massachusetts in March 2018, has inspired me to do more with the financial community to advance climate resilience.

Advancing a new engineering paradigm

There is no better time to act than now.

Whether we like it or not, infrastructure plays a major role in ensuring that we as a species survive the increasing negative impacts of extreme weather events. But the cost of ignoring infrastructure investments is mounting. More importantly, we need to reassess how we continually value the benefits of a well maintained infrastructure. We need to build the right project as much as we need to build the project right. While I see advances on many fronts, we need more engineers and our partners to step up and take a leadership role in advancing this new paradigm.

Finally, many of these discussions have concentrated on the consideration of the most vulnerable populations or those who are “not in the room” with the professionals. This is not about working to relieve these communities of their burdens but instead all about how we learn from them. With all the tools we have at our disposal, we need to re-think and reassess such tools in the context of how the most vulnerable of communities survive significant stressors. Let’s step-up our active engagement with them for that very reason.

 

Dr. Cris B. Liban, P.E., ENV SP is a professional engineer and a Fellow of the American Society of Civil Engineers, with a focus on environment and transportation. He has chaired or participated in multiple research panels through his involvement with the National Academies of Sciences’ Transportation Research Board, translating research into policy through his work as an environmental executive and political appointee in federal, state, county, and city governments, currently as the Executive Officer, Environmental Compliance and Sustainability at the Los Angeles County Metropolitan Transportation Authority. He is also the Chair of the National Committee on Sustainability of the American Society of Civil Engineers. More on Dr. Liban’s work here.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Among President Trump’s Dismal Judicial Nominees, Wendy Vitter Stands Out for Promoting Unscientific Myths

UCS Blog - The Equation (text only) -

Wendy Vitter has been nominated by President Trump for a lifetime appointment to the U.S. District Court in Louisiana, and is expected to receive a vote in a Senate committee tomorrow. Vitter has a track record of promoting anti-science myths that calls into question her capacity to impartially evaluate evidence and expert testimony as a future judge. Senators should think long and hard about whether they want someone with this kind of judgment on the bench.

Vitter is on record perpetuating the myth that abortion causes breast cancer. When speaking on a panel called Abortion Hurts Women’s Health, Vitter claimed that there is a “connection between cancer and post-abortive women.” The American Cancer Society has rigorously assessed this claim and dismissed it as false. So have the World Health Organization and numerous other medical associations.

Wendy Vitter answers questions at her April 2018 confirmation hearing. Vitter, who received the lowest “qualified” rating from the American Bar Association, is President Trump’s nominee for a lifetime judicial appointment.

At the panel, Vitter publicly urged people to pressure medical providers to distribute a brochure titled “The Pill Kills.” One of the most pernicious and unscientific myths included is the statement that hormonal birth control causes “spontaneous abortions” (the pill actually prevents eggs from ever being fertilized in the first place). The same literature claims that birth control causes miscarriages, makes women “more likely to develop lethal infections” and “die a violent death.” Through these false claims, Vitter confuses people about the science of birth control and makes it harder for them to access much-needed health services.

How could it be that such a nominee would not be laughed out of the Senate chamber? Well, let’s not forget that it’s considerably easier now to pack the courts with unqualified nominees since the elimination of the filibuster for judicial nominees. Further, the Trump administration does not have a great track record of vetting judicial nominees, and the Senate’s willingness to set aside unqualified nominees has been non-existent: not a single Republican senator has voted against a single nominee.

This allowed President Trump to appoint four times as many judges in his first year as President Obama did in his. The American Bar Association gave Vitter’s nomination its lowest qualified rating.

Multiple scientific and public interest organizations urged the Senate to vote against Vitter’s nomination:

Governmental policy and decision-making should be informed by scientific evidence and the best available data. When hearing cases involving governmental policies or actions, judges must be able to evaluate evidence about harms and benefits in an independent and careful manner by evaluating the weight of the evidence. Failing to consider relevant, compelling evidence and placing inappropriate weight on poorly supported assertions should disqualify nominees from judicial appointments…

To merit confirmation, judges must exhibit an ability to appropriately weigh and contextualize scientific evidence when matters involving science are before them. Vitter’s misrepresentations of scientific evidence call into question her ability to do so appropriately. 

Judges need to be able to evaluate expert testimony and scientific evidence in an impartial way. How can we trust Vitter to appropriately evaluate evidence and expertise in a courtroom when she refuses to disavow the distribution of materials that distort the science on women’s health?

California Could Pass Innovative Legislation on Key Climate, Energy and Transportation Issues

UCS Blog - The Equation (text only) -

California State Capitol

California has a well-earned reputation as a world leader in promoting clean energy and other solutions to climate change. However, as anyone paying attention to the climate crisis knows, we have far more work to do. Fortunately, the California Legislature is considering many bills in 2018 that would further address climate change. With three and a half months until the Legislature adjourns for the year, UCS is working with lawmakers to make progress on a suite of policy prescriptions to promote renewable energy, clean transportation, and better preparedness for climate change impacts.

Create a clean electricity system

California has made great progress adding renewable energy to the grid. To meet our climate goals, we must continue our clean energy momentum and work to reduce reliance on natural gas power plants. This year UCS is working to:

  • Establish a goal of 100 percent clean energy. Achieving 100 percent clean energy is an ambitious goal we must reach for to create a cleaner and healthier future, and to continue California’s tremendous momentum advancing clean energy.
  • Establish standards for California electricity providers to join a western regional electricity grid. UCS is working to help pass AB 813 (Holden) to prepare the ground for a regional grid that would make it easier and more cost-effective to integrate renewable energy by sharing electricity generation across a larger area.
  • Reduce reliance on natural gas power plants. California needs to study the fleet of natural gas power plants to create a strategy to reduce the use of natural gas electricity generation in an orderly, cost-effective, and equitable manner. In addition, UCS is supporting work to limit the use of the dirtiest natural gas power plants at times and in locations with bad air quality.
Create a clean transportation system

For decades California has led the nation with policies to reduce pollution from vehicles and promote clean fuel and vehicle technologies. As our transportation system faces dramatic changes in coming years—electrification, car-sharing, automation—we must ensure these changes result in reduced emissions and other key objectives (such as safety and accessibility). In 2018, UCS is working to:

  • Pass a state budget that includes much-needed incentives for electric cars, trucks, and buses. Incentives for electric vehicles are critical to overcome the higher upfront costs that still exist and to increase consumer interest in this new technology. Each year lawmakers must appropriate funding for important incentive programs for light-duty and heavy-duty vehicles, and UCS is working to make sure adequate funding is appropriated for the year ahead.
  • Ensure autonomous vehicles (AVs) reduce pollution and congestion and enhance access to mobility. AVs may become the most significant innovation in transportation since the mass introduction of automobiles early last century. However, public policy needs to guide the safe introduction of this emerging technology for widespread adoption of AVs to result in positive outcomes in the years ahead. UCS supports SB 936 (Allen), which will create an expert task force to make recommendations to provide guidance for how we can shape this new transportation technology to achieve these public benefits.
  • Increase use of electric vehicles by ride-hailing services. Ride hailing services—like Uber and Lyft—are a rapidly growing part of our transportation system. As these services grow and carry more and more passengers, it will become increasingly important that they move toward vehicle electrification to reduce pollution—just as electrification is important for personal vehicle use and transit buses alike. SB 1014 (Skinner) looks to address this issue. While UCS supports the concept of this bill, there are important details that remain to be resolved.
Better prepare California for a changing climate

California is facing a “new normal” of increasing variability and extremes in climate conditions with enormous impacts on people, communities, and the infrastructure on which our safety and economies depend. We must start to plan, design and build California’s infrastructure to be “climate-smart” and withstand the new reality of climate change. This year UCS is working to:

  • Create a state adaptation center to support decision-making on state infrastructure projects. The state should establish an office within the state government to provide various state agencies with actionable climate-related information and real-time guidance on specific analytical approaches and data choices as they grapple with decisions about planning and designing infrastructure projects.

I look forward to working on these and other issues on behalf of UCS and our supporters and Science Network members. Hopefully the Legislature will pass legislation advancing many of these priorities this year, keeping California on a path to a safe and sustainable future that utilizes science as a foundation for policy-making.

UCS Joins Lawsuit to Stop Pruitt from Rolling Back Clean Car Standards

UCS Blog - The Equation (text only) -

UCS joined a coalition of non-profit organizations in filing a lawsuit to challenge EPA Administrator Scott Pruitt’s attempt to roll back a regulation designed to improve vehicle gas mileage, save you money, and tackle transportation-related emissions, the biggest source of climate change pollution in the U.S.

A brief history of the fuel efficiency standards

This suit opens a new chapter in an epic saga that is longer than any George R.R. Martin or Robert Jordan series. Was this saga TL;DF (too long, didn’t follow)? Here is a brief primer.

In 2009 automakers agreed to a federal standard that requires them to gradually raise the average mpg of their vehicles through 2025. But, two days after President Trump took office, automakers and their trade groups asked the White House to weaken the standard. The Trump Administration agreed, and subsequently relied on bogus, industry-funded science to determine that the standards need to be changed.

How, exactly, the standards will be changed is TBD, but even before we allow EPA to get to that stage, UCS and our allies are asking a panel of federal judges to review the EPA decision to reexamine the standard. If the court finds that EPA’s decision to overlook the reams of science-based evidence that supported the original standard was improper, then EPA will have to go back to the drawing board and the current standard will remain intact.

Is this all the automakers fault?

In a word, yes. But blame must also be placed on the Trump Administration, which has turned this program into such a boondoggle that automakers have begun to change their tune and claim that EPA isn’t doing what they asked for. Don’t feel too bad for the automakers, though. They led a bull into a china shop and are now upset that the bull is destroying too much china.

What happens if EPA wins

This isn’t the final chance to stop the fuel efficiency standards from being destroyed. Even if EPA wins this case and the similar suit filed by 17 states and the District of Columbia, EPA still needs to submit an additional rulemaking for public comment that details exactly what the standards will be out to 2025. EPA will receive tens – if not hundreds – of thousands of comments in support of maintaining a strong standard, though it is unlikely they will listen to any of them. So, a weak rule will probably get finalized, which will prompt an opportunity for another lawsuit. That lawsuit will be the final crack at striking down what EPA is trying to do but, given how fast the federal government operates, won’t be initiated for quite some time.

In the interim, it’s important to keep pressure on EPA by having the judiciary rule on whether what it did was within the bounds of its authority. EPA ultimately chose to modify a standard that is based on the best available science, years of stakeholder input, and broad public support – and the small army of attorneys representing the coalition of NGOs and states will make sure the court hears that argument loud and clear. It will also be important to submit comments on future EPA rulemakings on this issue – even if they don’t persuade the agency. An overwhelming number of comments in support of a strong rule clearly demonstrates how Americans view fuel efficiency standards, and can help a court find that EPA did not act in the public interest in weakening the standards.

Back to Bad Air: The Trump EPA’s Attack on Science and Our Health

UCS Blog - The Equation (text only) -

Smoggy skyline in Salt Lake City, Utah Photo: Eltiempo10/CC BY-SA 4.0 (Wikimedia)

Most Americans wake up and breathe comfortably every day because we’ve enjoyed decades of strong science-based clean air policies. These policies limit the emissions from cities, cars, factories and more to keep the air clean and free from most harmful air pollutants.

When he was first appointed, EPA Administrator Scott Pruitt vowed to bring the agency “back to basics” by focusing on clean air and water. One could be forgiven for assuming this meant he intended to preserve and strengthen America’s air pollution protections. That’s why it’s so jarring to see how severely his actions have undermined them. The Trump Administration’s EPA is working hard to unravel these life-saving protections on multiple fronts. This week, Administrator Pruitt and his air chief, William Wehrum, will testify on the Hill. They should be asked about how these actions bring EPA back to basics and fulfill its mission to protect public health and the environment.

More hazardous air pollutants with MACT rule change

In February, the EPA issued new guidance to weaken a policy that protects us from hazardous air pollutants from major sources like power plants and chemical manufacturing facilities. By repealing the “once in, always in” policy, the administration is allowing major polluters to evade using the maximum achievable control technologies (MACT) that have minimized our exposure to cancer-causing chemicals for years. Under the new guidance, at least 21 states could see increased emissions of pollutants like benzene and hydrochloric acid that can cause certain cancers and respiratory illnesses.

Gutting the science in ambient air pollutant decisions under NAAQS

Moreover, following up on a presidential memo last month, the EPA last week released guidance changing how the agency sets standards for ambient air pollutants like ozone, lead, and carbon monoxide. Together, the presidential memo and EPA guidance chip away at the long-standing science-based process that has effectively and drastically reduced ambient pollution in this country for decades.

Air pollution statistics cartoon

Changes at the EPA mean that the agency may soon have far less independent science feeding into its decisionmaking on air pollution protections.

The National Ambient Air Quality Standards (NAAQS) are a widely effective program that ensures the government sets standards for protecting clean air, based solely on what’s protective of public health. This has, by and large, allowed science and public health to prevail even in the face of political or commercial pressures. But the Trump administration has now opened the door to upending this process.

While the EPA guidance claims to “differentiate science and policy judgments,” it in fact does the opposite. Under the guidance, the EPA and its science advisors would no longer consider public health alone, as the law requires; they would instead have to elevate consideration of the potential adverse impacts of setting a health-based standard, such as economic impacts. The process would also be moved out of EPA’s Office of Research and Development, where much of the agency’s scientific expertise lies. And the comprehensive document outlining the state of the science on pollutants and health, which the agency relies on to make a science-based decision, may be combined with a regulatory impact assessment, blurring the distinction between scientific and political judgments. This builds on a presidential memo that limited the kinds of scientific analyses the EPA can use when determining whether states are meeting the standard.

Restricting the science that EPA can use for decisionmaking

To rub salt in the wound, these actions come on the heels of the EPA’s recent, widely opposed, and dangerous proposal to restrict the science the agency can use to make rules. That proposal originated as a ploy by the tobacco industry to stave off second-hand smoke rules, and while its effects would reach far beyond air pollution policy, protections against pollutants like ozone and particulate matter are clearly its main target.

Dwindling air pollution law enforcement

EPA enforcement of air pollution laws is also down. In the first year of the Trump administration, the agency issued only around half as many penalties against polluters as it did, on average, during the same period of the past three presidential administrations.

Wrecking EPA’s science advisory committees

As if these things weren’t enough to undermine the EPA’s basic responsibilities, the administration also has worked to gut the agency’s science advisory committees, kicking academic experts off and replacing them with unqualified or deeply conflicted representatives. Industry representation on the EPA’s Science Advisory Board, for example, has tripled. The consequence will be far less independent science advice reaching EPA decisionmakers—and fewer checks on Pruitt’s ability to undo rules.

And we have some indications of the administration’s priorities here. In its proposed FY 2019 EPA budget, President Trump and Administrator Pruitt are looking to cut EPA funding that supports scientific research related to clean air by 27 percent. Such a cut would threaten the agency’s ability to monitor air quality levels, estimate population exposure to air pollutants, examine the effects of air pollution on public health and reduce the associated risks, and provide models, tools, and technical guidance to states. It clearly signals the administration’s disregard for air quality work at the EPA.

Administrator Pruitt’s biggest scandal

The sum of these policy changes is likely to mean dirtier air for all of us. This increased pollution is especially dangerous for the vulnerable groups who already disproportionately suffer from the harmful effects of air pollution. Children, the elderly, and those with lung diseases already face health challenges at current air pollution levels; weakening current standards will certainly exacerbate harm for these groups. Low-income neighborhoods and communities of color, which already experience disproportionate impacts from air pollution due to the cumulative impact of being near multiple pollution sources, will also be harmed by these policy changes.

Looking out for public health is supposed to be the “basic” responsibility of the EPA and its administrator.  The most scandalous thing about Scott Pruitt is how he’s abandoned the mission of the agency. If he won’t do the job, the rest of us need to speak up for clean air and the science that helps us protect it. Our lungs depend on it.

SNAP is a boon to urban and rural economies—and small-town stores may not survive cuts

UCS Blog - The Equation (text only) -

In case you missed it, Congress is in the midst of a pretty major food fight. At the center of it is the Supplemental Nutrition Assistance Program (SNAP), which is the first line of defense against hunger for more than 21 million American households. Going forward, however, an estimated 2 million people stand to lose SNAP benefits if the farm bill proposal passed by the House Agriculture Committee last month becomes law. The bill’s draconian work requirements and eligibility changes threaten to upend the lives of some of the nation’s most vulnerable individuals and families. But it could also deliver a serious blow to the economic vitality of many rural and small-town communities, in an economic domino effect that often starts at the local grocery store.

Despite improvements in the national economy since the 2008 recession, rural communities across the United States continue to face economic uncertainty, and grocery stores are among the small-town businesses finding it hard to stay afloat. Rural food retailers face numerous headwinds: competition from increasingly powerful “big-box” stores, the rise of online retailers, and high operating costs are but a few of the pressures threatening the economic viability of today’s grocery stores. But there’s another major driver of food sales that affects rural retailers and residents alike, and it has to do with how much families can afford to spend.

Many households in low-wage, low-prosperity rural counties turn to SNAP to augment their food budgets—in fact, they do so at higher rates than their urban counterparts. About 16 percent of households in rural or non-metro areas participate in the program, compared to 13 percent in metro areas. And in a recent analysis of publicly available data, UCS found that 136 of the 150 counties with the highest percentages of SNAP participation by household are located in rural areas.

We know the benefits that SNAP dollars bring to the people who use them. SNAP participation bolsters financial stability and food security; increases the likelihood that kids complete high school, while decreasing their risk of obesity and metabolic syndrome into adulthood; and saves about $1,400 in annual medical costs for low-income adults. But where do those dollars go next?

The ripple effect of SNAP spending

Following the path of a SNAP dollar can help us understand the invaluable role SNAP plays in supporting local industries and bolstering the broader economy. The graphic below shows the path of a dollar spent at a local grocery store.

It isn’t difficult to see how SNAP dollars are a boon to grocery stores—particularly for businesses in low-income areas, where SNAP purchases account for a greater share of sales. Dr. David Procter, Director of Kansas State University’s Center for Engagement and Community Development, knows a thing or two about the food economy. He’s been working with rural grocers for over a decade as part of the Rural Grocery Initiative, which helps small-town stores develop sustainable business models in the face of a food landscape that is rapidly changing.

“Small town grocery stores stand as a bulwark against the ever-rising number of rural Americans living in food deserts,” says Procter. “These food retail businesses are a vital element of the local food system, providing residents with access to produce, dairy, breads, grains, and meats. They are important to the local economy, creating jobs and generating tax revenue. Finally, these stores are community hubs, gathering places where social capital is built and maintained.”

Beyond the grocery checkout, SNAP dollars keep working

But the economic impact of SNAP doesn’t end at the store; in fact, this is only the beginning of a series of transactions that results in what is referred to as a “multiplier effect.” The standard USDA model estimates that, during a weak economy, $1 in SNAP spending generates about $1.80 in economic activity. This would mean that the $64.7 billion in SNAP benefits distributed in fiscal year 2017 could have generated an estimated $114 billion in economic activity, creating and supporting more than 567,000 jobs across the country.

So how does it work? Suppose the economy in Anytown, USA takes a turn for the worse. A factory relocates, or maybe a natural disaster shuts down the town’s major industry for an extended period of time. Many households find that they have less money to spend, and business at local establishments slows. Because of hardships resulting from the economic downturn—perhaps job loss, or reduced hours—some families apply for SNAP benefits. As those families use SNAP dollars to help put food on their tables, the grocery stores they shop at begin to recover. With more revenue, these stores can hire back staff; resume full operation and pay for operational costs like lighting and refrigeration; and, of course, purchase more food from farmers and distributors to meet growing demand. And as SNAP spending is propagated through the supply chain, each sector that gets a share of that additional money is able to spend more money in turn.

The effect extends to a wide range of sectors. Here’s why: studies have suggested that each additional dollar received in SNAP benefits results in between 26 and 60 additional cents spent on food—meaning an extra SNAP dollar received doesn’t equal an additional dollar spent on groceries. This is because, as many of us know, low-income households (including those experiencing temporary financial distress) are constantly making difficult decisions about how and when to pay for necessities such as housing, education, and transportation. When SNAP benefits relieve some of the strain on a family’s food budget, they also free up a portion of the income once spent on food for other expenses. From an economic standpoint, this means that a range of industries outside the food supply chain also benefit indirectly—and not insignificantly—from SNAP spending. This multiplier effect shows how SNAP can help drive economic recovery.
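For readers who want to see the arithmetic, the sketch below is a rough, back-of-the-envelope version of the numbers cited above: roughly $1.80 of economic activity per SNAP dollar in a weak economy, and 26 to 60 additional cents of food spending per benefit dollar. It is only an illustration under simplified assumptions (a single constant multiplier and a simple split of a marginal dollar); USDA’s published estimates come from a full input-output model that traces spending sector by sector, so a simple multiplication will not reproduce the roughly $114 billion figure exactly.

```python
# Back-of-the-envelope sketch of the SNAP multiplier arithmetic described above.
# This is NOT USDA's model (a full input-output model that traces spending
# sector by sector); it is only an illustration of the magnitudes involved.

SNAP_BENEFITS_FY2017 = 64.7e9  # dollars in SNAP benefits distributed in FY 2017
USDA_MULTIPLIER = 1.8          # ~$1.80 of economic activity per $1 of SNAP (weak economy)

# Simple constant-multiplier estimate of total economic activity.
# (The ~$114 billion estimate cited above comes from the full USDA model,
# so this rough product will not match it exactly.)
total_activity = SNAP_BENEFITS_FY2017 * USDA_MULTIPLIER
print(f"Implied economic activity: roughly ${total_activity / 1e9:.0f} billion")

# Split of a marginal SNAP dollar, using the 26-60 cent range of additional
# food spending per benefit dollar cited above. Whatever is not spent on food
# is household income freed up for rent, utilities, transportation, and so on.
for food_share in (0.26, 0.60):
    freed_income = 1.0 - food_share
    print(f"Of each extra SNAP dollar: {food_share:.2f} to food, "
          f"{freed_income:.2f} freed for other household expenses")
```

The point of the exercise is simply that both channels (direct food purchases and freed-up household income) feed the multiplier, which is why sectors well beyond the grocery store see a share of SNAP spending.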

Strong nutrition policy can help build strong communities and economies

This brings us back to the farm bill. If we want to preserve SNAP’s essential function as a safety net for our families and for our economy during tough times, we need to protect the program from the ill-advised overhaul making its way through Congress.

We can also leverage farm bill legislation to ensure that more of each SNAP dollar goes straight into farmers’ pockets and stays in our local communities. According to USDA models, the jobs that could have been created or supported by SNAP spending during fiscal year 2017 include nearly 50,000 agricultural jobs—a significant number, yet less than 9 percent of the estimated total. Many of the programs contained in the Local FARMS Act, a bipartisan marker bill introduced early in the farm bill process, offer win-win solutions that help farmers expand local and regional food sales while providing low-income populations with greater access to fresh, nutritious foods. Though these “tiny but mighty” programs represent a small fraction of the farm bill budget, they provide the means to effectively amplify the return on federal investment in programs like SNAP—which can make all the difference for families and rural farming communities that have been slower to recover from economic downturns.

The full House is expected to take up H.R. 2 this week, with the possibility of introducing amendments before a final vote. UCS is working to ensure that members of Congress in both houses reject a SNAP overhaul, and instead take meaningful action to support low-income households in both rural and urban communities—while also giving a boost to small and midsize farmers. Got five minutes and want to make a difference? We’ve made it easy to call your members of Congress today to tell them to vote NO on H.R. 2.

The EPA Eliminates a Vital Protection That Keeps Toxic Substances Out of Our Air, Putting Our Health at Risk

UCS Blog - The Equation (text only) -

View of the Houston Ship Channel, with the city in the background and air pollution visible.

For decades, the Clean Air Act has protected us from the harmful health effects of industrial air pollutants. Many of these pollutants are toxic; breathing them, or any contact with them, can cause cancer, as well as respiratory illnesses and degenerative neurological diseases that can be fatal. Some, such as chlorine and hydrochloric acid, can inflame the lungs and airways. Styrene, a solvent frequently used to make plastics and synthetic rubber, is linked to degenerative disorders such as multiple sclerosis and Parkinson’s-like diseases. Thanks to the regulations that protect us from 187 toxic substances, the Environmental Protection Agency (EPA) estimates that 1.5 million tons of toxic air pollutant emissions have been avoided every year since 1990.

But the EPA has recently, and quietly, eliminated these protections. To the agency’s long list of corruption scandals, conflicts of interest, political interference in science, and nepotism, we can now add the repeal of the policy known as “once in, always in” (OIAI).

This change, made without any public comment process and tersely announced as a “reinterpretation” of the law, will allow highly polluting industrial facilities, such as metal smelters and petrochemical plants, to stop using the technologies that control the toxic pollution they emit into the air. The technologies and processes used to reduce toxic pollutants are known as Maximum Achievable Control Technologies (MACT), and until recently highly polluting facilities were required to use them.

In a post written before we completed our study of the consequences of repealing this rule (available here, in English), I warned that it would increase emissions of cancer-causing pollution. My colleague Dr. Gretchen Goldman has already explained (in English) that environmental justice communities, where most residents are African American, Latino, or members of other ethnic minorities and/or low-income, would be the hardest hit. Indeed, our study found that many communities already facing high levels of toxic pollution will be exposed to even more.

Take, as a first example, the communities of Galena Park and Manchester along the Houston Ship Channel in Texas. Together with our collaborators, residents of these neighborhoods and activists from the organization TEJAS, most of them Latino and many of them low-income, we recently showed that proximity to multiple industrial facilities that already emit large amounts of toxic pollutants is harming the health of these communities.

Congressional district TX-29, home to these communities, contains 15 facilities that significantly reduce their emissions through MACT. With the EPA’s rule change, eleven of them could emit some 205 tons of toxic air pollutants per year, an increase of nearly 70 percent.

Some major sources of toxic air pollutants, such as the Deer Park chemical plant in Houston, TX (owned by OxyChem), could increase their emissions of toxic air pollutants from 0.64 to nearly 25 tons per year if they stop using MACT to control those emissions.

The new guidance will affect states in different ways. Some states rely exclusively on the federal toxic air pollutant standards to protect their air quality, while others set their own thresholds. Some of the states with their own standards allow pollutant emissions on a case-by-case basis, while others have set stricter standards across the board.

How can you find out about the potential impacts in your region? You can consult the interactive map we created, which shows the number of facilities that could increase toxic emissions in your congressional district. For example, if you select Pennsylvania’s 16th congressional district (PA-16), you will see that eleven of the fourteen facilities currently using MACT to reduce their toxic emissions could emit 209 tons per year, and that the state has no additional protections of its own to limit toxic pollutants.

If you scroll down a bit, you will find the name and phone number of your representative. We urge you to contact them and ask how they will press the EPA and your state’s environmental quality agency to protect public health from this dangerous change.

What can you do?

There are many ways to express your concern about the possibility that industrial facilities in your community will emit more toxic air pollutants because existing protections have been weakened.

  • If you live in a state where toxic air pollution could increase, press your legislators to pass state laws that protect your community from these toxic pollutants. Below you will find some ideas for getting involved, along with tips for communicating with legislators (link in English).
  • Request an in-person meeting with your representative or members of their staff and share your concerns.
  • Organize or take part in meetings, town halls, and other community events. Take advantage of the 2018 election season, when many such events will take place, and ask for a commitment on this issue. Find town halls at this link or on your representative’s website, and use this guide to organize a community event (in English).
  • Ask the agency responsible for air quality in your state how the changes to “once in, always in” could affect your area. Find your state agency on the EPA’s website.
  • Use the media to draw public attention to the issue. Write letters to the editor or op-eds, or meet with local journalists and editorial boards to share your concerns. Read these tips on how to do it (in English).
  • Contact the company that operates the industrial facility in your community directly and ask it to commit to keeping its classification and to continuing to use MACT with all of its requirements. Follow up if you do not hear back within a week, and share the responses, or the silence, with local media, your representatives, and your community.
  • Tell EPA Administrator Scott Pruitt to fulfill his mandate to protect public health and the environment and to revoke the new guidance.
  • Tweet at EPA Administrator Scott Pruitt and tag your members of Congress.

Want to receive the latest information about federal attacks on our health, safety, and environmental protections, along with personalized alerts about how you can stand up for science? If you have a graduate degree, you can join the Science Network and its watchdogging initiative (in English). If you are a local leader, join our Science Champions group (in English).
