Combined UCS Blogs

Northern Plains Drought Shows (Again) that Failing to Plan for Disasters = Planning to Fail

UCS Blog - The Equation

As the dog days of summer wear on, the northern plains are really feeling the heat. Hot, dry weather has quickly turned into the nation’s worst current drought in Montana and the Dakotas, and drought conditions are slowly creeping south and east into the heart of the Corn Belt. Another year and another drought presents yet another opportunity to consider how smart public policies could make farmers and rural communities more resilient to these recurring events.

Let’s start with what’s happening on the ground: Throughout the spring and early summer, much of the western United States has been dry, receiving less than half of normal rainfall levels. And the hardest hit is North Dakota. As of last week, 94 percent of the state was experiencing some level of abnormally dry conditions or drought, with over a quarter of the state in severe or extreme drought (a situation that only occurs 3 to 5 percent of the time, or once every 20 to 30 years).

Throughout the spring and early summer, drought conditions have worsened across the Dakotas and Montana, stressing crops and livestock.
Image: http://droughtmonitor.unl.edu/

But this drought is not just about a dry spring. Experts believe the problem started last fall, when first freeze dates came several weeks later than usual, creating a “bonus” growing period for crops like winter wheat and pasture grasses, which drew more water from the soil. This is an important pattern for agriculture to watch, as recent temperature trends point to continued winter warming.

Bad news for wheat farmers (and bread eaters)

The timing of the drought is particularly damaging to this region’s farm landscape, which centers around grasslands for grazing livestock, along with a mix of crops including wheat, corn, soy, and alfalfa.

Spring wheat has been especially hard hit—experts believe this is the worst crop in several decades in a region that produces more than 80 percent of the country’s spring wheat. (Here’s a great map of the wheat varieties grown across the country, which makes it easy to see that the bread and pasta products we count on come from Montana and the Dakotas).

As grasses wither, cattle ranchers have only bad options

More than 60 percent of the region’s pasture grasses are also in poor or very poor condition, leaving cattle without enough to eat. Given the forecast of continued high temperatures, and with dry conditions creeping into parts of the Corn Belt (at a time of year when corn is particularly sensitive to hot and dry conditions), it is shaping up to be a difficult season for farmers and ranchers all around the region.

So it’s appropriate that the Secretary of Agriculture issued a disaster proclamation in late June, allowing affected regions to apply for emergency loans. But another of the Secretary’s solutions for ranchers with hungry livestock—authorizing “emergency grazing” (and, just this week, “emergency haying”) on grasslands and wetlands designated off-limits to agriculture—could exacerbate another problem.

Short-term emergencies can hurt our ability to plan for the long-term

The Conservation Reserve Program (CRP), created by the 1985 Farm Bill, pays landowners a rental fee to keep environmentally sensitive lands out of agricultural production, generally for 10-15 years. It also protects well-managed grazing lands and provides additional acres for grazing during emergencies such as drought.

Instead of planting crops on these acres, farmers plant a variety of native grasses and tree species well suited to provide flood protection, wildlife and pollinator habitat, and erosion prevention. In 2016, almost 24 million acres across the United States (an area roughly the size of Indiana) were enrolled in CRP. This included 1.5 million acres in North Dakota, which represents approximately 4 percent of the state’s agricultural land.

While this might sound like a lot, CRP numbers across the country are down; in fact, North Dakota has lost half of its CRP acreage since 2007. This is due in part to Congress imposing caps on the overall acreage allowed in the program, but largely to historically high commodity prices over the same period, along with increased demand for corn-based ethanol.

The loss of CRP acreage over the last decade shows high concentrations of land conversion in the Northern Plains, nearly overlapping the area of the current drought. Image: USDA Farm Service Agency

Research on crop trends tells a complicated story about how effective this program is at protecting sensitive lands over the long term. The data show that grasslands, notably CRP acreage, are being lost rapidly across the United States. CRP acreage often comes back into crop production when leases expire (see examples of this excellent research here, here, and here; the last notes that CRP lands often become corn or soy fields). This can erase the environmental benefits these set-aside lands provided.

At the same time, with negotiations toward a new Farm Bill underway, some ranchers and lawmakers are looking for even more “flexibility” in the CRP program. Some have expressed concern about the cap on total CRP acreage. Some feel that CRP rental rates are too high, tying up the limited suitable land that young farmers need to get started, while others believe the caps prevent enough new contracts (for things like wildlife habitat) from being accepted.

The bottom line is that it is critical to have emergency plans in place to protect producers in cases of drought. Emergency livestock grazing on CRP acreage is one solution to help prevent ranchers from selling off their herds (such sell-offs are already being reported). But, if CRP acreage continues to decline, what will happen when the next drought occurs, or if this drought turns into a multi-year disaster? And what will happen if floods hit the region next year, and the grasslands that could help protect against that emergency aren’t there?

Unfortunately, short-term emergencies can hurt our ability to plan for the long term, and the trend toward losing CRP acreage and grasslands is one example. It is no simple task for policy to find solutions that support short-term needs while encouraging long-term risk reduction.

Agroecology helps farmers protect land while still farming it

But there’s another way to achieve conservation goals that doesn’t depend upon setting land aside. A number of existing farm bill programs encourage farmers to use practices on working lands that build healthier soils to retain more water, buffering fields from both drought and flood events. Increasing investment and strengthening elements of these programs is an effective way to help farmers and ranchers build long-term resilience.

Recent research from USDA scientists in the Northern Plains highlights climate change impacts and adaptation options for the region, and the proposed solutions sound much like the agroecological practices UCS advocates: increased cropping intensity and cover crops to protect the soil, more perennial forages, integrated crop and livestock systems, and economic approaches that support such diversification, along with the extension and education services needed to bring research to farmers.

As I wrote last year, drought experts recognize that proactive planning is critical: thinking ahead about how disasters can best be managed through activities such as rainfall monitoring, grazing plans, and water management. Here we are again with another drought, and climate projections tell us that things are likely to get worse. This year, as a new Farm Bill is negotiated, we have an opportunity to think long-term and make investments to better manage future droughts.

 

As Coal Stumbles, Wind Power Takes Off in Wyoming

UCS Blog - The Equation

After several years of mostly sitting on the sidelines, Wyoming is re-entering the wind power race in a big way. Rocky Mountain Power recently announced plans to invest $3.5 billion in new wind and transmission over the next three years. This development—combined with the long-awaited start of construction on what could be the nation’s largest wind project—will put Wyoming among the wind power leaders in the region. That’s welcome news for a state economy looking to rebound from the effects of the declining coal industry.

Capitalizing on untapped potential

Wyoming has some of the best wind resources in the country. The state ranks fifth nationally in total technical potential, but no other state has stronger Class 6 and 7 wind resources (considered the best of the best). And yet, wind development has remained largely stagnant in Wyoming since 2010.

In the last seven years, just one 80-megawatt wind project came online in Wyoming as the wind industry boomed elsewhere—more than doubling the installed US wind capacity to 84,000 megawatts.

Fortunately, it appears that Wyoming is ready to once again join the wind power bonanza, bringing a much-needed economic boost along with it. On June 29th, Rocky Mountain Power—Wyoming’s largest power provider—filed a request with regulators for approval to make major new investments in wind power and transmission. The plan includes upgrading the company’s existing wind turbines and adding up to 1,100 megawatts of new wind projects by 2020, nearly doubling the state’s current wind capacity.

In addition to the $3.5 billion in new investments, Rocky Mountain Power estimates that the plan will support up to 1,600 construction jobs and generate as much as $15 million annually in wind and property tax revenues (on top of the $120 million in construction-related tax revenue) to help support vital public services. What’s more—thanks to the economic competitiveness of wind power—these investments will save consumers money, according to the utility.

Rocky Mountain Power isn’t the only company making a big investment in Wyoming’s rich wind resources. After more than a decade in development, the Power Company of Wyoming (PCW) has begun initial construction on the first phase of the two-phase Chokecherry and Sierra Madre wind project, which will ultimately add 3,000 megawatts of wind capacity in Carbon County. The $5 billion project is expected to support 114 permanent jobs when completed, and hundreds more during the three-year construction period. PCW also projects that over the first 20 years of operation, the massive project will spur about $780 million in total tax revenues for local and state coffers.

Diversifying Wyoming’s economy with wind

When completed, these two new wind investments will catapult Wyoming into the upper tier of wind development leaders in the West and nationally. Combined with Wyoming’s existing wind capacity, the total annual output from all wind projects could supply nearly all of Wyoming’s electricity needs, if all the generation were consumed in state. That’s not likely to happen, though, as much of the generation from the Chokecherry and Sierra Madre project is expected to be exported to other western states with much greater energy demands.

Still, the wind industry is now riding a major new wave of clean energy momentum in a state better known for its coal production.

Coal mining is a major contributor to Wyoming’s economy, as more than 40 percent of all coal produced in the US comes from the state’s Powder River Basin. But coal production has fallen in recent years as more and more coal plants retire and the nation transitions to cleaner, more affordable sources of power. In 2016, Wyoming coal production dropped by 20 percent compared with the previous year, hitting a nearly 20-year low. That resulted in hundreds of layoffs and confounded the state’s efforts to climb out of a long-term economic slump. And while production has rebounded some this year, many analysts project the slide to continue over the long term.

Of course, Wyoming’s recent wind power investments and their substantial benefits alone can’t replace all the state’s losses from the coal industry’s decline. But a growing wind industry can offset some of the damage and play an important role in diversifying Wyoming’s fossil-fuel dependent economy. In fact, Goldwind Americas, the US affiliate of a large Chinese wind turbine manufacturer, recently launched a free training program for unemployed coal miners in Wyoming who want to become wind turbine technicians.

A growing wind industry can also provide a whole new export market for the state as more and more utilities, corporations, institutions and individual consumers throughout the west want access to a clean, affordable, reliable and carbon-free power supply.

Sustaining the momentum

As the wind industry tries to build on its gains in Wyoming, what’s not clear today is whether the state legislature will help foster more growth or stand in the way. In the past year, clean energy opponents in the Wyoming legislature have made several attempts to stymie development, including proposals to significantly increase the state’s existing modest tax on wind production (Wyoming is the only state in the country that taxes wind production) and to penalize utilities that supply wind and solar power to Wyoming consumers. Ultimately, wiser minds prevailed and these efforts were soundly defeated.

That’s good news for all residents of Wyoming. Wind power has the potential to boost the economy and provide consumers with clean and affordable power. Now that the wind industry has returned to Wyoming, the state should do everything it can to keep it there.

Photo: Flickr, Wyoming_Jackrabbit

The San Francisco Bay Area Faces Sea Level Rise and Chronic Inundation

UCS Blog - The Equation

Looking across the San Francisco Bay at the city’s rapidly rising skyscrapers, it’s easy to see why Ingrid Ballman and her husband chose to move from San Francisco to the town of Alameda after their son was born. With streets of single-family bungalows painted in a rainbow of pastel colors and restaurant patios where senior citizens watch pelicans hunt offshore, Alameda is a world away from the gigabits-per-second pace of life across the bay.

Children playing at Alameda’s Crown Memorial State Beach along San Francisco Bay. An idyllic place to play, the beach is described by the California Department of Parks and Recreation as “a great achievement of landscaping and engineering,” a description that applies to much of the Bay Area’s waterfront.

“I had a little boy and it’s a very nice place to raise a child–very family-oriented, the schools are great. And we didn’t think much about any other location than Alameda,” Ballman says. Alameda has been, by Bay Area standards, relatively affordable, though with median home prices there more than doubling in the last 15 years, this is becoming less the case.

After Ballman and her husband bought their home she began to think more about the island’s future. “At some point,” she says carefully, “it really became clear that we had picked one of the worst locations” in the Bay Area.

A hotspot of environmental risk

The City of Alameda is located on two islands…sort of. Alameda Island, the larger of the two, is truly an island, but it only became so in 1902 when swamps along its southeastern tip were dredged and the Oakland Estuary was created. Bay Farm Island, the smaller of the two, used to be an island, but reclamation of the surrounding marshes has turned it into a peninsula that is connected to the mainland. In the 1950s, Americans flocked to suburbs in search of the American Dream of a house with a white picket fence and 2.5 children, and Alameda Island, home to a naval base and with little space for new housing, responded by filling in marshes, creating 350 additional acres. Bay Farm Island was also expanded with fill to extend the island farther out into the bay.

The filling of areas of San Francisco Bay was common until the late 1960s, when the Bay Conservation and Development Commission was founded.

Many Bay Area communities are built on what used to be marshland. These low-lying areas are particularly susceptible to sea level rise and coastal flooding. Alameda Island is circled in red; Bay Farm Island is just to the south.

While many former wetland areas are slated for restoration, many others now house neighborhoods, businesses, and schools, and are among the Bay Area’s more affordable places to live. The median rent for an apartment in parts of San Mateo and Alameda Counties where fill has been extensive can be half what it is in San Francisco’s bedrock-rooted neighborhoods.

When Bay Area residents think about natural hazards, many of us think first of earthquakes. In Alameda, Ballman notes, the underlying geology makes the parts of the island that are built on fill highly susceptible to liquefaction during earthquakes. It is precisely this same geology that places communities built on former wetlands in the crosshairs of a growing environmental problem: chronic flooding due to sea level rise.

Chronic inundation in the Bay Area

Ballman studies a map I brought showing the extent of chronic inundation in Alameda under an intermediate sea level rise scenario, which projects about 4 feet of sea level rise by the end of the century. The map is a snapshot from UCS’s latest national-scale analysis of community-level exposure to sea level rise.

“Right here is my son’s school,” she says, pointing to a 12-acre parcel of land that’s almost completely inundated on my map. Under this intermediate scenario, the school buildings are safe; it’s mostly athletic fields that would flood frequently.

I haven’t brought along a map of chronic inundation with a high sea level rise scenario–about 6.5 feet of sea level rise by 2100–for Ballman to react to, but with a faster rate of sea level rise, her son’s school buildings would flood, on average, every other week by the end of the century. While this scenario seems far off, it’s within the lifetime of Ingrid’s son. And problems may well start sooner.

Seas are rising more slowly on the West Coast than on much of the East and Gulf Coasts, which means that most California communities will have more time to plan their response to sea level rise than many communities along the Atlantic coast. Indeed, by 2060, when the East and Gulf Coasts have a combined 270 to 360 communities where 10% or more of the usable land is chronically inundated, the West Coast has only 2 or 3. Given how densely populated the Bay Area is, however, even small changes in the reach of the tides can affect many people.

As early as 2035, under an intermediate sea level rise scenario, neighborhoods all around the Bay Area–on Bay Farm Island, Alameda, Redwood Shores, Sunnyvale, Alviso, Corte Madera, and Larkspur–would experience flooding 26 times per year or more, UCS’s threshold for chronic inundation. By 2060, the number of affected neighborhoods grows to include Oakland, Milpitas, Palo Alto, East Palo Alto, and others along the corridor between San Francisco and Silicon Valley.
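To make the threshold concrete, here is a minimal sketch of applying the 26-floods-per-year test. The locations and flood counts below are hypothetical placeholders, not figures from the UCS analysis:

    # Minimal sketch: flag locations as chronically inundated using the
    # 26-floods-per-year threshold described above. The flood counts are
    # hypothetical, not values from the UCS analysis.
    CHRONIC_FLOODS_PER_YEAR = 26

    projected_floods = {            # hypothetical annual flood counts
        "Bay Farm Island": 30,
        "Redwood Shores": 27,
        "Bedrock neighborhood": 2,
    }

    for place, floods in projected_floods.items():
        label = "chronic" if floods >= CHRONIC_FLOODS_PER_YEAR else "below threshold"
        print(f"{place}: {floods} floods/year -> {label}")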

By 2100, the map of chronically inundated areas around the Bay nearly mirrors the map of the areas that were historically wetlands.

By 2100, with an intermediate sea level rise scenario, many Bay Area neighborhoods would experience flooding 26 times or more per year. Many of these chronically inundated areas were originally tidal wetlands.

Affordable housing in Alameda

Like many Bay Area communities, Alameda has struggled to keep up with the demand for housing–particularly housing that is affordable to low- and middle-income families–as the population of the region has grown. In the past 10-15 years, large stretches of the northwestern shore of the island have been developed with apartment and condo complexes.

Driving by the latest developments and glancing down at my map of future chronic inundation zones, I was struck by the overlap. With a high scenario, neighborhoods only 10-15 years old would be flooding regularly by 2060. The main thoroughfares surrounding some of the latest developments would flood by the end of the century.

While the addition of housing units in the Bay Area is needed to alleviate the region’s growing housing crisis, one has to wonder how long the homes being built today will be viable places to live. None of this is lost on Ballman who states, simply, “There are hundreds of people moving to places that are going to be underwater.”

Many of Alameda’s newly developed neighborhoods would face frequent flooding in the second half of the century with intermediate or high rates of sea level rise.

“Some of the more affordable places to live,” says Andy Gunther of the Bay Area Ecosystems Climate Consortium, “are the places that are most vulnerable to sea level rise, including Pinole, East Palo Alto, and West Oakland.” Many of these communities that are highly exposed to sea level rise are low-income communities of color that are already suffering from a lack of investment. These communities have fewer resources at their disposal to cope with issues like chronic flooding.

Bay Area action on sea level rise

How neighborhoods–from the most affordable to the most expensive–throughout the Bay Area fare in the face of rising seas will depend, in part, on local, state, and federal policies designed to address climate resilience. A good first step would be to halt development in places that are projected to be chronically inundated within our lifetimes.

For Bay Area and other Pacific Coast communities that will experience chronic inundation in the coming decades, there is a silver lining: many have several decades to plan for the threat, whereas communities on the Atlantic Coast may have only 20 or 30 years. And California is known for its environmental leadership, which has led to what Gunther calls an “incredible patchwork” of sea level rise adaptation measures.

Here are some of the many pieces of this growing patchwork quilt of adaptation measures:

  • Last year, with the passage of AB 2800, California Governor Jerry Brown created the Climate-Safe Infrastructure Working Group, which seeks to integrate a range of possible climate scenarios into infrastructure design and planning.
  • The City of San Francisco has developed guidance for incorporating sea level rise into city planning.
  • With grant funding from the Environmental Protection Agency (EPA), the Novato Watershed Program is harnessing natural processes to reduce flood risk along Novato Creek.
  • The San Francisco Estuary Institute (SFEI) is working to understand the natural history of San Francisquito Creek, near Palo Alto and East Palo Alto, in order to develop functional, sustainable flood control structures and restoration goals.
  • The Santa Clara Valley Water District is scheduled to begin work this summer to improve drainage in Sunnyvale’s flood-prone East and West Channels, reducing flood risk for 1,600 homes. The district is also addressing tidal flooding in cooperation with the US Army Corps of Engineers.
  • As part of its efforts to address sea level rise, San Mateo County installed virtual reality viewers along its shoreline to engage the public in a discussion of how sea level rise would affect their communities.
  • At the regional level, the Bay Conservation and Development Commission is collaborating with the National Oceanic and Atmospheric Administration (NOAA) and other local, state, and federal agencies on the Adapting to Rising Tides project, which provides information, tools, and guidance for organizations seeking solutions to the challenges of climate change.
  • The Bay Area’s Resilient by Design challenge brings together engineers, community members, and designers to jointly design solutions for the consequences of sea level rise.

In South San Francisco Bay, a number of shoreline protection projects have been proposed or are underway.

A regional response to sea level rise

Gunther notes that “We’re still struggling with what to do, but the state, cities, counties, and special districts are all engaged” on the issue of sea level rise. With hundreds of coastal communities nationwide facing chronic flooding that, in the coming decades, will necessitate transformative changes to the way we live along the coast, regional coordination, while challenging, will be critical. Otherwise, communities with fewer resources to adapt to rising seas risk being left behind.

“There’s a regional response to sea level rise that’s emerging,” says Gunther, and the recently passed ballot measure AA may be among the first indicators of that regional response.

In 2016, voters from the nine counties surrounding San Francisco Bay approved measure AA, which focuses on restoring the bay’s wetlands. Gunther says that this $500+ million effort could prove to be “one of the most visionary flood protection efforts of our time.” The passage of Measure AA was particularly notable in that it constituted a mandate from not one community or one county, but all nine counties in the Bay Area.

Toward a sustainable Bay Area

Waves of people have rushed in and out of the Bay Area for over 150 years, seeking fortunes here, then moving on as industries change. The stunning landscape leaves an indelible mark on all of us, just as we have left a mark on it, forever altering the shoreline and ecosystems of the bay.

For those of us, like Ingrid Ballman and like me, who have made our homes and are watching our children grow here, the reality that we cannot feasibly protect every home, every stretch of the bay’s vast coastline, is sobering. All around the bay, incredible efforts are underway to make Bay Area communities safer, more flood-resilient places to live. Harnessing that energy at the regional and state levels, and continuing to advocate for strong federal resilience-building frameworks has the potential to make the Bay Area a place we can continue to live for a long time, and a leader in the century of sea level rise adaptation that our nation is entering.


Photos: Pengrin/Flickr; San Francisco Bay Joint Venture; Union of Concerned Scientists/Kristy Dahl; San Francisco Estuary Institute and the Bay Area Ecosystems Climate Change Consortium


Turkey Point: Fire and Explosion at the Nuclear Plant

UCS Blog - All Things Nuclear

The Florida Power & Light Company’s Turkey Point Nuclear Generating Station about 20 miles south of Miami has two Westinghouse pressurized water reactors that began operating in the early 1970s. Built next to two fossil-fired generating units, Units 3 and 4 each add about 875 megawatts of nuclear-generated electricity to the power grid.

Both reactors hummed along at full power on the morning of Saturday, March 18, 2017, when problems arose.

The Event

At 11:07 am, a high energy arc flash (HEAF) in Cubicle 3AA06 of safety-related Bus 3A ignited a fire and caused an explosion. The explosion inside the small concrete-walled room (called Switchgear Room 3A) injured a worker and blew open Fire Door D070-3 into the adjacent room (called Switchgear Room 3B) housing the safety-related Bus 3B.

A second later, the Unit 3 reactor automatically tripped when Reactor Coolant Pump 3A stopped running. This motor-driven pump received its electrical power from Bus 3A. The HEAF event damaged Bus 3A, causing the reactor coolant pump to trip on under-voltage (i.e., less than the desired voltage of 4,160 volts). The pump’s trip triggered the insertion of all control rods into the reactor core, terminating the nuclear chain reaction.

A second after that, Reactor Coolant Pumps 3B and 3C also stopped running. These motor-driven pumps received electricity from Bus 3B. The HEAF event should have been isolated to Switchgear Room 3A, but the force of the explosion blew open the connecting fire door, allowing Bus 3B to be affected as well. Reactor Coolant Pumps 3B and 3C tripped on under-frequency (i.e., alternating current significantly below the desired 60 cycles per second). Each Turkey Point unit has three Reactor Coolant Pumps that force the flow of water through the reactor core, out of the reactor vessel to the steam generators (where heat gets transferred to a secondary loop of water), and then back to the reactor vessel. With all three pumps turned off, the reactor core would be cooled by natural circulation. Natural circulation can remove small amounts of heat, but not larger amounts; hence, the reactor automatically shuts down when even one of its three Reactor Coolant Pumps is not running.

Shortly before 11:09 am, the operators in the control room received word about a fire in Switchgear Room 3A and the injured worker. The operators dispatched the plant’s fire brigade to the area. At 11:19 am, the operators declared an emergency due to a “Fire or Explosion Affecting the Operability of Plant Systems Required to Establish or Maintain Safe Shutdown.”

At 11:30 am, the fire brigade reported to the control room operators that there was no fire in either Switchgear Room 3A or 3B.

Complication #1

The Switchgear Building is at the right end of the Unit 3 turbine building in Figure 1. Switchgear Rooms 3A and 3B are located adjacent to each other within the Switchgear Building. The safety-related buses inside these rooms take 4,160-volt electricity from the main generator, the offsite power grid, or an emergency diesel generator (EDG) and supply it to safety equipment needed to protect workers and the public from transients and accidents. Buses 3A and 3B are fully redundant; either can power enough safety equipment to mitigate accidents.

Fig. 1 (Source: Nuclear Regulatory Commission)

To guard against a single fire disabling both Bus 3A and Bus 3B despite their proximity, each switchgear room is designed as a 3-hour fire barrier. The floor, walls, and ceiling of each room are made from reinforced concrete. The opening between the rooms has a normally closed door with a 3-hour fire resistance rating.

Current regulatory requirements do not require the room to have blast-resistant fire doors unless the doors are within 3 feet of a potential explosive hazard. (I could give you three guesses why all the values are 3’s, but a correct guess would divulge one-third of nuclear power’s secrets.) Cubicle 3AA06, where the HEAF event occurred, was 14.5 feet from the door.

Fire Door D070-3, presumably unaware that it was well outside the 3-foot danger zone, was blown open by the HEAF event. The opened door created the potential for one fire to disable both Bus 3A and Bus 3B, plunging the site into a station blackout. Fukushima reminded the world why it is best to stay out of the station blackout pool.

Complication #2

The HEAF event activated all eleven fire detectors in Switchgear Room 3A and both of the very early warning fire detectors in Switchgear Room 3B. Activation of these detectors sounded alarms at Fire Alarm Control Panel 3C286, which the operators acknowledged. These detectors comprise part of the plant’s fire detection and suppression systems, intended to extinguish fires before they cause enough damage to undermine nuclear safety margins.

But workers failed to reset the detectors and restore them to service until 62 hours later. Bus 3B provided the only source of electricity to safety equipment after Bus 3A was damaged by the HEAF event. The plant’s fire protection program required that Switchgear Room 3B be protected by the full array of fire detectors or by a continuous fire watch (i.e., workers assigned to the area to immediately report signs of smoke or fire to the control room). The fire detectors were out of service for 62 hours after the HEAF event, and the continuous fire watches were put in place late.

Workers were in Switchgear Room 3B for nearly four hours after the HEAF event performing tasks like smoke removal. But after they left the area, a continuous fire watch was not posted until 1:15 pm on March 19, the day following the HEAF event. And those fire watches were placed in Switchgear Room 3A, not in Switchgear Room 3B, which housed the bus that needed to be protected.
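To illustrate that compliance logic, here is a minimal sketch, assuming hourly sampling and approximate times taken from the account above; it checks, hour by hour, whether the room was covered by in-service detectors or a posted fire watch:

    from datetime import datetime, timedelta

    # Minimal sketch (hypothetical helper, not plant software): the program
    # required coverage at all times by in-service detectors OR a continuous
    # fire watch. Times below are approximations from the account above.
    def covered(t, windows):
        """Return True if time t falls inside any (start, end) window."""
        return any(start <= t < end for start, end in windows)

    heaf = datetime(2017, 3, 18, 11, 7)          # 11:07 am, March 18, 2017
    end = datetime(2017, 3, 21, 1, 7)            # detectors restored ~62 hours later

    detectors = [(end, datetime(2017, 3, 22))]   # back in service after 62 hours
    fire_watch = [
        (heaf, heaf + timedelta(hours=4)),       # workers present (smoke removal)
        (datetime(2017, 3, 19, 13, 15), datetime(2017, 3, 22)),  # watch posted late
    ]

    t, gaps = heaf, []
    while t < end:
        if not (covered(t, detectors) or covered(t, fire_watch)):
            gaps.append(t)
        t += timedelta(hours=1)

    print(f"~{len(gaps)} unprotected hours; the first began around {gaps[0]}")

As noted above, even the late fire watch was posted in the wrong room, so actual coverage was worse than this sketch suggests.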

Had a fire started in Switchgear Room 3B, neither the installed fire detectors nor the human fire detectors would have alerted control room operators. The lights going out on Broadway, or whatever they call the main avenue at Turkey Point, might have been their first indication.

Complication #3

At 12:30 pm on March 18, workers informed the control room operators that the HEAF event had damaged Bus 3A such that it could not be re-energized until repairs were completed. Bus 3A provided power to Reactor Coolant Pump 3A and to other safety equipment like the ventilation fan for the room containing EDG 3A. Due to the loss of power to the room’s ventilation fan, the operators immediately declared EDG 3A inoperable.

EDGs 3A and 3B are the onsite backup sources of electrical power for safety equipment. When the reactor is operating, the equipment is powered by electricity produced by the main generator as shown by the green line in Figure 2. When the reactor is not operating, electricity from the offsite power grid flows in through transformers and Bus 3A to the equipment as indicated by the blue line in Figure 2. When under-voltage or under-frequency is detected on their respective bus, EDG 3A and 3B will automatically start and connect to the bus to supply electricity for the equipment as shown by the red line in Figure 2.

Fig. 2 (Source: Nuclear Regulatory Commission with colors added by UCS)

Very shortly after the HEAF event, EDG 3A automatically started due to under-voltage on Bus 3A. But protective relays detected a fault on Bus 3A and prevented electrical breakers from closing to connect EDG 3A to Bus 3A. EDG 3A was operating, but disconnected from Bus 3A, when the operators declared it inoperable at 12:30 pm due to loss of the ventilation fan for its room.
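The power-supply logic described above (and color-coded in Figure 2) can be summarized in a short sketch. The function and relay setpoints are invented for illustration; only the 4,160-volt and 60-cycle nominal values come from the event description:

    # Illustrative sketch of the supply paths shown in Figure 2; the setpoints
    # are invented, not the plant's actual relay settings.
    NOMINAL_VOLTS = 4160
    NOMINAL_HZ = 60

    def bus_supply(reactor_operating, volts, hz, bus_fault):
        """Pick the source feeding a safety-related bus (Figure 2's three paths)."""
        under_voltage = volts < 0.9 * NOMINAL_VOLTS    # illustrative setpoint
        under_frequency = hz < 0.95 * NOMINAL_HZ       # illustrative setpoint
        if under_voltage or under_frequency:
            # The EDG auto-starts, but protective relays block the output
            # breaker if a fault exists on the bus -- this is what kept EDG 3A
            # running yet disconnected from Bus 3A after the HEAF event.
            return "EDG (blocked by bus fault)" if bus_fault else "EDG (connected)"
        return "main generator" if reactor_operating else "offsite grid"

    print(bus_supply(True, 4160, 60, bus_fault=False))  # normal full power
    print(bus_supply(False, 0, 0, bus_fault=True))      # Bus 3A after the HEAF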

But the operators allowed “inoperable” EDG 3A to continue operating until 1:32 pm. Given that (a) its ventilation fan was not functioning, and (b) it was not even connected to Bus 3A, they should not have permitted this inoperable EDG to keep running for over an hour.

Complication #4

A few hours before the HEAF event on Unit 3, workers removed High Head Safety Injection (HHSI) pumps 4A and 4B from service for maintenance. The HHSI pumps are designed to transfer makeup water from the Refueling Water Storage Tank (RWST) to the reactor vessel during accidents that drain cooling water from the vessel. Each unit has two HHSI pumps; only one HHSI pump needs to function in order to provide adequate reactor cooling until the pressure inside the reactor vessel drops low enough to permit the Low Head Safety Injection pumps to take over.

The day before, workers had found a small leak in a test line downstream of the common pipe for the recirculation lines of HHSI Pumps 4A and 4B (circled in orange in Figure 3). The repair work was estimated to take 18 hours. Both pumps had to be isolated in order for workers to repair the leaking section.

Pipes cross-connect the HHSI systems for Units 3 and 4 such that HHSI Pumps 3A and 3B (circled in purple in Figure 3) could supply makeup cooling water to the Unit 4 reactor vessel when HHSI Pumps 4A and 4B were removed from service. The operating license allowed Unit 4 to continue running for up to 72 hours in this configuration.

Fig. 3 (Source: Nuclear Regulatory Commission with colors added by UCS)

Before removing HHSI Pumps 4A and 4B from service, operators took steps to protect HHSI Pumps 3A and 3B by further restricting access to the rooms housing them and posting caution signs at the electrical breakers supplying electricity to these motor-driven pumps.

But operators did not protect Buses 3A and 3B that provide power to HHSI Pumps 3A and 3B respectively. Instead, they authorized work to be performed in Switchgear Room 3A that caused the HEAF event.

The owner uses a computer program to characterize risk of actual and proposed plant operating configurations. Workers can enter components that are broken and/or out of service for maintenance and the program bins the associated risk into one of three color bands: green, yellow, and red in order of increasing risk. With only HHSI Pumps 4A and 4B out of service, the program determined the risk for Units 3 and 4 to be in the green range. After the HEAF event disabled HHSI Pump 3A, the program determined that the risk for Unit 4 increased to nearly the green/yellow threshold while the risk for Unit 3 moved solidly into the red band.
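As a rough illustration of this kind of color binning, here is a minimal sketch. The baseline risk, multipliers, and band thresholds are all invented; the owner’s actual program is not described at this level of detail:

    # Hypothetical sketch of configuration risk binning; every number here is
    # invented for illustration.
    BASELINE_CDF = 1e-5               # assumed baseline core damage frequency/year

    RISK_MULTIPLIER = {               # assumed multipliers for out-of-service gear
        "HHSI Pump 4A": 1.2,
        "HHSI Pump 4B": 1.5,
        "Bus 3A": 100.0,              # losing a safety bus dominates the risk
    }

    def configuration_risk(out_of_service):
        """Return (risk, color band) for a set of out-of-service components."""
        cdf = BASELINE_CDF
        for component in out_of_service:
            cdf *= RISK_MULTIPLIER.get(component, 1.0)
        if cdf < 1e-4:
            return cdf, "green"
        if cdf < 1e-3:
            return cdf, "yellow"
        return cdf, "red"

    # Planned maintenance alone bins green; adding the HEAF-damaged bus pushes
    # the configuration into red, loosely mirroring the Unit 3 result above.
    print(configuration_risk(["HHSI Pump 4A", "HHSI Pump 4B"]))
    print(configuration_risk(["HHSI Pump 4A", "HHSI Pump 4B", "Bus 3A"]))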

The Cause(s)

On the morning of Saturday, March 18, 2017, workers were wrapping a fire-retardant material called Thermo-Lag around electrical cabling in the room housing Bus 3A. Meshing made from carbon fibers was installed to connect sections of Thermo-Lag around the cabling for a tight fit. To minimize the amount of debris created in the room, workers cut the Thermo-Lag material to the desired lengths at a location outside the room, about 15 feet away. But they cut and trimmed the carbon fiber mesh to size inside the room.

Bus 3A is essentially the nuclear-sized equivalent of a home’s breaker panel. Open the panel and one can open a breaker to stop the flow of electricity through that electrical circuit within the house. Bus 3A is a large metal cabinet. The cabinet is made up of many cubicles housing the electrical breakers controlling the supply of electricity to the bus and the flow of electricity to components powered by the bus. Because energized electrical cables and components emit heat, the metal doors of the cubicles often have louvers to let hot air escape.

The louvers also allow dust and small airborne debris (like pieces of carbon fiber) to enter the cubicles. The violence of the HEAF event (a.k.a. the explosion) destroyed some of the evidence at the scene, but carbon fiber pieces were found inside the cubicle where the HEAF occurred. The carbon fiber was conductive, meaning that it could carry electrical current. Carbon fiber pieces inside the cubicle, according to the NRC, “may have played a significant factor in the resulting bus failure.”

Further evidence inside the cubicle revealed that the bolts for the connection of the “C” phase to the bottom of the panel had been installed backwards. These backwards bolts were the spot where high-energy electrical current flashed over, or arced, to the metal cabinet.

As odd as it seems, installing fire retardant materials intended to lessen the chances that a single fire compromises both electrical safety systems started a fire that compromised both electrical safety systems.

The Precursor Events (and LEAF)

On February 2, 2017, three electrical breakers unexpectedly tripped open while workers were cleaning up after removing and replacing thermal insulation in the new electrical equipment room.

On February 8, 2017, “A loud bang and possible flash were reported to have occurred” in the new electrical equipment room as workers were cutting and installing Thermo-Lag. Two electrical breakers unexpectedly tripped open. The equipment involved used 480 volts or less, making this a low energy arc fault (LEAF) event.

NRC Sanctions

The NRC dispatched a special inspection team to investigate the causes and corrective actions of this HEAF event. The NRC team identified the following apparent violations of regulatory requirements that the agency is processing to determine the associated severity levels of any applicable sanctions:

  • Failure to establish proper fire detection capability in the area following the HEAF event.
  • Failure to properly manage risk by allowing HHSI Pumps 4A and 4B to be removed from service and then allowing work inside the room housing Bus 3A.
  • Failure to implement effective Foreign Material Exclusion measures inside the room housing Bus 3A that enabled conductive particles to enter energized cubicles.
  • Failure to provide adequate design control in that equipment installed inside Cubicle 3AA06 did not conform to vendor drawings or engineering calculations.

UCS Perspective

This event illustrates both the lessons learned and the lessons unlearned from the fire at the Browns Ferry Nuclear Plant in Alabama that happened almost exactly 42 years earlier. The lesson learned was that a single fire could disable primary safety systems and their backups.

The NRC adopted regulations in 1980 intended to lessen the chances that one fire could wreak so much damage. The NRC found in the late 1990s that most of the nation’s nuclear power reactors, including those at Browns Ferry, did not comply with these fire protection regulations. The NRC amended its regulations in 2004 giving plant owners an alternative means for managing the fire hazard risk. Workers were installing fire protection devices at Turkey Point in March 2017 seeking to achieve compliance with the 2004 regulations because the plant never complied with the 1980 regulations.

The unlearned lesson involved sheer and utter failures to take steps after small miscues to prevent a bigger miscue from happening. The fire at Browns Ferry was started by a worker using a lit candle to check for air leaking around sealed wall penetrations. The candle’s flame ignited the highly flammable sealant material. The fire ultimately damaged cables for all the emergency core cooling systems on Unit 1 and most of those systems on Unit 2. Candles had routinely been used at Browns Ferry and other nuclear power plants to check for air leaks. Small fires had been started, but had always been extinguished before causing much damage. So the unsafe and unsound practice continued until it very nearly caused two reactors to melt down. Then and only then did the nuclear industry change to a method that did not put open flames next to highly flammable materials to see if air flow caused the flames to flicker.

Workers at Turkey Point were installing fire retardant materials around cabling. They cut some material in the vicinity of its application. On two occasions in February 2017, small debris caused electrical breakers to trip open unexpectedly. But they continued the unsafe and unsound practice until it caused a fire and explosion the following month that injured a worker and risked putting the reactor into a station blackout event. Then and only then did the plant owner find a better way to cut and install the material. That must have been one of the easiest searches in nuclear history.

The NRC – Ahead of this HEAF Curveball

The NRC and its international regulatory counterparts have been concerned about HEAF events in recent years. During the past two annual Regulatory Information Conferences (RICs), the NRC conducted sessions about fire protection research that covered HEAF. For example, the 2016 RIC included presentations from the Japanese and American regulators about HEAF. These presentations included videos of HEAF events conducted under lab conditions. The 2017 RIC included presentations about HEAF by the German and American regulators. Ironically, the HEAF event at Turkey Point occurred just a few days after the 2017 RIC session.

HEAF events were not fully appreciated when regulations were developed and plants were designed and built. The cooperative international research efforts are defining HEAF events faster than could be accomplished by any country alone. The research is defining factors that affect the chances and consequences of HEAF events. For example, the research indicates that aluminum, such as in the cable trays holding energized electrical cables, can ignite during a HEAF event, significantly adding to the magnitude and duration of the event.

As HEAF research has defined risk factors, the NRC has been working with nuclear industry representatives to better understand the role these factors may play across the US fleet of reactors. For example, the NRC recently obtained a list of where aluminum is used around high voltage electrical equipment.

The NRC needs to understand HEAF factors as fully as practical before it can determine if additional measures are needed to manage the risk. The NRC is also collecting information about potential HEAF vulnerabilities. Collectively, these efforts should enable the NRC to identify any nuclear safety problems posed by HEAF events and to implement a triaged plan that resolves the biggest vulnerabilities sooner rather than later.

New World Heritage Sites Already Under Threat From Climate Change

UCS Blog - The Equation (text only) -

At least four of the new World Heritage sites designated by UNESCO at the annual meeting of the World Heritage Committee this week are under serious threat from climate change.

In all, 21 new sites were added to the World Heritage list, and although most are not immediately vulnerable to climate change, probably all are already experiencing local climatic shifts, and most will be significantly impacted within a few decades unless action is taken soon to reduce heat-trapping emissions globally. Climate change is a fast-growing problem for World Heritage and one that the World Heritage Committee needs to take much more seriously than it currently does.

Climate is the biggest global threat to World Heritage

In 2014, the International Union for the Conservation of Nature (IUCN) identified climate change as the biggest potential threat to natural World Heritage sites and a study by the Potsdam Institute for Climate Impact Research and the University of Innsbruck in Austria found 136 of 700 cultural World Heritage sites to be at long-term risk from sea level rise. In 2016, a joint UCS, UNESCO, UNEP report concluded that “climate change is fast becoming one of the most significant risks for World Heritage worldwide”. This year, UNESCO launched two new reports highlighting the dramatic climate threat to coral reefs in World Heritage sites, and to sites in the Arctic.

The World Heritage Committee needs to address climate change

There is a dilemma here. The World Heritage Convention is a remarkable international instrument that was set up to identify and protect both natural and cultural sites of “outstanding universal value” for future generations. However, when the convention was adopted in 1972, the threat of global climate change was nowhere on political or scientific radar screens, and so the mechanisms of the treaty were geared to preventing local impacts such as water pollution, mining and quarrying, infrastructure development, and land use change.

The convention hasn’t yet effectively responded to modern climate change risks. If a World Heritage site is threatened by coal mining, tourism pressure or suburbanization, it can be placed on the list of sites in danger, and then the responsibility lies with the host country to implement management actions reducing the threat. But no site has yet been placed on that list because of climate change.

Meanwhile, places at serious risk from climate change are still being added as new World Heritage sites. UCS plans to work with UNESCO’s two primary international non-profit technical advisors, IUCN and ICOMOS (International Council on Monuments and Sites) to address this issue at next year’s World Heritage Committee meeting.

Four newly designated World Heritage sites vulnerable

Here are the four newly designated sites already being impacted by climate change:

Lake District, United Kingdom

The Lake District. Photo: Adam Markham

A spectacular landscape of glaciated valleys and lakes, this region was the cradle of the English Romantic movement led by the poets William Wordsworth and Samuel Taylor Coleridge, and home to the authors Beatrix Potter and John Ruskin. Its agro-pastoral landscape dotted with hill farms and stone walls is the result of hundreds of years of sheep farming, and the Lake District is now one of Britain’s most popular tourism destinations.

Unfortunately, the area is already experiencing warmer, wetter winters and more intense extreme weather events. Disastrous floods in 2009 washed away old bridges and footpaths, and unprecedented drought in 2010-12 affected water supply and water quality in lakes and rivers. Conservation managers predict that species at the edge of their ranges in the Lake District, including cold-water fish such as the Arctic char, could become locally extinct, peat habitats may dry out, woodland species composition will change, and invasive alien species like Japanese knotweed will proliferate in response to changing conditions.

Kujataa, Greenland (Denmark)

Ruined Norse buildings at Kujataa. Photo: UNESCO/Garðar Guðmundsson

Kujataa in southern Greenland holds archaeological evidence of the earliest introduction of farming to the Arctic by Norse settlers from Iceland, as well as of earlier hunter-gatherers.

Today, it’s an exceptionally well preserved cultural landscape of fields and pastures from medieval times through to the 18th Century, representing a combination of Norse and Inuit subsistence farming and sea mammal hunting. However, in common with the rest of Greenland, the area is experiencing a rapidly warming climate.

Coastal erosion exacerbated by sea level rise and more intense storms can damage historic monuments and archaeology. Elsewhere in Greenland, warming temperatures have been shown to hasten decomposition of organic material at archaeological sites, including wood, fabrics and animal skins – a growing problem throughout the Arctic. Warming at Kujataa is also expected to increase the growth of shrubby vegetation and alter agricultural cycles, potentially necessitating changes in cropping strategies by local farmers.

Landscapes of Dauria, Mongolia & Russian Federation

Daurien steppe wetlands. Photo: UNESCO/O.Kirilyu

This new transboundary World Heritage site covers a huge area of undisturbed steppe, and is a globally important ecoregion. Home to nomadic Mongolian herders who have used the grasslands for over 3,000 years, the Daurian steppes are also rich in biodiversity. They are important for millions of migratory birds and home to almost all the world’s Mongolian gazelle population as well as threatened species such as the red-crowned crane and swan goose.

According to a climate impacts assessment by IUCN, the mean annual temperature of the region has already risen by 2°C and further climate change is expected to bring longer and more severe droughts, reducing grassland productivity and changing wetlands dramatically in what is already a landscape of cyclical weather extremes. Desertification and wildfires worsened by climate change are adding further environmental pressures.

‡Khomani Cultural Landscape, Republic of South Africa

‡Khomani San cultural heritage has at last been recognized. Photo: UNESCO/Francois Odendaal Productions

The ‡Khomani San (or Kalahari bushmen) are the indigenous first people of the Kalahari Desert, but they were forced from their land when the Kalahari Gemsbok National Park (now part of the Kgalagadi Transfrontier Park) was created in 1931. The displacement led to dispersion of the ‡Khomani San people through South Africa, Namibia, and Botswana and almost killed off many traditional cultural practices as well as ancient languages such as N|u.

After apartheid ended, the San were successful in settling a land claim and the new World Heritage site, which coincides with the boundaries of the national park, recognizes their thousands of years of traditional use of this land, their close connection to its natural systems and their right to co-manage the preserve.

Unfortunately, climate change presents a new challenge. The Intergovernmental Panel on Climate Change (IPCC) has projected accelerated warming and a drying trend for this area of southern Africa. In recent decades, conversion of grassland into savanna with more un-vegetated soil has been reported, and increased desertification is a growing threat. Kalahari Gemsbok National Park is the fastest-warming park in South Africa, and scientists have recorded a rise in mean maximum annual temperature there of nearly 2°C since 1960.

 

President Trump’s Budget Leaves Workers Behind

UCS Blog - The Equation (text only) -

Budgets reflect priorities; they also reflect values. And the Trump Administration has signaled where it stands loud and clear via its agency appointments (Scott Pruitt, need we say more?) and its FY18 budget proposals. We have already said plenty about what the proposed cuts to the EPA budget mean for public health and the environment.

A recap here, here, here, here. Many others are also ringing that alarm bell (here, here, here).

Less in the public eye is the Administration’s budget proposals for agencies that protect another critical resource—our nation’s workforce! We do have some indication of where Congress and the Administration stand on worker health and safety (here, here)—and it’s not reassuring.

Trump budget puts worker health on chopping block

Let’s cut to the chase. President Trump’s FY18 budget proposals are not good for working people; these are our loved ones, our families’ breadwinners. They are also essential contributors to powering our economy…you know, making America great.

Here’s a quick snapshot of the cuts our President has proposed for our primary worker health and safety agencies—the agencies that safeguard and protect our nation’s workforce:

  • Occupational Safety and Health Administration (OSHA). $9.5 million budget cut; staffing cuts in enforcement program; elimination of safety and health training grants for workers. OSHA was created by Congress to “assure safe and healthful working conditions for working men and women.” It is our nation’s bulwark in protecting workers by setting and enforcing standards and providing training, outreach, education and assistance to employers and workers. At current budget levels, OSHA can only inspect every workplace in the United States once every 159 years.
  • National Institute for Occupational Safety and Health (NIOSH). An astounding 40% budget cut. NIOSH is our nation’s primary federal agency responsible for conducting research, transferring that knowledge to employers and workers, and making recommendations for the prevention of work-related illness and injury. These draconian cuts will essentially eliminate academic programs that train occupational health and safety professionals (occupational medicine physicians and nurses, industrial hygienists, workplace safety specialists) that serve both employers and workers. It will eliminate extramural research programs that conduct, translate, or evaluate research, as well as state surveillance programs for occupational lead poisoning, silicosis, and other diseases.
  • Mine Safety and Health Administration (MSHA). $3 million cut to the agency’s budget on top of a previous $8 million cut. This will reduce the number of safety inspections in U.S. coal mines by nearly 25%. MSHA was established in 1977 to prevent death, illness, and injury from mining and to promote safe and healthful workplaces for U.S. miners. (The first federal mine safety statute was passed in 1891.)
Some context

My reflections on this year’s Worker Memorial Day pretty much capture it. But here’s a quick summary:

  • In 2015, 4,836 U.S. workers died from work-related injuries, the highest number since 2008. That’s about 13 people every day! In the United States!
  • Deaths from work-related occupational disease—like silicosis, coal workers’ pneumoconiosis (black lung), occupational cancer, etc.—are not well captured in data surveillance systems. It is estimated that another 50,000-60,000 died from occupational diseases—an astounding number. And, for many, their deaths come years after suffering debilitating and painful symptoms.
  • And then there are the nonfatal injuries and illnesses. Employers reported approximately 2.9 million of them among private industry workers in 2015; another 752,600 injury and illness cases were reported among the approximately 18.4 million state and local government workers.
  • There were nine fatalities and 1,260 reportable cases of workplace injury in the US coal mining industry in 2016.
Speak out opportunity this week

The House subcommittee on Labor–HHS Appropriations has scheduled the markup of the FY 2018 Labor–HHS funding bill for Thursday, July 13, 2017. This is the bill that funds OSHA, MSHA, and NIOSH, as well as the National Institute for Environmental Health Sciences (NIEHS) and the National Labor Relations Board (NLRB). Now is the time to give the appropriators an earful on these proposed cuts—cuts that seriously endanger workers’ safety and health, essentially leaving them behind. Reach out to members of the House Appropriations subcommittee and full committee and urge them to oppose these cuts to our worker health and safety agencies. Also urge them to oppose any “poison pill riders” that would block or delay the implementation of worker protection rules.

Here’s a list of members of the Labor–HHS subcommittee. Members of the full Appropriations Committee are listed here.

How the Senate Healthcare Bill Bolsters the Tanning Industry’s Misinformation Campaign

UCS Blog - The Equation (text only) -

The American Suntanning Association (ASA) and the Indoor Tanning Association (ITA) are trade organizations representing the interests of indoor tanning manufacturers, suppliers, and salon owners. The product that these trade organizations sell to customers is artificial UV radiation. The ASA has called itself a “science-first organization” and spouts off so-called scientific information on its website, TanningTruth.com, designed to correct “misinformation” about the harms of indoor tanning.

One problem, though: the science doesn’t support their position. In May 2013, several scientific researchers wrote in JAMA Dermatology about the ASA’s biased scientific agenda and how its unscientific messages can negatively impact the public: “Clinicians should be aware of this new counter-information campaign by the [indoor tanning] industry and continue to inform their patients about the risks of [indoor tanning] and the existence of potentially misleading information from the ASA and other organizations. Scientists and clinicians have a duty to remain cognizant of such issues and to voice concerns when agenda-based research is presented in order to “defend and promote” a product with potentially devastating health consequences.”

Like the tobacco, sugar, and fossil fuel industries before it, the indoor tanning industry is refusing to accept its ethical responsibility to inform customers of the harms of its products. Instead it is actively working to create uncertainty around the science and question the integrity of the governmental and scientific institutions that are acting in the public’s best interest.

The indoor tanning industry seeks to become “great again”

As President Trump took office, the ASA began running the derivative slogan “Make Indoor Tanning Great Again” in the industry’s monthly magazine, SmartTan. By “great again,” the industry means repealing the Affordable Care Act’s (ACA) 10% tax on indoor tanning services. In its advertisement, the ASA urges readers to join the movement and add to its effort of “building relationships with key policymakers and educating the federal government about our industry’s scientifically supported position.” In the June issue of SmartTan, in an article titled “Axing the Tan Tax,” the author brags that three full-time ASA lobbyists over the course of four years have convinced Congressional leadership (including current HHS chief Tom Price and Vice President Mike Pence) that the American tanning industry has been treated unfairly and that there is science supporting the benefits of non-burning UV exposure.

I’ll let The American Academy of Dermatology (AAD) take this one.

The AAD recommends that individuals obtain vitamin D from a healthy diet, not from unprotected exposure to ultraviolet radiation, because there is “no scientifically validated, safe threshold of UV exposure from the sun or indoor tanning devices that allows for maximal vitamin D synthesis without increasing skin cancer risk.” Anyway, even if there were benefits of UV exposure, the business model of these salons operates on the retention of customers throughout the year, not just during the busy season. And more trips to the salon mean an increased risk of burns and unsafe exposure.

The tanning tax has a twofold goal: to help reduce skin cancer risk, especially in young adults, and to raise funds to help pay for implementation of the Act. The science on the association between indoor tanning and increased risk of skin cancer supported the case for the inclusion of this policy measure in President Obama’s healthcare bill. A quick spotlight on that science can be summed up by the headline on the Centers for Disease Control and Prevention (CDC) web page addressing the topic: “indoor tanning is not safe.”

The Surgeon General issued a call to action to prevent skin cancer in 2014 which warned of the risks of indoor tanning. Citing evidence from years of international research on the relationship between indoor tanning and skin cancer, the International Agency for Research on Cancer (IARC), affiliated with the World Health Organization, has placed this type of UVR in its most dangerous cancer category for humans, alongside offenders such as radon, asbestos, cigarettes, plutonium, and solar UVR. Incidence of one of the most dangerous types of skin cancer, melanoma, has been rising over the past 30 years. And people who started tanning before age 35 have an increased risk of developing melanoma. Because indoor tanning is popular among adolescents and it’s a risky behavior to start while young, restricting indoor tanning would help public health outcomes.

The FDA has proposed a rule restricting indoor tanning bed use to adults over 18 and requiring users to sign a risk acknowledgement certification stating that they understand the risks of indoor tanning. Taxes are also an effective way to curb demand for a harmful product, an approach that has worked to decrease cigarette smoking and, more recently, sugar-sweetened beverage consumption. However, the tanning industry has joined the ranks of the tobacco and sugar industries to fuel a misinformation campaign designed to sow doubt about the body of science showing harm and to delay or quash policies that hurt its bottom line.

Shining a light on the tanning industry’s misinformation campaign

The tanning industry has long fought any regulation or control of its messaging and has funneled money to members of Congress to help in these efforts.

Burt Bonn, immediate past president of the ASA, told Smart Tan magazine in February 2017 that “[t]he science has been in our favor from the very beginning…Our opponents have relied on just a few out of hundreds of studies on the risks of UV light to make their case. Nearly all of those studies have been debunked.” He continued, “I think the science is already at a point that it ought to be embarrassing to have someone in the medical profession advise complete and total sunlight abstinence or suggest that a tanning bed operated in a professional tanning salon is a major issue.” Embarrassing? Tell that to the American Academy of Dermatology, the American Academy of Pediatrics, and the American Medical Association, representing thousands of experts in their fields who have recommended that all adults reduce UV exposure, and that children under 18 eliminate UV exposure altogether.

The tanning industry has been on fire in its repeated attempts to misrepresent the science on UV exposure in multiple venues. In 2010, the Federal Trade Commission (FTC) charged that the ITA made misleading representations in its advertising and marketing for indoor tanning, including falsely claiming that indoor tanning poses no risk to health and no risk of skin cancer. The 2010 administrative order from the FTC prohibits the ITA from making representations that imply that indoor tanning does not increase the risk of skin cancer. In March of 2017, the FTC wrote to the ITA informing it that the claims on its website’s FAQ page that “indoor tanning [was] more responsible than outdoor tanning” and that “melanoma was not associated with UV exposure from tanning bed[s]” were not allowed.

Apparently, the failure to remove that language since 2010 was an oversight by the ITA, but those non-scientific bits of information had persisted on its website, and had been picked up and used on third-party websites, for years! Not only are the indoor tanning industry trade associations still distributing their own unscientific materials to convince users of their products’ safety, but they are using these same arguments to attempt to meddle with public messaging and federal policy at the CDC and FDA and in their lobbying for tanning tax relief since 2011.

According to the tanning industry’s January 2015 issue of Smart Tan, the ASA’s legal and lobbying teams succeeded in getting the CDC to remove claims of a 75% increase in melanoma risk from sunbed use from its website, and “the ASA legal team is following appropriate CDC protocol to challenge even more language that we believe is not supported by sound science.” ASA’s Burt Bonn told the industry magazine that the previous leadership of the CDC and the Surgeon General were unwilling to consider the appropriate scientific evidence regarding indoor tanning and seems hopeful that the new directors will be more “open-minded.” He continued, “The federal government is currently treating tanning like smoking, when there’s no science to support that ridiculous comparison. The consumer advocacy campaigns need to stop…The science is overwhelmingly supportive of sunlight and human health, and we currently have an administration that seems to be driven more by politics than current science. But now they are gone and change is coming.” It should not surprise us that the Administration that coined the moronic oxymoron “alternative facts” would be supportive of an industry wed to the abusive use of such things.

The ASA and ITA’s 2016 comment to the FDA opposed its proposed rule that would restrict the sale, distribution, and use of indoor tanning beds to minors on the basis that the science that the agency relied upon is outdated and doesn’t support the “onerous requirements” of the FDA’s rule.

Section 118 of the Better Care Reconciliation Act of 2017 would repeal the 10% excise tax on indoor tanning services established by the 2010 Affordable Care Act. According to the Joint Committee on Taxation, its repeal would reduce revenues by approximately $622 million over ten years and would drastically reduce funding to implement the Affordable Care Act.

There have also been several failed Congressional attempts to repeal the ACA tanning tax, lobbied for by none other than the ITA. In 2015 and 2017, Representative George Holding introduced a bill to repeal the tanning tax after receiving over $6,000 from the Indoor Tanning Association. Representative Michael Grimm had introduced a similar bill in 2011 and 2014 and received over $8,000 in 2012.  And now, the indoor tanning industry is cheering the introduction of the provision within the draft Senate healthcare bill, the Better Care Reconciliation Act of 2017, that would repeal the tax.

The truth about indoor tanning risks must inform federal policy

In spite of the tanning industry’s best efforts so far, the government’s messaging on the risks of indoor tanning and the policy measures instituted to reduce its use are working. CDC data from the high school Youth Risk Behavior Surveillance System found that the share of high school students who used an indoor tanning device decreased by more than half since the ACA and the tanning tax went into effect, from 15.6% in 2009 to 7.3% in 2015. In light of the great body of scientific evidence demonstrating the risks of UV exposure, we should be doing even more to educate and protect teenagers and adults alike from the harms of UV exposure, rather than rolling back important policies aimed at accomplishing just that.

The Senate healthcare bill ignores the science on health impacts of indoor tanning and caves to the tanning industry and their misinformation campaign. It is currently opposed by nearly all scientific voices that work in healthcare. The American Academy of Dermatology opposes the repeal of the tanning tax provision, and the full bill is opposed by the American Medical Association, the American Hospital Association, the Federation of American Hospitals, the American Academy of Pediatrics, the American College of Physicians, the American Academy of Family Physicians, the Association of American Medical Colleges, and the list goes on.

We have the right to make decisions based on accurate scientific information, not information cherry-picked and screened by an industry that stands to profit from our ignorance. We should put the Indoor Tanning Association and the American Suntanning Association in the hot seat (and not the kind you find in a tanning salon) for perpetuating falsehoods about their products that could harm consumers’ health. This goes for the current draft Senate healthcare bill and any other future measures that would limit the amount of information available to consumers about the risks of indoor tanning or otherwise compromise policy solutions aimed at keeping us safe.

Photo: Marco Verch/CC BY 2.0 (Wikimedia)

Historic Treaty Makes Nuclear Weapons Illegal

UCS Blog - All Things Nuclear (text only) -

Remember this day, July 7, 2017. Today, history was made at the United Nations: the nuclear status quo was put on notice as most of the world stood up and said simply, “Enough.”

(Source: United Nations)

Just hours ago, 122 nations and a dedicated group of global campaigners successfully adopted a legally binding international treaty prohibiting nuclear weapons and making it illegal “to develop, test, produce, manufacture, otherwise acquire, possess or stockpile nuclear weapons or other nuclear explosive devices.” Nuclear weapons now join biological and chemical weapons, land mines, and cluster munitions as weapons that are explicitly and completely banned under international law.

Our heartfelt gratitude to all who worked tirelessly to make this moment possible, including the International Campaign to Abolish Nuclear Weapons (ICAN), Reaching Critical Will, the governments of Norway, Mexico, and Austria (which hosted the three international conferences on the Humanitarian Consequences of Nuclear Weapons that inspired this effort) and so many other nations, civil society organizations, scientists, doctors and other public health professionals, and global citizens and activists.

This is a powerful expression of conscience and principle on behalf of humanity from 63 percent of the 193 UN member states—one anchored in the simple truth that nuclear weapons are illegitimate instruments of security. ICAN lays out the imperative quite well:

“Nuclear weapons are the most destructive, inhumane and indiscriminate weapons ever created. Both in the scale of the devastation they cause, and in their uniquely persistent, spreading, genetically damaging radioactive fallout, they are unlike any other weapons. They are a threat to human survival.”

Challenging the status quo is at the heart of most successful mass movements for social and planetary progress. Those who benefit from the status quo never give up easily. Movements to end slavery, give women the right to vote, establish marriage equality in the United States and other examples of momentous social change were first bitterly opposed and derided by opponents as naïve, wrong, out of touch, costly, unachievable, etc.

Nuclear weapons are no different. The United States, Russia, and other nuclear-armed and nuclear “umbrella” states chose not to participate in the ban treaty negotiations and dismissed the effort outright. Indeed, senior officials in the Obama administration spent years doing verbal gymnastics to align the rhetoric of the president who stood up in Prague pledging to work toward “the peace and security of a world free of nuclear weapons” with outright hostility to the ban treaty. To no one’s surprise, the Trump administration has embraced the Obama administration’s plans to perpetuate the nuclear status quo and has forfeited any role or leadership in this critical discussion.

And don’t even get me started about all of the Washington insiders who believe nuclear deterrence will never fail and that we can rely on the sound judgment of a small number of people (most of them men) to prevent global nuclear catastrophe.

The ban treaty effort is meant to provide renewed energy and momentum to the moribund global nuclear disarmament process. It is intended to be a prod to the nuclear-armed signatories of the Nuclear Non-Proliferation Treaty (NPT), which have largely ignored their obligation to pursue nuclear disarmament. It will help revive the NPT and the UN’s Conference on Disarmament, not replace them.

Indeed, most of the world has run out of patience and today they spoke loudly. The treaty will be open for signature in September and one can only hope that this is a true turning point in our effort to save humanity from these most horrible of all weapons.

Tesla Model 3 vs. Chevy Bolt? What You Need to Know Before Buying an Electric Car

UCS Blog - The Equation (text only) -

It’s 90 degrees here in our nation’s capital, but it might feel like the winter holiday season to those who reserved a Tesla Model 3. Expected to have a 215-mile range and a sticker price of $35,000 (or $27,500 after the federal tax credit), the Model 3 will compete with the similarly spec’d Chevy Bolt for the prize of cornering the early majority of electric vehicle owners.

No other automaker has a relatively affordable, 200-mile-plus range electric vehicle on the market yet (the next-gen Nissan Leaf will compete too), and one or both of these vehicles may mark a pivotal point in the modern shift to electrics.

Assuming you’re already sold on the benefits of driving on electricity, here are a few tips to consider if you’re prepping for an electric vehicle.

#1 Prepare your home charging

There are two main options for charging an electric vehicle at home: (1) 120V charging from an ordinary home outlet and (2) 240V charging from either an upgraded home circuit or an existing circuit for a heavy electric appliance like a clothes dryer.

There is also DC fast charging, but that is only applicable to charging on the go and is described in more detail below. Before deciding on how to charge, talk with a couple of licensed electricians to better understand your home’s electrical capacity. Mr. Electric appears to win the Google SEO for “electrician for electric vehicle,” so maybe head there for a start.

Electric Vehicle Charging Level 1 (120 volts) – about 4-6 miles of range per hour of charge

  • Uses an ordinary wall outlet just like a toaster.
  • Typically won’t require modifications to electric panels or home wiring.
  • Confirm with a licensed electrician that the circuit you plan to use is at least a 15- or 20-amp, single-pole circuit.
  • Slow, but can get the job done if you don’t drive that much on a daily basis. If you only need 20 miles of range, for example, getting 20 miles of charge each night is not a problem (see the quick sketch after these charging options). For road trips, most EVs are equipped to handle the faster charging options that can make charging pit stops pretty quick.

Electric Vehicle Charging Level 2 (240 volts) – about 10-25 miles of range per hour of charge

  • Installation costs vary, but here’s a 30-amp charger from Amazon that is highly rated and costs around $900, including installation, and here’s one that includes an algorithm to minimize charging emissions and costs.
  • Will likely require a new dedicated circuit from the electric panel to a wall location near the EV parking spot.
  • Consult with a licensed electrician to verify that your home’s electrical panel can accommodate a two-pole, 30- to 50-amp circuit breaker.

Electric Vehicle Charging Level 3 (aka DC fast charging) (400 volts) – Not for home use, but can charge battery up to 80 percent in about 30 minutes

  • The fastest charging method available, but prohibitively expensive for home use.
  • Some vehicles can get an 80 percent full charge in as little as 30 minutes, depending on the electric vehicle type.
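
To put these rates in perspective, here’s a minimal back-of-the-envelope estimator in Python. The miles-of-range-per-hour figures are just midpoints of the rough ranges quoted above, not any vehicle’s actual specs:

# Rough charging-time estimator. The rates below are midpoints of the
# approximate ranges quoted above; real rates vary by vehicle, charger,
# and battery state.

CHARGE_RATES_MPH = {
    "Level 1 (120 V)": 5,   # ~4-6 miles of range per hour of charge
    "Level 2 (240 V)": 18,  # ~10-25 miles of range per hour of charge
}

def hours_to_recover(miles_driven, rate_mph):
    """Hours of charging needed to replace the range used in a day."""
    return miles_driven / rate_mph

daily_miles = 40  # an assumed daily commute; adjust to your own driving
for level, rate in CHARGE_RATES_MPH.items():
    print(f"{level}: {hours_to_recover(daily_miles, rate):.1f} hours "
          f"to recover {daily_miles} miles")

At these assumed rates, a 40-mile day needs about 8 hours on Level 1 (fine overnight) but only a bit over 2 hours on Level 2, which is the trade-off in a nutshell.
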
#2 File your tax credit(s)

Purchasing an electric vehicle should qualify you for a federal tax credit of up to $7,500. Here is all the information, along with the form to fill out when you file taxes. You better file quick because the federal tax credit is capped at 200,000 credits per manufacturer. Some manufacturers, including Nissan and Chevrolet, are forecast to hit the 200,000 cap as early as 2018. If Tesla delivers on its 400,000 Model 3 pre-orders, not every Model 3 owner will be able to take advantage of the full $7,500 savings, so act fast!
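
Strictly speaking, the credit doesn’t vanish the instant a manufacturer hits 200,000 sales; it phases down over the following quarters. Here’s a simplified Python sketch of the phase-out schedule as I understand it from the IRS rules (full credit through the quarter after the trigger, then 50% for two quarters and 25% for two quarters); verify the details against current IRS guidance before counting on them:

# Simplified model of the federal plug-in vehicle credit phase-out.
# Assumption: buyers get the full credit through the quarter after the
# 200,000th qualifying sale, then 50% for two quarters, then 25% for
# two quarters, then nothing. Check current IRS guidance before relying
# on these numbers.

FULL_CREDIT = 7500

def credit_for_quarter(quarters_after_200k):
    """Credit for a delivery N quarters after the 200,000th sale."""
    if quarters_after_200k <= 1:
        return FULL_CREDIT          # trigger quarter plus one grace quarter
    if quarters_after_200k <= 3:
        return FULL_CREDIT * 0.50   # two quarters at half credit
    if quarters_after_200k <= 5:
        return FULL_CREDIT * 0.25   # two quarters at quarter credit
    return 0.0

The practical upshot matches the warning above: a Model 3 delivered a year after Tesla crosses the cap might be worth $3,750 or less in credit rather than the full $7,500.
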

Also check this map to see what additional state incentives you may qualify for.

#3 Locate public charging stations

Tesla has a network of fast charging stations exclusively for Tesla owners, but there are thousands of public charging stations that any electric vehicle driver can use on the go too. You may be surprised to find chargers near your workplace, school, or other frequent destination. Check out this Department of Energy station locator, or this map from PlugShare. The Department of Transportation has also designated several charging corridors that should be getting even more EV chargers.

#4 Contact your utility

Give your utility a heads up that you are getting an electric vehicle, and inquire about any promotional plans for vehicle charging. Some utilities have flexible “time-of-use” rates, meaning that they will charge you less when you plug a vehicle in during off-peak times (typically overnight). Your utility might also have its own electric vehicle incentives, like a rebate on installation or charger costs, or even a pilot project on smart charging where you can get paid to plug in your vehicle.
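
To see why time-of-use rates matter for your wallet, here’s a quick cost-comparison sketch; the rates and the 3.5 miles-per-kWh efficiency are illustrative assumptions, not any particular utility’s tariff:

# Illustrative off-peak vs. on-peak cost of one night's charging.
# Every number below is an assumption for the sake of example.

MILES_PER_KWH = 3.5    # assumed EV efficiency
ON_PEAK_RATE = 0.30    # $/kWh, assumed daytime/evening rate
OFF_PEAK_RATE = 0.12   # $/kWh, assumed overnight rate

def nightly_cost(miles, rate_usd_per_kwh):
    """Cost of replacing a day's driving range at a given rate."""
    kwh_needed = miles / MILES_PER_KWH
    return kwh_needed * rate_usd_per_kwh

daily_miles = 40
print(f"On-peak:  ${nightly_cost(daily_miles, ON_PEAK_RATE):.2f}")
print(f"Off-peak: ${nightly_cost(daily_miles, OFF_PEAK_RATE):.2f}")

Under these assumed rates, shifting the same 40-mile charge to overnight hours cuts the cost from about $3.43 to $1.37, and that difference compounds every night.
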

#5 Say goodbye to internal combustion engines, forever!

Driving on electricity is not only cheaper and cleaner than driving on gasoline, it’s also a total blast. Prepare to never want to go back to gasoline-powered vehicles as you cruise on the smooth, silent power of electricity.

How Trump’s Trade Talks (and Tweets) Got Sickeningly Sweet

UCS Blog - The Equation (text only) -

There are things that raise eyebrows in the public health community.

One of those things is when the sugar industry is happy.

While they’ve had a lot to smile about lately—including the delay of an FDA rule requiring added sugar to be listed on packaged food nutrition labels—most recently, it’s President Trump’s trade talks with Mexico.

The preliminary agreement struck between the US and Mexico last month, seen as a prelude to NAFTA renegotiations, effectively reduces the amount of refined sugar Mexico can export to the United States, allowing US refineries to remain competitive. In return, Mexico has the option to supply any excess demand for refined sugar in US markets. (Refined sugar is the white, granulated stuff most of us know as sugar; it’s made from raw sugar harvested from sugarcane or beets.)

The US has long had import quotas and other mechanisms securing price guarantees for refined sugar, but producers argued that these were undermined when Mexico was allowed unrestricted access to the American market in 2008 through what was called a loophole in NAFTA—leading to the dumping of subsidized refined sugar into the US market. Following claims of unfair trading practices filed by American producers in 2014, Mexico agreed to limit the price and volume of its sugar exports, but US sugar producers still felt trade laws had been violated. Had the US and Mexico not reached an accord last month, Mexican companies might have been subject to financial penalties as a result.

I’m not here to comb through the finer points of these negotiations, because I’m not qualified to do so. Nor will I propose that we ban sugar and hope the industry folds, because I plan on having ice cream this weekend. As long as Americans produce, process, and consume sugar, we will need to negotiate trade in sugar.

That said, I’m concerned by this banner emblazoned this week on the home page of the American Sugar Alliance, which represents sugar producers and refiners.

Photo: sugaralliance.org

Let’s talk about money and corporate influence.

Of course, it’s misleading to suggest there was no sugar deal for many years. As Daniel Pearson, chairman of the U.S. International Trade Commission under former president George W. Bush, noted, “Prior to 2008 and after 2014, it was a very tightly controlled market.”

But that aside, I’d like to focus on the power and political influence of the sugar industry. The mission of the American Sugar Alliance, per their website, is to “ensure that sugar farmers and workers in the US sugar industry survive in a world of heavily subsidized sugar.” To aid in their quest for survival, they’ve managed to pull enough quarters out of the couch to spend no less than $2.17 million annually on lobbying between 2013 and 2016. These expenditures are second only to those of American Crystal Sugar, whose total spending topped $3.24 million in 2016. These two groups claim the number one and two spots for highest lobbying expenditures in the category of “crop production and basic processing,” which includes sugar, fruit, vegetables, cotton, grains, soybean, honey, rice, and peanuts.

In the case of this particular deal, a spotlight also shines on the cozy relationship between billionaire Wilbur Ross, Trump’s commerce secretary, and Jose Fanjul, longtime Republican donor and part owner of a sugar and real estate conglomerate that includes the Domino Sugar and Florida Crystals brands. Though the two have never conducted official business together, they’ve run in the same social circles for years and have frequently been guests in each other’s homes. (Hilary Geary Ross describes their “gloriously perfect” vacation in the Dominican Republic here.) At a July fundraiser for then-candidate Trump in Mr. Ross’s Long Island home, Mr. Fanjul took a seat on the exclusive “host committee.” It was later reported that he made donations to the Republican party and the Trump campaign in amounts of $94,600 and $5,400, respectively.

To be clear: I’m not claiming that the sugar industry and its kingpins are the only ones with deep pockets, that they are solely responsible for the outcomes of the trade deal between US and Mexico, or even that this is an inherently disastrous agreement. Nor do I believe that our government agencies are solely a vehicle for the interests of big business; on the contrary, I’m inclined to believe that most federal employees and political appointees lead and serve with integrity.

But we now have a president who has demonstrated an unyielding interest in dismantling and renegotiating fundamental food and agriculture policy—trade and otherwise—with enormous financial implications for some of the biggest and most powerful players in our food system, and we need to pay attention. We’re talking about an industry that has manipulated and influenced nutrition research for over fifty years to shift the onus of disease from sugar to fat. Not to mention agribusiness, which boasts lobbying expenditures on par with defense—totaling over $2 billion over the last 20 years. These industries are deeply invested in making sure these policies work in their favor, and where there’s a will, a whole lot of capital, and a few friends in high places, it isn’t far-fetched to imagine that there is a way.

Undue industry influence on President Trump’s policy agenda will almost certainly allow the corporate consolidation of our food system to continue to snowball, and in doing so, will move us further than ever from achieving an equitable, sustainable, and health-promoting food system. History has provided us with too many examples of the self-interest of sugar, Big Ag and the food industry to believe otherwise.

We all lose when industry drives policy.

It’s simple.

Industry influence and industry money will benefit industry.

It will not improve the health or wellbeing of our children.

It will not contribute to our communities, our farms, or our local economy.

And it can go unchecked without the awareness and intentional engagement of those of us who stand to lose the most. This sugar deal is done, but there are other battles to fight. At UCS, we’ll continue in our efforts to stand up for science and the public good, from defending climate science and clean air protections at the EPA to fighting the nomination of a USDA Chief Scientist with no scientific background. Because if corporate interests and anti-science ideology drive public policies, we all lose.

Timing, Pollinators, and the Impact of Climate Change

UCS Blog - The Equation (text only) -

Sweetshrub (Calycanthus floridus). These flowers have a scent similar to overripe, rotting fruit, and are visited by sap beetles.

Periodically in the spring, I have the pleasure of teaching Plant Taxonomy to students at a small college in Asheville, North Carolina. Among other things, I love the way that teaching this class forces me to pay close attention to what is coming out of the ground, leafing out, or flowering at any particular point of the season in the Blue Ridge Mountains where our campus is nestled. Each week, I fill the classroom with clippings from plants for my students to examine, up close and personal, as they learn to recognize different families of plants and how they compare with one another: how trilliums differ from jack-in-the-pulpits, or how spring beauty differs from rue anemone.

But a couple of weeks into the semester this spring, it became abundantly clear that I was going to need to scrap my syllabus and completely rearrange my labs. A very warm and short winter followed by an early spring meant that many of the plants I depend on appeared to be blooming weeks earlier than usual. While I initially doubted my intuition, which was based solely on passing observations, I pulled out my collection notes for the lab I was preparing on March 6 and found that the corresponding lab was dated April 6, 2013. My intuition was right on target: the flowering period was three to four weeks earlier than when I last taught the class, just four years ago.

In my research, too, the early spring was evident and influential. I study pollination and floral biology in sweetshrub, Calycanthus floridus, which has wine-red-colored flowers with the scent of overripe, rotting fruit that attracts their pollinators, little sap beetles that crawl into the flowers and feed there. I’ve been following the timing of flowering and fruiting in this plant since 2007, and the data so far show that in years with an early, warm spring, the plant flowers earlier…and the beetles are nowhere to be found. The flowers are there in their glory, flooding the area with their intoxicating sweet aroma, but they are holding a party with no guests—and this does not bode well for their future. The plants depend on the beetles for pollination and subsequent seed production, and in years when the beetles don’t visit, their reproductive success drops to almost nothing.

Author (Amy Boyd) teaching pollination biology to students in the Blue Ridge Mountains of North Carolina.

Phenology and climate change

Timing of biological events—such as flowering and leaf-out in plants or egg-laying in insects—is called phenology, and increasing attention has been given to the study of phenology as we face a changing climate. Many organisms depend on climatic signals such as temperature as cues for their timing during the season, and so as the planet warms, their response to these cues will cause them to leaf out, bloom, mate or lay eggs earlier.

But here’s the rub: many organisms, like the sweetshrub, depend on relationships with other species…and not all species use the same cues. One may use mean daily temperature as its phenological cue while another uses day length. If two species that depend on their interaction with one another use different cues in a changing environment, or respond differently to similar cues, they may end up missing each other entirely—what is likely happening with the beetles and the sweetshrub.
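
To make the mismatch mechanism concrete, here’s a toy Python simulation. One species flowers once accumulated warmth (growing degree days) crosses a threshold, while its partner emerges on a fixed day length that warming doesn’t shift; every parameter below is invented purely for illustration:

# Toy model of phenological mismatch. One partner cues on accumulated
# warmth (growing degree days); the other cues on photoperiod, which a
# warming climate does not change. All numbers are illustrative only.

def first_day_over_gdd(daily_temps, base=5.0, threshold=440.0):
    """Day of year when cumulative growing degree days pass a threshold."""
    total = 0.0
    for day, temp in enumerate(daily_temps, start=1):
        total += max(0.0, temp - base)
        if total >= threshold:
            return day
    return None

# An idealized spring: temperatures ramp upward from 2 C on day 1.
baseline = [2.0 + 0.15 * d for d in range(1, 151)]
warmed = [t + 2.0 for t in baseline]   # the same spring, uniformly 2 C warmer

PARTNER_DAY = 95   # the photoperiod-cued partner emerges on a fixed day

for label, temps in (("baseline", baseline), ("+2 C    ", warmed)):
    flower_day = first_day_over_gdd(temps)
    print(f"{label}: flowering day {flower_day}, "
          f"mismatch {PARTNER_DAY - flower_day} days")

With these made-up numbers, flowering jumps from day 95 to day 83 in the warmer spring, opening a 12-day gap with the partner that never moved. That, in miniature, is the flowers-holding-a-party-with-no-guests problem.
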

Plant-pollinator mismatch

Scientists keeping watch over phenology are accumulating more and more evidence that our changing climate is affecting many diverse species and potentially disrupting the interactions among them. For example, a study of bumblebees and the plants they visit in the Rocky Mountains has found that the timing of both has shifted earlier, but not by the same amount. The shift in flowering has been greater than the shift in bumblebee timing, resulting in decreased synchrony—and both plants and pollinators may suffer as a result. In Japan, biologists have followed a spring wildflower (Corydalis ambigua) and its bumblebee pollinators and similarly found that the plants were more sensitive than the bumblebees to early onset of spring. Reduced synchrony of bees and flowers resulted in lower availability of pollinators for the plants, and potentially also lower availability of food for the pollinators.

As the planet warms, plants and pollinators alike may adjust to the changes in different ways, leading to mismatches between these symbiotic partners. This impact of climate change on phenology compounds all the other challenges facing pollinators today, like the loss and fragmentation of habitat, disease, pesticide use, and the spread of invasive species.

Maypop (Passiflora incarnata) flower being visited by carpenter bee pollinator (Xylocopa virginica)

Consequences for agriculture

So why should we care about such disruptions in phenology? Being forced to scrap my syllabus is a very minor consequence compared to the potential impacts on agricultural production. By some estimates, 35% of global crop production depends on or benefits from pollination by animals (including bees and other insects). Some 16% of all vertebrate pollinator species (such as hummingbirds and bats) are threatened with extinction, while at least 9% of all insect pollinators are threatened as well. Pollinators are essential partners with farmers who grow fruit, vegetables, and nuts; without them, our own species faces the loss of an important component of its food source. Similar mismatches may also change and disrupt relationships between crop plants and pest species, creating new challenges to agriculture or enhancing existing threats.

Farmers see the changes in phenology in their own fields, and they are already concerned about the future of agriculture in a changing climate. But we all need to be aware of the impact of climate change on the web of interactions that make up the world around us, so that we can support lawmakers and others who are ready to stop the human activities impacting our planet’s climate. Many biologists are out there watching, accumulating evidence with the systematic eye of science.  We must support their efforts—and listen to their messages about our impacts on the planet and our future.

 

Amy E. Boyd is Professor of Biology and Chair of the Division of Natural Sciences at Warren Wilson College in Asheville, North Carolina. She is an ecologist and evolutionary biologist whose research currently focuses on plant-pollinator interactions and phenological patterns.

 Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

100% Clean Energy? In California, SB 100 May Make it Possible

UCS Blog - The Equation (text only) -

For many, summertime means getting to wear shorts, eating more ice cream than usual, and if you’re lucky, sleeping in. But for me, summertime means putting on a suit and heading to Sacramento to talk about energy policy. While the Trump Administration tries unsuccessfully to convince the country that coal is the answer, the California Legislature is moving ever forward to advance a cleaner and healthier energy future.

Right now, much attention is focused on the California Clean Energy Act of 2017 (“SB 100” for short). SB 100 would accelerate the state’s primary renewable energy program—the Renewables Portfolio Standard (RPS)—which was created to reduce our reliance on fossil fuels and improve air quality. The RPS currently requires every utility in the state to source 50% of its electricity sales from renewables by 2030.

The program has been a major driver of renewable energy development since its inception in 2002, and has helped us significantly reduce greenhouse gas emissions and criteria air pollution associated with electricity generation. In 2016, California was generating 27% of its electricity from RPS-eligible renewables like solar, wind, geothermal, bioenergy, and small hydropower.

California renewable energy mix in 2016. Source: California Energy Commission

SB 100 would accelerate the 50% RPS requirement to 2025 and establish a new target of 60% by 2030. Getting to 60% renewables by 2030 is certainly achievable. Many of the major electricity providers in the state are already on track to meet or exceed the 50% RPS; raising it to 60% by 2030 will help take advantage of the renewable energy federal tax credits that are set to expire by the end of 2019.

SB 100 would also establish a path to decarbonize the remaining electricity used in California (aka the 40% not subject to the RPS). It does this by directing the state’s energy agencies to study and plan for an electricity grid that utilizes 100% “zero-carbon” resources by 2045.

In other words, 60% of California’s electricity would be generated by RPS-eligible renewables while the remaining 40% would be generated by additional renewables or other types of electricity generation that don’t qualify under the RPS, but also don’t require the combustion of fossil fuels. For example, California’s existing fleet of large hydropower facilities is not RPS-eligible, but would count as “zero carbon.”
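
One way to think about this two-tier structure is as two separate accounting checks on a utility’s retail sales mix. Here’s a minimal Python sketch; the resource categories and shares are invented for illustration, real RPS accounting is far more involved, and exactly which non-RPS resources count as “zero-carbon” (nuclear, for instance) would be up to the state to define:

# Minimal sketch of SB 100's two tiers: at least 60% of retail sales
# from RPS-eligible renewables, and 100% from zero-carbon sources.
# Categories and shares are illustrative assumptions, not official lists.

RPS_ELIGIBLE = {"solar", "wind", "geothermal", "bioenergy", "small_hydro"}
ZERO_CARBON = RPS_ELIGIBLE | {"large_hydro", "nuclear"}  # assumed definition

def meets_sb100(mix):
    """Return (meets 60% RPS tier, meets 100% zero-carbon tier)."""
    rps_share = sum(share for src, share in mix.items() if src in RPS_ELIGIBLE)
    zc_share = sum(share for src, share in mix.items() if src in ZERO_CARBON)
    return rps_share >= 0.60, zc_share >= 1.00

example_mix = {"solar": 0.35, "wind": 0.20, "geothermal": 0.05,
               "large_hydro": 0.10, "gas": 0.30}
print(meets_sb100(example_mix))  # (True, False): the gas share blocks tier two

In this made-up example, the mix clears the 60% RPS tier but fails the 100% zero-carbon tier, which is exactly the gap the natural gas discussion below is about.
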

Powering the most populous and prosperous state in the country on 100% carbon-free electricity is bold and aspirational, but also achievable. Technology is already available to help the grid run on very large quantities of renewables, and the costs of the investments needed to make this happen are coming down.

One of the biggest challenges we must overcome to reach a zero-carbon electricity future is eliminating our dependence on natural gas to provide energy and grid reliability services. Natural gas-fired generation still makes up 36 percent of California’s electricity mix and emits greenhouse gases and air pollutants.

To jump-start the research effort to securely ease us off fossil fuels, our electricity providers and energy regulatory agencies need a signal from the legislature that Californians demand a carbon-free future.

We have a lot at stake. As climate change intensifies, people’s health and economic stability are being threatened by extreme heat, water shortages, forest fires, and sea level rise. Showing the world how to run a grid on 100% carbon-free generation would provide a blueprint for significant cuts in global warming emissions. In addition, California continues to have the worst air quality of any state in the country; by electrifying the transportation sector with carbon-free electricity, we can cut the largest source of toxic air pollution in the state—cars and trucks.

California has firmly established itself as a clean energy leader by reaching for goals that at first seemed unattainable. At first glance, a 100% carbon-free electricity goal may seem like a moonshot. But I say let’s do it.  We won’t know how close the stars actually are unless we reach for them.

Reentry Heating from North Korea’s July 4 Missile Test

UCS Blog - All Things Nuclear (text only) -

In a previous post, I estimated what North Korea could have learned from its May 14 Hwasong-12 missile test that is relevant to developing a reentry vehicle (RV) for a longer range missile.

I’ve updated the numbers in that post for the July 4 missile test (Table 1). In particular, I compare several measures of the heating experienced by the RV on the July 4 test to what would be experienced by the same RV on a 10,000 km-range missile flown on a minimum-energy trajectory (MET).

Table 1. A comparison of RV heating on the July 4 test and on a 10,000 km-range trajectory, assuming both missiles have the same RV and payload. A discussion of these quantities can be found in the earlier post.

The numbers in Table 1 are very nearly the same as those for the May 14 test, which means this test would give only a marginal amount of new information.

The maximum heating rate (q) would be essentially the same for the two trajectories. However, the total heat absorbed (Q) by the 10,000 km missile would be 60% larger and the duration of heating (τ) would be more than two and a half times as long.
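
For readers who want the physics behind these quantities, the standard stagnation-point approximations (in the spirit of the Sutton-Graves relation) are a useful guide; note this is the textbook form, not necessarily the exact model behind Table 1:

\[
q \;\propto\; \sqrt{\frac{\rho}{R_n}}\; v^{3},
\qquad
Q \;=\; \int_{0}^{\tau} q(t)\, dt
\]

where \rho is the local air density, R_n is the nose radius of the RV, and v is its speed. A 10,000 km-range MET brings the RV back on a longer, shallower path through the atmosphere, so even if the peak rate q is similar, q stays elevated for longer, which is why Q and \tau grow substantially.
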

In its statement after the July 4 test, North Korea said:

the inner temperature of the warhead tip was maintained at 25 to 45 degrees centigrade despite the harsh atmospheric reentry conditions of having to face the heat reaching thousands of degrees centigrade

While this may be true, the additional heat that would be absorbed on a 10,000 km trajectory and the longer time available for that heat to conduct to the interior of the RV means that this test did not replicate the heating environment a 10,000 km-range missile would have to withstand. The heat shield may in fact be sufficient to protect the warhead, but this test does not conclusively demonstrate that.

Congress Is Trying to Give the Trump Administration a Short Cut to Ignore Public Input and Science: It Shouldn’t

UCS Blog - The Equation (text only) -

There is no question that elections matter. We follow the process and accept the results even if that results in many, many battles over the direction of the country. The election of Donald Trump and the 115th Congress seems to be a watershed moment for the country in many ways, but that doesn’t mean the rule of law or the fundamental principles of our democracy have gone away. Or have they?

I am not referring to the big controversies we see every night on the news about foreign interference in the election, or questions of conflicts of interest surrounding the President and many of his appointed officials. Those public debates are important and often seem to suck all the oxygen out of the room.

I am talking about more obscure actions buried away in the federal appropriations process that would literally set aside the ability of the public to be involved in making a key public policy decision and set aside the need to justify that decision based on good science. Sound scary? Damn right.

A dangerous appropriations bill “rider”

Just before the 4th of July recess, the House of Representatives Appropriations Subcommittee on Energy and Water Development added a provision to its spending bill to allow the Administration to withdraw the Obama-era Clean Water rule “without regard to any provision of statute or regulation that establishes a requirement for such withdrawal.” If that provision becomes law, it would allow Scott Pruitt’s Environmental Protection Agency (EPA) and the Army Corps of Engineers to ignore legal requirements that the public be informed about and have an opportunity to comment on changes to a regulation. It means the Administration would be authorized to set aside the scientific analysis that was developed as the basis for the current Clean Water rule without specifically addressing any of the scientific evidence. Pretty scary for six lines unrelated to government spending tucked into a complicated spending bill.

Why would anyone want to do this? Is the Clean Water rule so very different from all other public policies that we should waive the law, including the Administrative Procedure Act, which has applied to all sorts of public policy for more than 60 years, as well as the requirement to base rules, whether they implement or change regulations, on good science? Actually, no, it isn’t. But the Clean Water rule has become a cause célèbre for many conservative politicians, a poster child for so-called “government overreach” or “bureaucratic excess.” Unfortunately, as is all too common, much of the overheated opposition to the rule is based on a false narrative.

The Clean Water Rule

The Clean Water Rule defines the scope of the Clean Water Act (CWA) by clarifying which bodies of water are considered in implementing the Act, through which Congress mandated the nation’s efforts to “restore and maintain the chemical, physical, and biological integrity of the Nation’s waters.” In 2006, the Supreme Court weighed in with a split decision that also instructed the EPA to clarify the scope of their CWA efforts. So the Obama Administration did just that—after years of additional analysis, proposals, and hundreds of thousands of public comments. The resulting rule expands the footprint of CWA actions by about 3 percent. In other words, it requires federal agencies and state and tribal partners to consider pollution impacts on waters and wetlands connected to larger “navigable” water bodies in some additional circumstances.

So who is so opposed? Developers, because it means they may need to be more careful in ensuring their activities don’t pollute the lakes and rivers we rely upon for our drinking water supply or for recreational activities like fishing and swimming. Mr. Trump issued an Executive Order to withdraw the rule. The Pruitt EPA has proposed going back to the pre-2006 definition, which they believe is clearer, despite the Supreme Court saying it was that very approach that was causing confusion. Clearly this is a complicated and controversial public policy decision.

Now, the Appropriations Subcommittee language says, let’s cut the public out of the process and make it easier for Pruitt to listen to his friends from industry. Let’s forget about the science. Let’s ignore anyone whose opinion we don’t like or whose evidence doesn’t support our views.

No short cuts

Outraged? You bet I am! I hope you are too. I believe this administration should have to follow the law just as previous administrations have done. Want to change the rules? Fine, propose an alternative, present your factual analysis, accept public input and explain your decision. And be subject to judicial review. No short cuts. Do your job. I think this Administration needs to hear that message loud and clear at every opportunity. And this Congress should not be adding harmful poison pill riders into must-pass spending bills.

Minnesota’s Solar Boom and… Bob Dylan?

UCS Blog - The Equation (text only) -

Those of us who track such things remember a time not long ago when the idea of a solar energy boom in Minnesota might have gotten you a funny look. But in a nod to Bob Dylan and his home state of Minnesota, I can only say: the times they are a-changin’.

When Enel Green Power announced on June 27th that it had officially brought online a 150-megawatt (MW) solar project in Minnesota, it marked another big step forward in the state’s growth as a Midwest leader in the clean energy transition. The project is expected to generate enough electricity to power more than 17,000 homes and avoid more than 150,000 tons of carbon emissions annually. Minnesota has embraced solar’s potential, as both a driver of economic growth and a key part of the state’s strategy for addressing climate change.

But it also seemed to me a symbol of the relentless march of progress even as the Trump administration and fossil fuel special interests cling to the status quo and ignore the reality of climate change. And I couldn’t help contemplating Dylan’s ode to a changing world and warning to those who stand in the way:

Come senators, congressmen
Please heed the call
Don’t stand in the doorway
Don’t block up the hall
For he that gets hurt
Will be he who has stalled
There’s a battle outside
And it is ragin’.
It’ll soon shake your windows
And rattle your walls
For the times they are a-changin’.

Dylan’s song spoke for those fighting for equality in the 1960s, but maintains its relevance today as we continue to push forward for equal rights under the law and a just transition to a cleaner, safer, and more equitable energy future.

Solar PV is helping with this transition. In Minnesota, the rise of community solar is providing access to clean energy for low-income communities. And the increasing affordability of solar in the US and abroad means our transition away from fossil fuels can benefit everyone. “Solar for all” isn’t a catchphrase: it’s a future that we can create one step at a time in collaboration with our partners, and despite the naysayers who cling to the status quo.

And that list of naysayers continues to shrink. More than 30 studies have concluded that solar provides economic value to all consumers by improving reliability and reducing costs for ratepayers. Businesses are committing to solar as part of their efforts to stabilize and reduce energy costs while meeting sustainability goals. And utilities are increasingly recognizing the business case for investing in solar as an affordable and low risk option for meeting future electricity needs.

So yes, it’s clear to me that the times they are a-changin’ in ways that will benefit all of us. It’s clear in Bob Dylan’s home state of Minnesota and across the US.

Nuclear Regulatory Commission: Contradictory Decisions Undermine Nuclear Safety

UCS Blog - All Things Nuclear (text only) -

As described in a recent All Things Nuclear commentary, one of the two emergency diesel generators (EDGs) for the Unit 3 reactor at the Palo Verde Nuclear Generating Station in Arizona was severely damaged during a test run on December 15, 2016. The operating license issued by the Nuclear Regulatory Commission (NRC) allowed the reactor to continue running for up to 10 days with one EDG out of service. Because repairing the extensive damage would take far longer than the 10 days provided in the operating license, the owner asked the NRC for permission to continue operating Unit 3 for up to 62 days with only one EDG available. The NRC approved that request on January 4, 2017.

The NRC’s approval contradicted four other agency decisions on virtually the same issue.

Two of the four decisions also involved the Palo Verde reactors, so it’s not a case of the underlying requirements varying. And one of the four decisions was made afterwards, so it’s not a case of the underlying requirements changing over time. UCS requested that Hubert Bell, the NRC’s Inspector General, have his office investigate these five NRC decisions to determine whether they are consistent with regulations, policies, and practices and, if not, identify gaps that the NRC staff needs to close in order to make better decisions more often in the future.

Emergency Diesel Generator Safety Role

NRC’s safety regulations, specifically General Design Criteria 34 and 35 in Appendix A to 10 CFR Part 50, require that nuclear power reactors be designed to protect the public from postulated accidents such as the rupture of the largest diameter pipe connected to the reactor vessel, which causes cooling water to rapidly drain away and impedes the flow of makeup cooling water. For reliability, an array of redundant emergency pumps—most powered by electricity but a few steam-driven—are installed. Reliability also requires redundant sources of electricity for these emergency pumps. At least two transmission lines must connect the reactor to its offsite electrical power grid, and at least two onsite sources of backup electrical power must be provided. Emergency diesel generators are the onsite backup power sources at every U.S. nuclear power plant except one (Oconee in South Carolina, which relies on backup power from generators at a nearby hydroelectric dam).

Because, as the March 2011 earthquake in Japan demonstrated at Fukushima, all of the multiple connections to the offsite power grid could be disabled for the same reason, the NRC’s safety regulations require that postulated accidents be mitigated relying solely on emergency equipment powered from the onsite backup power sources. If electricity from the offsite power grid is available, workers are encouraged to use it. But the reactor must be designed to cope with accidents assuming that offsite power is not available.

The NRC’s safety regulations further require that reactors cope with postulated accidents assuming offsite power is not available and that one additional safety system malfunction or single operator mistake impairs the response. This single failure provision is the reason that Palo Verde and other U.S. nuclear power reactors have two or more EDGs per reactor.

Should a pipe connected to the reactor vessel break when offsite power is unavailable and a single failure disables one EDG, the remaining EDG(s) are designed to automatically start up and connect to the in-plant electrical circuits within seconds. The array of motor-driven emergency pumps is then designed to automatically start and begin supplying makeup cooling water to the reactor vessel within a few more seconds. Computer studies are run to confirm that sufficient makeup flow is provided in time to prevent the reactor core from overheating and being damaged.
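
To see why this redundancy matters, here is a toy reliability sketch of my own; the per-demand failure probability is a round number of roughly the right order for EDGs, not actual plant data:

    # Chance of losing all onsite backup power during a demand, assuming
    # independent failures (common-cause effects make this optimistic).
    p_fail_start = 0.02                  # assumed per-EDG failure-to-start probability

    p_no_backup_two = p_fail_start ** 2  # both EDGs available: ~4 in 10,000
    p_no_backup_one = p_fail_start       # one EDG already out: ~2 in 100

    print(p_no_backup_two, p_no_backup_one, p_no_backup_one / p_no_backup_two)

With one EDG out of service, a single failure of the remaining unit leaves no onsite backup at all, which in this sketch makes the loss of backup power roughly fifty times more likely.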

Palo Verde: 62-Day EDG Outage Time Basis

In the safety evaluation issued with the January 4, 2017, amendment, the NRC staff wrote “Offsite power sources, and one train of onsite power source would continue to be available for the scenario of a loss-of-coolant-accident.” That statement contradicted NRC’s statements previously made about Palo Verde and DC Cook and subsequently made about the regulations themselves. Furthermore, this statement pretended that the regulations in General Design Criteria 34 and 35 simply do not exist.

Palo Verde: 2006 Precedent

On December 5, 2006, the NRC issued an amendment to the operating licenses for Palo Verde Units 1, 2, and 3 extending the EDG allowed outage time to 10 days from its original 72-hour limit. In the safety evaluation issued for this 2006 amendment, the NRC staff explicitly linked the reactor’s response to a loss of coolant accident with concurrent loss of offsite power:

During plant operation with both EDGs operable, if a LOOP [loss of offsite power] occurs, the ESF [engineered safeguards or emergency system] electrical loads are automatically and sequentially loaded to the EDGs in sufficient time to provide for safe reactor shutdown or to mitigate the consequences of a design-basis accident (DBA) such as a loss-of-coolant accident (LOCA).

Palo Verde: 2007 Precedent

On February 21, 2007, the NRC issued a White inspection finding for one of the EDGs on Palo Verde Unit 3 being non-functional for 18 days while the reactor operated (exceeding the 10-day allowed outage time provided by the December 2006 amendment). The NRC determined the EDG impairment actually existed for a total of 58 days. The affected EDG was successfully tested 40 days into that period. Workers discovered a faulty part in the EDG 18 days later. The NRC assumed the EDG was non-functional between its last successful test run and replacement of the faulty part. Originally, the NRC staff estimated that the affected EDG had a 75 percent chance of successfully starting during the initial 40 days and a 0 percent chance of successfully starting during the final 18 days. Based on those assumptions, the NRC determined the risk to approach the White/Yellow inspection finding threshold. The owner contested the NRC’s preliminary assessment. The NRC’s final assessment and associated White inspection finding only considered the EDG’s unavailability during the final 18 days.

Fig. 1 (Source: NRC)

Somehow, the same NRC that estimated a risk rising to the White level for an EDG being unavailable for 18 days, and approaching the White/Yellow level when the preceding 40 days of 25 percent impairment were included, concluded that an EDG being unavailable for 62 days had a risk of Green or less. The inconsistency makes no sense, and it does nothing for safety.
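
The mismatch is easy to see with a back-of-envelope comparison, if one assumes risk scales roughly with the expected number of days the EDG is unavailable (a crude stand-in for the NRC’s probabilistic methods, not the agency’s actual calculation):

    # 2007 finding: 40 days at an estimated 25 percent failure probability,
    # then 18 days at 100 percent (per the NRC staff's original estimate)
    expected_down_days_2007 = 0.25 * 40 + 1.00 * 18   # = 28 EDG-days

    # 2017 amendment: up to 62 days with the EDG known to be out of service
    expected_down_days_2017 = 1.00 * 62               # = 62 EDG-days

    print(expected_down_days_2017 / expected_down_days_2007)  # ~2.2x

By this crude yardstick, the situation the NRC approved in 2017 involved more than twice the expected EDG downtime of the episode it had assessed as approaching the White/Yellow threshold.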

DC Cook: 2015 Precedent

One of the two EDGs for the Unit 1 reactor at the DC Cook nuclear plant in Michigan was severely damaged during a test run on May 21, 2015. The owner applied to the NRC for a one-time amendment to the operating license to allow the reactor to continue running for up to 65 days while the EDG was repaired and restored to service.

The NRC asked the owner how the reactor would respond to a loss of coolant accident with a concurrent loss of offsite power and the single failure of the remaining EDG. In other words, the NRC asked how the reactor would comply with federal safety regulations.

The owner opted to shut down the Unit 1 reactor and restarted it on July 29, 2015, after repairing its broken EDG.

Rulemaking: 2017 Subsequent

On January 26, 2017, the NRC staff asked their Chairman and Commissioners for permission to terminate a rulemaking effort initiated in 2008 seeking to revise federal regulations to decouple LOOP from LOCA. The NRC staff explained that their work to date had identified numerous safety issues about decoupling LOOP from LOCA. Rather than put words in the NRC’s mouth, I’ll quote from the NRC staff’s paper: “The NRC staff determined that these issues would need to be adequately addressed in order to complete a regulatory basis that could support a proposed LOOP/LOCA rulemaking. To complete a fully developed regulatory basis for the LOOP/LOCA rulemaking, the NRC staff would need to ensure that these areas of uncertainty are adequately addressed as part of the rulemaking activity.”

It’s baffling how the numerous issues that had to be resolved before the NRC staff could complete a regulatory basis for the LOOP/LOCA rulemaking would not also have to be resolved before the NRC would approve running a reactor for months on the assumption that a LOOP/LOCA could not occur.

4 out of 5 Ain’t Safe Enough

In deciding whether a loss of offsite power event could be unlinked from a postulated loss of coolant accident, the NRC answered “no” four out of five times.

Fig. 2 (Source: UCS)

Four out of five may be enough when it comes to dentists who recommend sugarless gum, but it’s not nearly safe enough when the lives of millions of Americans are at stake.

We are hopeful that the Inspector General will help the NRC do better in the future.

North Korea Appears to Launch Missile with 6,700 km Range

UCS Blog - All Things Nuclear (text only) -

Current reports of North Korea’s July 4 missile test say the missile had a range of “more than 930 km” (580 miles) and flew for 37 minutes (according to US Pacific Command).

A missile of that range would need to fly on a very highly lofted trajectory to have such a long flight time.

Assuming a range of 950 km, a flight time of 37 minutes would require the missile to reach a maximum altitude of more than 2,800 km (1,700 miles).
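
A rough way to check those numbers is to integrate a ballistic arc over a non-rotating, airless, spherical Earth. The launch speed and angle below are hand-tuned illustrative guesses, not known missile parameters:

    import math

    MU = 3.986e14   # Earth's gravitational parameter (m^3/s^2)
    RE = 6.371e6    # Earth's radius (m)

    def fly(v0, gamma_deg, dt=0.5):
        """Integrate from launch to impact; return range (km), apogee (km), time (min)."""
        gamma = math.radians(gamma_deg)
        x, y = 0.0, RE                                   # launch point on the surface
        vx, vy = v0 * math.cos(gamma), v0 * math.sin(gamma)
        t, apogee = 0.0, 0.0
        while True:
            r = math.hypot(x, y)
            if r < RE and t > 0.0:                       # impact
                break
            apogee = max(apogee, r - RE)
            ax, ay = -MU * x / r**3, -MU * y / r**3      # inverse-square gravity
            vx += ax * dt; vy += ay * dt
            x += vx * dt;  y += vy * dt
            t += dt
        ground_range = RE * abs(math.atan2(x, y))        # launch sits on the y-axis
        return ground_range / 1e3, apogee / 1e3, t / 60.0

    rng, apo, minutes = fly(v0=6300.0, gamma_deg=83.0)
    print(f"range ~{rng:.0f} km, apogee ~{apo:.0f} km, time ~{minutes:.1f} min")

With a burnout speed near 6.3 km/s at about 83 degrees above horizontal, this sketch lands close to the reported figures: a range near 1,000 km, an apogee near 2,900 km, and a flight time around 36 minutes. Adding Earth’s rotation, drag, and a realistic burnout altitude shifts the details, which is why published estimates differ slightly.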

So if the reports are correct, that same missile could reach a maximum range of roughly 6,700 km (4,160 miles) on a standard trajectory.

That range would not be enough to reach the lower 48 states or the large islands of Hawaii, but would allow it to reach all of Alaska.

There is not enough information yet to determine whether this launch could be done with a modified version of the Hwasong-12 missile that was launched on May 14.

Missing from the President’s Iowa Speech: Praise for Wind Energy

UCS Blog - The Equation (text only) -

President Trump came to Cedar Rapids, Iowa last week for a rally and talked about putting solar panels on his border wall (even though there are 68.4 million better places for solar). Perhaps even more outrageous was how he bashed the wind industry, which has invested nearly $14 billion in the state and employs thousands of workers in the operations, maintenance, construction, and manufacturing sectors. In 2016 alone, the industry supported over 8,000 direct and indirect jobs in Iowa.

He also stated “we’re going to be strong for the future.” Well, Mr. President, we, and Iowans in particular, are going to be strong because of the growth of wind energy, not despite it. Let’s look at all the ways Trump’s comments on wind are misinformed.

Wind is king in Iowa

According to the American Wind Energy Association (AWEA), Iowa is ranked second only to Texas for installed wind capacity in the United States. Wind makes up more than 36 percent of Iowa’s in-state electricity generation, with over 6,900 megawatts (MW) of installed capacity.

Politicians from both sides of the aisle in Iowa are supportive of wind energy. Wind is a reliable energy source, and it helps build a more reliable and balanced electricity portfolio. The electric industry knows this, as evidenced by MidAmerican Energy’s decision to invest $3.6 billion in developing wind energy, with the goal of eventually producing 100 percent of their energy from renewable sources.

The utility’s adoption of wind has helped make its rates among the lowest in the country. Because of low rates and clean energy sources, Iowa has also become an attractive state for tech companies such as Google and Microsoft. Businesses are committed to powering their companies with renewable energy, and working to greatly increase the demand for it.

The rise of wind

The truth is wind energy is on the rise, not just in Iowa but throughout the country, including heavy investment in the Midwest. Wind capacity has more than doubled in the United States since 2010, accounting for nearly one-third of all new generating capacity installed since 2007. And wind power now makes up 5.5 percent of the nation’s total electricity generation.

The rise of wind is also bringing economic development to areas that need it, with 70 percent of US turbines located in low-income rural areas.

Rural communities benefit

Wind projects provide extra income for farmers and ranchers in rural communities and have proven to be a boon to local school districts. Wind projects significantly increase local tax bases and may even increase property value.

Wind projects also produce lease payments, with an estimated $245 million going to rural landowners each year. The steady income from those payments helps landowners and farmers when bad weather strikes or commodity prices fluctuate. Wind energy is the new cash crop in rural America.

Can’t stop won’t stop

Renewable energy prices are falling; investing in renewable energy just makes sense. And in the United States, wind industry jobs are on the rise, up 28 percent from 2015, with the industry supporting approximately 102,500 jobs in 2016.

At the end of the day, despite all the negative attacks and rhetoric aimed at clean energy at the federal level in recent months, clean energy momentum is happening, especially in the Midwest—and it’s not going to stop.

Offshore Oil vs. Offshore Wind: Guess Where the Action Really Is

UCS Blog - The Equation (text only) -

There’s plenty of energy off our coasts. Too bad the Trump Administration is looking the wrong way.

Yesterday was a momentous one for offshore energy, but maybe not in the way that some folks think. Sure, the administration opened up for public comment its plan to offer new offshore oil/gas leases (even if industry might say, “Meh”). But much more important for our future economy—and our planet—was what happened to move US offshore wind forward, the latest in a line of notable recent happenings at home and abroad.

Massachusetts offshore wind happenings

Massachusetts took an important step forward, with the state’s utilities asking wind developers to bid to supply some 400 megawatts of offshore wind capacity, enough to power almost 200,000 Bay State homes. The move, required under the state’s 2016 energy diversity law, is aimed at bringing in the first tranche of what will eventually be at least 1,600 megawatts of offshore wind for those utilities’ customers.
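
For a sense of where a figure like “almost 200,000 homes” comes from, here is a back-of-envelope sketch; the capacity factor and household-use numbers are my assumptions, not figures from the announcement:

    capacity_mw = 400
    capacity_factor = 0.45          # assumed for offshore wind
    household_mwh_per_year = 8.0    # assumed average Massachusetts home

    annual_mwh = capacity_mw * capacity_factor * 8760   # hours per year
    homes = annual_mwh / household_mwh_per_year
    print(f"{annual_mwh:,.0f} MWh/yr -> ~{homes:,.0f} homes")   # ~197,000 homes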

It’s easy to be excited about another step toward adding such a powerful technology to our nation’s clean energy toolbox. For Massachusetts, getting the state out there looking for solid offshore wind projects and prices in a competitive way is a vital next step.

Economic development means grabbing hold of good, new areas for business and jobs. We’re already seeing US industry step up to the plate—including by readying the type of specialized ships that we’ll need to get those wind turbines where they need to be.

Tackling climate change and protecting our environment means investing in expanding low-carbon energy options in responsible ways. It’s telling that yesterday’s move has garnered very positive reactions from environmental groups like the National Wildlife Federation and the Conservation Law Foundation (who also produced this great infographic laying out the strong case for offshore wind in New England).

Leadership means not waiting for others to go first.

Yesterday’s step keeps Massachusetts firmly in contention when it comes to building a new industry on our shores, making a new carbon-free electricity source a reality, and leading on US offshore wind.

Offshore wind around the US, and globally

The Massachusetts move is just the latest in all kinds of noteworthy steps for this exciting technology. Here’s a sampling:

  • Maryland approves support for two projects: Last month two offshore wind projects earned approval from Maryland’s regulators. The projects total 368 megawatts, enough to generate electricity for well over 100,000 homes, and could start to come online around 2020.
  • The first US offshore wind farm is making it real for our country: When the Block Island Wind Farm in Rhode Island went online late last year, it made history as the first offshore wind farm anywhere in the Americas. It also proved that seeing is believing, as boat tours take people to see the turbines up close and personal. Those visitors include the important tradespeople we’re going to need to make the next offshore wind farms happen—steelworkers, pipefitters, electricians, and more—and the politicians who are helping to create a welcoming environment.
  • Two big projects go online off Germany: Just this week, DONG Energy—the largest owner of offshore wind in the world, and one of the companies vying to supply Massachusetts—brought two more projects online 30 miles off Germany’s shores. The two projects’ 97 turbines add 582 megawatts to a global total of more than 15,000 megawatts. And they help add to the excitement fueled by recent record-low prices for future European offshore wind projects.
  • Bigger turbines are under development: While land-based wind turbines are typically 2-3 megawatts each, open water off our shores provides opportunities for using bigger, more powerful turbines to bring costs down. Recent offshore projects, including Block Island and the new German projects, use turbines of around 6 megawatts. But the latest turbine, unveiled earlier this month, is more than 50 percent more powerful, putting it above 9 megawatts. Still larger ones are on the way.

Investing in the future, not the past

So, where’s our offshore energy scene headed? It’s a good bet that offshore wind will grow to be an important piece of our energy mix. Even the Trump Administration seems to recognize the importance of this powerful new (new to the US) technology.

If we’re smart, we’ll make sure that happens, and quickly. Our country’s energy future does include offshore. But it’s wind, not oil.
