Combined UCS Blogs

The Trump Administration and Children’s Health: An Early Progress Report

UCS Blog - The Equation (text only) -

Parents are used to getting progress reports on how their children are doing—from teachers at school and from health care providers who assess developmental milestones. Early indicators are important; they can identify problems early, trigger needed interventions, or provide welcome assurance that things are looking good.

The news has been full of what President Trump has been doing in the first 90 days of his administration. Let’s do a quick progress report on what he’s done for children’s health.

Big picture

One could start by thinking about what a repeal of the Affordable Care Act might have meant (or still might mean) for children’s health care, especially for poor children. Or what Mr. Trump’s efforts to round up and deport undocumented people have meant for their health and well-being (shattering stories here and here). Or the many ways that Mr. Trump’s “2 for 1” executive order could impact children. Or how his agenda to roll back public protections (i.e., regulations) will affect the many determinants of health, like clean air, water, and food. (Children are especially vulnerable to environmental conditions, as they take in more air, water, and food per pound than adults.)

And then, of course, there’s the administration’s efforts to sow doubt about climate science and roll back safeguards that limit harmful climate pollution. Here is what our nation’s pediatricians have to say about climate change and child health.

Closer look

These past two weeks give us another window into the administration’s stance on specific environmental threats to children’s health.

Strike one. Rejecting the conclusions of its own scientists, last week the EPA Administrator announced a “final agency action”: the agency would not ban a pesticide (chlorpyrifos) that poses a clear risk to child health. This comes after years of independent study and solid scientific evidence (here, here, here) that the pesticide poses a developmental risk to children. (In 2000, the EPA phased out its use around homes, schools, day care centers, and other places where children might be exposed.)

And here’s the kicker: the next time the EPA is required to re-evaluate the safety of this pesticide is 2022! That means another five years of exposure to this widely-used pesticide that poses a clear risk to the developing brains of children.

Strike two: Last week, the EPA released its proposed budget for FY18, aligning its budget with President Trump’s war on the EPA. The proposal eliminates two agency programs that help protect children from exposure to lead, a potent neurotoxin. Not trim, not cut, but eliminate!

One program trains and certifies workers involved in lead abatement in buildings that may have lead-based paint. The other is a grant program to states and tribal jurisdictions to address lead-based paint. We can likely expect a groundswell of protest from the nation’s public health community, which well understands the grave risks of lead exposure to children and the singular value of primary prevention. They also recognize the important role that the EPA has played in protecting children’s health over the past two decades.

Strike three: Stand by. I fear it won’t be long.

The irony

I’m struck that I’m blogging about two well-recognized and highly researched environmental health threats to children during National Public Health Week, and at the same time that the Children’s Environmental Health Network (CEHN) is holding its 2017 research conference in Arlington, VA. Among other things addressed by these children’s health specialists: pesticides and lead!

What to do?

At the risk of repeating myself—(OK, I am)—we need to remember that our government, including Mr. Pruitt’s EPA, works for us. Our public health and the health of our children should override private interests. We have voice. We have science behind us. We need to speak up, show up, and let our leaders know that we are watching and that attempts to roll back public protections ARE NOT ALL RIGHT.

UCS has tools and resources to help do that. Join us—and others—in this fight.

Photo: Petra Bensted/CC BY (Flickr)

Déjà vu all over again: Heartland Institute Peddling Misinformation to Teachers about Climate Change


When I was a university faculty member, I had the thrill of sharing the latest discoveries in the classroom with students who asked probing questions.  That journey of discovery is one that parents and family members delight in hearing about when students come home and share what they have found particularly intriguing.

What if the information the student shared was not based on the best available evidence?  Misinformation would begin to spread more widely.  And if corrected, the student might distrust the teacher, who may not have known the source material was compromised.

This scenario is not fiction.  It has happened and may still be occurring in some U.S. schools.  Anyone concerned about this can learn more with an update forthcoming from those who keep track – the National Center for Science Education (NCSE).

According to the NCSE, in October 2013 educators received a packet chock full of misinformation about climate change.  The report used an abbreviation that looked deceptively similar to that of a highly respected source of international climate assessments – the Intergovernmental Panel on Climate Change (IPCC).

It has happened again, starting in March 2017.  Many teachers have found a packet in their mailboxes with a report from the same group that spread the misinformation back in October 2013.  This report carries a “second edition” gold highlight, a cover image of water flowing over a dam, and a misleading title.


“Not Science” stamp on top of the report cover mailed to teachers during spring 2017. The report misrepresents the fact that nearly all climate scientists agree about human-driven climate change.

The report runs counter to the agreement among scientists who publish on climate change in the peer-reviewed scientific literature. More than 97% of these scientists agree that climate change is caused by human activities.

The Heartland Institute is infamous for its rejection of climate science and its unsavory tactics.  According to a reported statement by the Heartland Institute’s CEO, the group plans to keep sending copies to educators over the weeks ahead.

If you see any student or teacher with this report or DVD, please let NCSE know about it and share what you have learned to help stop the spread.

What is EPA’s Vehicle Lab, and Why Should I Care How It’s Funded?


More details have been released about the Trump administration’s plans to cut funding to the Environmental Protection Agency (EPA).  In particular, the administration is nearly zeroing out the budget for the vehicles program, calling for the National Vehicle and Fuel Emissions Laboratory (“Vehicle Lab”) in particular to be funded almost entirely by fees on industry “as quickly as possible” (i.e., as soon as never).  This could significantly undermine the enforcement of safeguards that protect American pocketbooks and public health from industry malfeasance, and it could jeopardize technical research that moves technology forward.

The Vehicle Lab plays a critical role in watchdogging industry

Portable emissions measurement systems (PEMS) like the one used to uncover the Volkswagen scandal were developed by EPA researchers at the Vehicle Lab.

EPA’s Vehicle Lab, located in Ann Arbor, MI (Go Blue!), is responsible for certifying manufacturer compliance with its emissions standards—before any vehicle can be sold in the United States, it must be approved by the EPA.  EPA does not test every passenger vehicle model—the lab is under-resourced for such an endeavor.  Instead, it randomly selects vehicle models (about 15-20 percent annually) to assess the accuracy of manufacturers’ test results.  It also conducts its own investigations if any anomalous data is brought to its attention, e.g., by consumer groups or other advocacy organizations.

Just in the last couple of years alone, several manufacturers from across the industry have faced fines, or worse, thanks to this oversight:

Fiat-Chrysler—Its Jeep and Ram diesel vehicles are currently being investigated for violating the Clean Air Act.  While the case is ongoing, it represents an effort by EPA to step up its real-world emissions tests to ensure that vehicles are not polluting above what is legally allowed and public health is not being harmed.

Ford—For the 2013 and 2014 model years, Ford was required to adjust the fuel economy label information provided to consumers on 6 different vehicles—for one of those (the C-MAX), this was actually the second such adjustment.  This resulted in payouts to consumers of up to $1,050.

Hyundai and Kia—The Korean manufacturers were found to have systematically overstated fuel economy results for over 1 million vehicles, largely the result of violating EPA’s prescribed test guidelines for determining vehicle road load.  This led to a $100 million fine and hundreds of millions of dollars in compensation for its customers.

Volkswagen—The reintroduction of diesels to its American fleet was found to have come only as the result of a defeat device used to cheat the emissions tests.  Encompassing nearly 600,000 vehicles, it turns out that in the real world these vehicles emitted up to 40 times the legal limit of nitrogen oxides, a smog-forming pollutant.  Volkswagen is estimated to be spending around $20 billion over the next few years to remove these polluting vehicles from the road, mitigate the excess pollution they caused, and compensate the American people for this egregious violation.

The above issues represent a real cost to consumers, the environment, and public health, and investigating them required rigorous laboratory and on-road testing.  If anything, these recent enforcement actions by EPA show the need for, and value of, investing in even more complementary real-world testing, not less. It seems absurd to cut in half the number of staff at the lab responsible for these tests.

The Lab has also been a vital tool for transparent assessment of vehicle regulation

In addition to its important role as industry watchdog, the Lab has played a key role in assessing the technological capability of the automotive industry and providing transparency to the development of fuel economy and emissions standards.

Throughout the regulatory process, the EPA has used the capabilities of the Vehicle Lab to assess the technology landscape, publishing its results and making freely available pages upon pages of detailed technical information.  This data was used not just to test the technologies of today but to actually create, develop, and benchmark a publicly accessible full vehicle simulation model to simulate the technologies of tomorrow.  This is the type of tool previously only available to manufacturers and some well-funded institutions and, until now, well out of the budget of an organization like UCS.

This wealth of information can help inform researchers like myself and others looking to promote improvements and investments in technologies to reduce fuel use, and it provides an unparalleled level of detail and transparency for assessing the validity of regulations based on this information.

In a comprehensive report, the National Research Council of the National Academies of Sciences, Engineering, and Medicine noted that “the use of full vehicle simulation modeling in combination with lumped parameter modeling has improved the Agencies’ estimation of fuel economy impacts.  Increased vehicle testing has also provided input and calibration data for these models.  Similarly the use of teardown studies has improved [NHTSA and EPA’s] estimates of costs.”

Every single item lauded by the National Academies was conducted in collaboration with the researchers at the Vehicle Lab that the Trump administration is now proposing to gut.

Cutting funding cuts corners, jobs and puts us at risk of a rubber stamp EPA

The current administration plan would immediately cut the number of people working at the Lab in half—that means that rather than increasing the ability for the agency to protect against the types of industry malfeasance documented above, the Lab would be stripped of its capabilities in the near-term.  This reduction in workforce would make it impossible to even maintain the bare minimum of checks and balances on the certification program, even if (big IF!) it were eventually fully funded by fees from manufacturers.

This vehicle test cell is used to measure a vehicle’s emissions in order to assess its operation under cold weather conditions. This is a necessary component to ensure that pollution levels under all driving conditions are below legal limits, and fuel usage under these conditions is part of the test procedure which determines a vehicle’s fuel economy label for consumers.

Furthermore, the fee proposal in the budget is completely inadequate to the task.  While the EPA already collects fees to reimburse the Agency, in part, for its certification activities, it is Congress which determines how the fees are appropriated—to date, Congress has not been appropriating this money to EPA, instead using these funds to offset the federal budget deficit.  There is no reason to suppose that this would change in the future, which means this proposal would effectively gut the certification process by cutting the staff responsible for the program in half.

With such a drastic staff reduction, effective immediately in 2018, the certification process would be gummed up to such a degree that it would either tremendously delay vehicle sales or become a meaningless rubber stamp, which would undoubtedly lead to even more automaker malfeasance, further eroding the American people’s trust in their auto industry.

Ensuring a technically sound watchdog is of course in the interest of the auto industry as well.  It ensures everyone is playing by the same rules and that they suffer the consequences if they don’t. While engineers at other auto companies were working hard to develop emission controls for diesel cars, VW was making millions, selling so-called “clean diesels” by the hundreds of thousands.

So I hope the Alliance of Automobile Manufacturers and the Association of Global Automakers call out this farcical budget memo for what it is—a slap in the face of good governance that can only result in adverse health and environmental impacts for the American people and end up a costly mistake for the auto industry as well.


Five Black Public Health Champions You Should Know


In honor of National Public Health Week, we’re paying tribute to some outstanding individuals in the public health field. But first—bear with me—a little historical context.

It’s no secret that here at UCS, we love science. It can help us define complex problems, identify the best methods to solve them, and (if we’ve done a good job) provide us with metrics for measuring the progress we’ve made.

Doctor injects subject with placebo as part of the Tuskegee Syphilis Study. Photo: National Archives and Records Administration.

But it would be both irresponsible and incredibly destructive to pretend that science operates in isolation from systems of deeply rooted racism and oppression that plague scientific, political, and cultural institutions in the United States—particularly when it comes to health. Such systems have been used to justify unfathomably cruel and inhumane medical experimentation performed on black bodies in slavery, which were only replaced in the Jim Crow era by pervasive medical mistreatment that resulted in untold fatalities. Racist medical practices were tolerated, if not explicitly condoned, by professional organizations such as the American Medical Association through the late 1960s. The government-funded Tuskegee Syphilis Study, which effectively denied syphilis treatment to nearly 400 black men over the course of 40 years, ended in 1972, but a formal apology was not issued for this deliberate violation of human rights until 1997. And still, in doctors’ offices and hospital rooms across the United States today, race remains a significant predictor of the quality of healthcare a person will receive.

This is, of course, deeply troubling. (And worthy of far deeper discussions than a blog post can provide—see a short list of book recommendations below.)

But perhaps just as troubling as the underpinnings of racism in science and medicine is its relative obscurity in the historical narratives propagated by dominant (read: white) culture. That modern medicine was built on the backs of marginalized populations is well understood and indeed has been lived by many, but it is far from being accepted as universal truth. Meanwhile, the contributions of black scientists, doctors, and health advocates have routinely been eclipsed by those of their white colleagues or are absent entirely from historical records. (At least until Hollywood spots a blockbuster.)

Public health advocates and practitioners have a responsibility both to understand this complex history of medical racism, if they have not already experienced it firsthand, and to thoroughly integrate its implications into their daily work. This includes acknowledging the tensions that may stem from deep distrust of the medical community by communities of color; considering the multiple ways in which implicit bias and institutional racism may impact social determinants of health, risk of chronic disease, access to care, and quality of treatment; applying a racial equity lens to policy and program decision-making; and, last but not least, giving credit where it’s due.

Today, my focus is on that last point. Though public health is not necessarily a discipline that generates fame or notoriety (it has been said, in fact, that public health is only discussed when it is in jeopardy), you should know the names of these five black public health champions. Some past, some present, some well-known and some less so, they are all powerful forces who have made significant contributions to this field.

Have other names we should know? Leave them in the comments.

1.  Dr. Regina Benjamin, former U.S. Surgeon General

Photo: United States Mission Geneva/CC BY SA (Flickr)

During the four-year term she served as the 18th U.S. Surgeon General (2009-2013), Regina Benjamin shifted the national focus on health from a treatment-based to a prevention-based perspective, highlighting the importance of lifestyle factors such as nutrition, physical activity, and stress management in the prevention of chronic disease. Other campaigns during Dr. Benjamin’s term targeted breastfeeding and baby-friendly hospitals, tobacco use prevention among youth and young adults, healthy housing conditions, and suicide prevention. Prior to serving as the Surgeon General, Dr. Benjamin established the Bayou La Batre Rural Health Clinic on the Gulf Coast of Alabama, providing care for patients on a sliding payment scale and even covering some medical expenses out of her own pocket. Dr. Benjamin has been widely recognized for her determination and humanitarian spirit.

2.  Byllye Avery, founder of the Black Women’s Health Imperative and Avery Institute for Social Change

Despite the Roe v. Wade decision in 1973, access to abortions remained limited in the years thereafter, particularly for many black women. Byllye Avery began helping women travel to New York to obtain abortions in the early 1970s, and in 1974 co-founded the Gainesville Women’s Health Center to expand critical access to abortions and other health care services. In 1983, Avery founded the National Black Women’s Health Project (now called the Black Women’s Health Imperative), a national organization committed to “defining, promoting, and maintaining the physical, mental, and emotional wellbeing of black women and their families.” Avery has received numerous awards for her work, including the Dorothy I. Height Lifetime Achievement Award (1995), the Ruth Bader Ginsburg Impact Award from the Chicago Foundation for Women (2008), and the Audre Lorde Spirit of Fire Award from the Fenway Health Center in Boston (2010).

3.  Bobby Seale, co-founder of the Black Panther Party

Photo: Peizes/CC BY SA (Flickr)

Here’s a name you might know—and a story that might surprise you. While the Black Panther Party, co-founded by Bobby Seale and Huey Newton in 1966, is often remembered for its radical political activism, the black nationalist organization was also deeply engaged in public health work. True to their rallying call to “serve the people body and soul,” the Black Panthers established over a dozen free community health clinics nationwide and implemented a free breakfast program for children. This program, which served its first meal out of a church in Oakland, California in 1968, was one of the first organized school breakfast programs in the country and quickly became a cornerstone of the party. By 1969, the Black Panthers were serving breakfast to 20,000 children in 19 cities around the country. Though the government eventually dismantled the program along with the party itself, many believe it was a driving factor in the establishment of the School Breakfast Program in 1975.

4.  Dr. Camara Jones, former president of the American Public Health Association

As the immediate past president of the American Public Health Association, Dr. Camara Jones brought the impact of racism on health and well-being to the forefront of the public health agenda. She initiated a National Campaign Against Racism, with three strategic goals: naming racism as a driver of social determinants of health; identifying the ways in which racism drives current and past policies and practices; and facilitating conversation, research, and interventions to address racism and improve population health. Dr. Jones has also published various frameworks and allegories, perhaps the most famous of which is Levels of Racism: A Theoretic Framework and a Gardener’s Tale, to help facilitate an understanding of the nuance and layers of racism across the general population.

5.  Malik Yakini, founder of the Detroit Black Community Food Security Network

Photo: W.K. Kellogg Foundation/CC BY SA (Flickr)

Malik Yakini may not see himself as a public health advocate, but that hasn’t stopped him from receiving speaking requests from prominent public health institutions across the country. A native Detroiter, Yakini views the food system as a place where inequities play out at the hand of racism, capitalism, and class divisions. “There can be no food justice without social justice,” he said to an audience at the Bloomberg School of Public Health at Johns Hopkins. “In cities like Detroit where the population is predominantly African American, we are seen as markets for inferior goods.” Yakini founded the Detroit Black Community Food Security Network in 2006 to ensure that Detroit communities could exercise sovereignty and self-determination in producing and consuming affordable, nutritious, and culturally appropriate food. The organization operates the seven-acre D-Town Farm on Detroit’s east side and is now in the process of establishing the Detroit People’s Food Co-op.

Recommended Reads

Black Man in a White Coat by Dr. Damon Tweedy

The Immortal Life of Henrietta Lacks by Rebecca Skloot

Body and Soul: The Black Panther Party and the Fight against Medical Discrimination by Alondra Nelson


Made in America: Trump Embracing Offshore Wind?


While publicly pushing fossil fuels, the Trump administration seems to be quietly embracing offshore wind power and its economic potential. 

In March, the Interior Department auctioned off 122,405 acres of water off Kitty Hawk, North Carolina, to Avangrid, a U.S. subsidiary of Spain’s Iberdrola, for $9 million. Avangrid beat out three competitors, including Norway’s Statoil and German wind farm developer wpd.

Interior Secretary Ryan Zinke hailed the auction, affirming that offshore wind is “one tool in the all-of-the-above energy toolbox that will help power America with domestic energy, securing energy independence, and bolstering the economy. This is a big win.”

That followed the equally stunning announcement a week prior by Interior’s Bureau of Ocean Energy Management that it plans to stage another competitive lease auction for 400,000 acres of New England waters, triggered by unsolicited applications for the same area by Statoil and the U.S. wing of Germany’s PNE Wind.

The parcels are adjacent or near areas off Massachusetts and Rhode Island already leased by Denmark’s DONG Energy, Germany’s OffshoreMW and Rhode Island’s Deepwater Wind.

Those two developments signal that the Trump administration takes the economic potential of offshore wind energy far more seriously than might be assumed from the president’s past disparagement of wind turbines. Trump told the New York Times shortly after his election, “We don’t make the windmills in the United States. They’re made in Germany and Japan.”

Already big business in US

But it may have dawned on the Trump administration that offshore wind is actually much more an American industry than most people realize.

In 2015, Boston-based General Electric made the biggest purchase in its history, acquiring the French energy infrastructure giant Alstom for $10.6 billion. The deal included Alstom’s offshore wind turbine manufacturing operations, including a plant in Saint Nazaire, France, that made the five turbines spinning in the U.S.’s first offshore wind farm, the 30-megawatt Deepwater Wind Block Island project.

GE proceeded last year to purchase the world’s largest turbine blade manufacturer, Danish-based LM Wind, for $1.65 billion. Last month, the LM Wind division announced it is building a blade manufacturing plant in the Normandy region of France, providing at least 550 direct and 2,000 indirect jobs as that nation ramps up its offshore industry. The factory will be capable of making the longest turbine blade in the world, nearly 300 feet long, for new-generation 8 MW turbines.

Besides GE, New York-based Blackstone, one of the world’s top investment firms, was behind the 2011 funding of Germany’s 80-turbine, 288-MW Meerwind offshore wind farm. Blackstone, with the help of Bank of America Merrill Lynch, last year sold its 80 percent stake to Chinese investors.

New York-based Goldman Sachs also has a 7 percent stake in DONG, the first company to cross the 1,000-turbine mark. Europe has a total of 3,600 turbines spinning, providing 12.6 gigawatts of power, enough for 13 million homes, according to industry advocate Wind Europe.

Photo: Derrick Jackson

Critical mass close

It is clear that the offshore wind industry now wants to cross the water, with rocket-sized components that are too long and too massive to import economically from Europe over the long term.  If it does, it could easily blow to our shores the skilled local construction and technical jobs and large-scale manufacturing President Trump has promised.

Deepwater Wind was recently cleared to begin a 15-turbine project off Montauk, Long Island, in waters where Deepwater could eventually construct up to 200 turbines. In December, Statoil won the federal lease for a 79,000-acre area of ocean off Long Island’s Jones Beach for a record $42.5 million.

Besides the competition in North Carolina, Maryland is in the approval stage of offshore wind proposals. And with Massachusetts now mandating 1,600 MW of offshore wind in its energy portfolio by 2027 and with New York Governor Andrew Cuomo pushing for 2,400 MW of offshore wind by 2030, the U.S. is about to become part of “the brightest spot in the global clean energy investment picture,” as Bloomberg New Energy Finance put it.

Job engine, port revivals

The inspiration points in Europe are endless. Last year saw a record $26 billion of investment, and the industry is on track to double its 12.6 GW of capacity by 2020.

The United Kingdom has approved construction of the largest offshore wind farm yet, the 174-turbine, 1.2 GW Hornsea One Array. DONG says it expects Hornsea to generate 2,000 jobs during the construction phase and 300 operational jobs thereafter.

DONG and the British government have begun planning a second Hornsea wind farm that would be even bigger, at 300 turbines and 1.8 GW, adding another 2,000 construction and nearly 600 maintenance jobs.

In Germany, the offshore wind industry is responsible for nearly halving unemployment in Bremerhaven and Cuxhaven, towns northwest of Hamburg that were hit hard in the late 20th century by the decline of fishing and shipbuilding and the closing of US military facilities. Local officials likened Bremerhaven to Detroit for its 25-percent unemployment rate.

Today, with a downtown core gleaming with new museums and hotels, those same officials call offshore wind their regional “moon shot.” Up in Cuxhaven, Siemens is putting the finishing touches on a giant turbine plant that should go into operation in the middle of this year, bringing 1,000 more jobs to the region and adding to the 20,000 jobs claimed by the German offshore wind industry.

Denmark, despite having a population only the size of Massachusetts’, remains a per-capita titan in offshore employment with 10,000 jobs. The UK, which has 41 percent of Europe’s installed capacity, had at least 30,000 direct and indirect jobs, according to RenewableUK, and is obviously adding thousands more with oncoming projects such as Hornsea.

In December, Siemens completed a $381 million turbine-blade plant in Hull that will employ 1,000 people when fully operational.

The Guardian’s story on the plant’s opening noted how unusual the work is for modern manufacturing: “Surprisingly, the manufacturing process is almost entirely done by hand, rather than robots. The workforce includes former supermarket workers, aerospace industry experts on second careers and builders who learned fiberglass skills locally from fitting bathrooms and making caravan parts.”

And Hull and other British port towns, according to newspaper features, are experiencing rebounds akin to Bremerhaven and Cuxhaven. A January Sunday Express story recalled how Hull declined from the overfishing of cod into a “rundown backwater” that topped the list of worst places in the UK to live in 2003. With redevelopment strategies that included the investment of offshore companies like Siemens, the city has rebounded to be a popular tourist destination.

Grimsby, a 33-mile drive from Hull, already has 1,500 offshore wind jobs and, with the planned Hornsea projects, has plans to grow and become the biggest offshore wind industry cluster in the world. DONG said in 2015 it plans to invest $7.4 billion in the Grimsby/Hull region by 2019.

Elsewhere in the UK, another massive offshore wind project, the 102-turbine, 714 MW East Anglia One, promises 3,000 jobs.

Photo: Derrick Jackson

The American potential

The building of house-sized nacelles and football-field-length blades, the manufacture and laying of miles of underwater cables, the building of jack-up installation barges and maintenance vessels, the welding of foundations and towers, port rehabilitation, and all the nuts and bolts in between currently support 75,000 jobs in Europe, a figure that should rise to between 170,000 and 204,000 jobs, according to Wind Europe.

A recent New York Times feature on the industry said, “Offshore wind, once a fringe investment, with limited scope and reliant on government subsidies, is moving into the mainstream.”

According to a joint report last September by the Department of the Interior and the Department of Energy, a robust U.S. offshore wind industry could employ up to 34,000 workers by 2020, up to 80,000 by 2030 and up to 181,000 by 2050.

The industry would be making $440 million in annual lease payments and $680 million in annual property tax payments for local economies. Better still, a University of Delaware study last year calculated that just 2 GW of projects in the pipeline in Massachusetts waters would ignite such an efficient local industry supply chain that the price of offshore wind energy should be even with other energy options by 2030.

“At that point, the technology presumably could continue to compete on its own without any continuing legislation,” the study said.

Onshore bipartisan success

The onshore wind industry is now at such cost parity that it is booming across America, from liberal California to the conservative Great Plains and Texas. In fact, 80 percent of U.S. wind farms are in Republican congressional districts, according to the American Wind Energy Association.

Wind energy surpassed hydroelectric power in generating capacity for the first time last year.

According to AWEA, the U.S. counterpart to Wind Europe, there are now more than 500 blade and turbine factories and supply-chain manufacturing facilities making the 8,000 different parts that go into one machine.

Domestic wind industry jobs have crossed the 100,000 mark and the Bureau of Labor Statistics lists wind service technician as the fastest-growing job through 2024, with a current median pay of $51,050.

Wind service technicians are a huge reminder that this is an industry where many jobs are skilled working-class crafts that can be learned in technical colleges, providing a fresh employment pathway for individuals, families and low-income communities where 4-year college is often seen as unaffordable.

Despite his planned sweeping rollbacks of environmental regulations he decries as “job killing,” offshore wind offers exactly the kinds of jobs President Trump has said he would bring back to areas of America where other forms of manufacturing have disappeared.

The Perry Factor?

Another reason for optimism for offshore wind during the Trump administration is that Secretary of Energy Rick Perry oversaw Texas becoming the nation’s leader in onshore wind when he was governor.

Today, 12,000 turbines provide 13 percent of the state’s electricity, powering 4 million homes and supporting more than 24,000 jobs, according to AWEA.

The state’s transmission grid completed a $7 billion upgrade to accommodate wind. As governor, Perry boasted that if Texas were a nation, it would rank sixth in the world in installed onshore wind capacity.

Late in his administration, he began to invest in offshore wind. In 2014, his Texas Emerging Technology Fund awarded $2.2 million to Texas A&M University to explore offshore wind. That grant was matched with $64 million of federal and industry research investments.

When Mr. Perry was confirmed as Energy Secretary, AWEA CEO Tom Kiernan said, “The Texas success story with wind power has now become a model for America … we look forward to working with him at the Department of Energy to keep this success story going.”

The first signs are that the success story will include offshore wind, spinning with jobs, and revitalizing towns dimmed with decline.

Without officially saying so, the Trump administration is deciding that the windmills can be made here after all.

This post first appeared on The Daily Climate.

Photo: Derrick Jackson

North Korea’s 5 April 2017 Missile Launch

UCS Blog - All Things Nuclear (text only) -

North Korea launched a missile from its east coast into the Sea of Japan at 6:42 am local time on April 5 (5:42 pm on April 4 US eastern time).

US Pacific Command initially identified it as a KN-15 missile, called Pukguksong-2 in North Korea, which is a two stage solid-fueled missile with an estimated range of 1,200 km based on its previous test in February.

Subsequently, however, Pacific Command said it believed the missile was instead an older Scud, and that it may have tumbled, or “pinwheeled,” during flight.

South Korean sources reported the missile flew only about 60 km before splashing down, and reached a maximum altitude of 189 km. And based on Pacific Command’s statement, the flight time was eight to nine minutes.

I used those numbers to investigate the trajectory with a computer model I have of several missiles.

Short-range Scud missile

I found that a Scud missile, with a nominal range of 300 km, could roughly match these numbers if the warhead was lightened somewhat (from 1,000 kg to about 700 kg) and if it was launched on a very lofted trajectory, with a burnout angle only about 5 degrees from vertical. On a 300-km range trajectory, this angle would be roughly 45 degrees (see Fig. 1).

If the missile did not tumble during reentry, I calculate the flight time would be about 7.5 minutes. However, taking account of the additional atmospheric drag due to the tumbling body can increase the flight time to about 9 minutes.
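
The lofted-trajectory numbers above can be sanity-checked with a toy point-mass integrator. This is only a sketch, not the author's model: the burnout speed and 5-degree loft angle come from the discussion above, while the burnout altitude and boost time are illustrative assumptions, and drag is ignored (the no-tumble case).

```python
import math

def simulate_lofted_arc(burnout_speed=1.74e3,    # m/s, within the 1.7-1.8 km/s band
                        angle_from_vertical=5.0,  # degrees, the lofted case above
                        burnout_alt=20e3,         # m, assumed burnout altitude
                        boost_time=65.0,          # s, assumed powered-flight time
                        dt=0.1):
    """Point-mass ballistic arc from engine burnout, no drag.

    Burnout altitude and boost time are assumptions for illustration only.
    Returns (ground range in m, apogee in m, total flight time in s).
    """
    GM = 3.986e14            # Earth's gravitational parameter, m^3/s^2
    RE = 6.371e6             # Earth's radius, m
    theta = math.radians(angle_from_vertical)
    x, z = 0.0, burnout_alt
    vx = burnout_speed * math.sin(theta)
    vz = burnout_speed * math.cos(theta)
    t, apogee = boost_time, z
    while z > 0.0:
        g = GM / (RE + z) ** 2      # inverse-square gravity
        vz -= g * dt
        x += vx * dt
        z += vz * dt
        t += dt
        apogee = max(apogee, z)
    return x, apogee, t

rng, apogee, t = simulate_lofted_arc()
print(f"range ~{rng/1e3:.0f} km, apogee ~{apogee/1e3:.0f} km, "
      f"flight ~{t/60:.1f} min")
```

With these assumed inputs the sketch lands in the neighborhood of the reported numbers: a few tens of kilometers of range, an apogee near 180-190 km, and a flight time of roughly 7 to 9 minutes.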

Fig. 1

Other possibilities

In the calculation above, the Scud burns to completion and then begins to pinwheel (the short-range Scud does not separate the warhead from the missile body at burnout).

Longer range missiles could also follow this trajectory if the engines failed partway through powered flight, as long as the missile was on a highly lofted trajectory (5 degrees from vertical) and stopped accelerating after reaching a speed of 1.7-1.8 km/s. It may have been an engine failure that caused the missile to tumble. If the engines did not burn to completion, the warhead may have remained attached to the missile body even for a longer range missile that would separate the warhead under normal operation.

The fact that the missile flew on a nearly vertical trajectory suggests there may have been a problem with the guidance system. If the missile was liquid fueled, North Korea may have shut down the engine when it realized there was a problem. A solid fueled engine could not be shut down in the same way.

If the missile had been a KN-15, the engine would have had to fail about halfway through the burn of the second stage engine. It seems surprising that initial reports identified the missile as a KN-15, since I would have expected sensors could tell whether or not the missile had undergone staging. In addition, the plumes from liquid and solid missiles are different in appearance, so depending on what sensors viewed the launch they should have been able to differentiate a liquid from solid missile. Analyzing these issues may have been what led Pacific Command to change its mind about what type of missile was launched.

Why fire a Scud?

If Pyongyang decided to launch a missile to attract attention in advance of the Trump-Xi summit that starts tomorrow, it may have decided to launch some type of Scud because these are well tested and it could be relatively assured the launch would be successful. The missile may have been a Scud-ER like the four it launched simultaneously in early March.

The fact that it appears to have failed illustrates how uncertain the missile business can be.

Despite Trump’s Climate Rollbacks, Renewables Charging Full Steam Ahead

UCS Blog - The Equation (text only) -

President Trump’s recent Executive Order on Energy Independence is a cynical and dangerous assault on common sense policies to address climate change. His efforts will put Americans in harm’s way, and we must resist the president’s anti-science agenda at every turn. One of those turns is in our nation’s power sector, where the transition away from coal and toward cleaner, lower-carbon energy resources is well underway. Solar and wind power, especially, have experienced record growth in recent years, and there are multiple avenues—through utilities, states, corporations, and individuals—to keep the momentum going, with or without President Trump’s support.

It’s the market, stupid

Non-hydro renewable energy sources accounted for nearly 9 percent of our nation’s power supply in 2016, more than double 2010 levels. Since 2010, more than 86,500 megawatts (MW) of new wind and solar power capacity has come online, far more than their fossil fuel competitors added. In fact, 2016 marked the first year that more solar power capacity was installed—14,762 MW—than any other power source.

Much of this rapid development has been aided by state policies and federal incentives, but simple market economics is playing an increasingly important role. Costs for wind and solar have dropped so dramatically in recent years that a recent comparison of power sources shows new wind and solar to be cheaper than new fossil fuel generation. As a result, more and more utility planners are opting to add renewables—and close aging coal generators—based largely on economics.

Consider Xcel Energy’s recent announcement to build 11 wind projects in seven states, totaling 3,380 MW of new capacity. In a statement Xcel executive David Hudson said, “The decision to add additional wind generation is purely in the economic interest of our customers.”

New Mexico’s largest utility, PNM, also recently released an analysis showing that closing their San Juan coal plant would result in “long-term benefits for consumers” and provide “an opportunity to increase renewable energy production.”

And in Ohio, Dayton Power & Light announced in March it will close two coal plants because they “will not be economically viable beyond mid-2018.” The utility also plans to invest in at least 300 MW in new wind and solar projects over the next five years.

‘Yuge’ competition among states

In addition to today’s market forces, policy drivers have been—and will continue to be—critical to ensure the swift transition to a renewable energy economy. And with the Trump Administration laying waste to federal solutions, the onus on states to step up and deliver has never been greater. Fortunately, many states are rising to the challenge through increasingly stronger renewable electricity standards (RES).

Indeed, there is stiff competition brewing among states to be a national leader in terms of commitment to renewable energy development. Just a few years ago, having a target of 25 to 30 percent of its electricity coming from renewable sources would put a state among the pack of leaders. Today, six of the 29 states with existing RES policies have requirements of at least 50 percent, including Hawaii, which has set its sights on achieving 100 percent renewables by 2045.

During this legislative season, at least eight states have actively pursued significantly stronger targets. Among them are three states—California, New York, and Massachusetts—that are seeking to match Hawaii’s 100 percent target. Even in a more conservative state like Nevada, legislators are considering an increase in their RES from 20 percent to 50 percent by 2030.

If successful, these collective state actions will help ensure there is a robust market for renewables over the long term.

This Bud’s for you!

It’s not just states and prudent utilities that are driving the renewable energy revolution. Corporate demand for renewables is also a rapidly expanding market opportunity in the clean energy industry. In 2015, corporate power purchase agreements for wind outpaced new wind investments by utilities for the first time in the United States, according to the Rocky Mountain Institute (RMI). RMI also estimates that at least 60,000 MW of new wind and solar will be needed by 2025 to serve the US corporate market.

Competitive pricing and increasingly stringent sustainability goals are leading many of the largest U.S. (and global) corporations to invest directly in renewable energy. A recent Advanced Energy Economy survey found that nearly half of all Fortune 500 companies (and 70 percent of Fortune 100 companies) have set renewable energy or sustainability targets. Of this list, at least 23 corporations have set renewable energy goals of 100 percent, including giants like Amazon and Walmart.

Anheuser-Busch InBev, makers of Budweiser beer, has joined the growing list of companies committing to sourcing 100 percent of their power needs from renewable energy. Photo: Jack Snell CC BY-NC-SA 2.0

The latest multi-national company to make a 100 percent renewable energy commitment is Anheuser-Busch InBev, makers of Budweiser and Corona beers, among others. In rolling out its announcement, the company said, “We do not expect our cost base to increase. Renewable electricity is competitive with or cheaper than traditional forms of electricity in many markets.” We can all raise our glasses to that!

(Renewable) Power to the People

Citizens all across the country also have the power to stand up against the President’s climate rollbacks and demonstrate their support for renewable energy. Thanks to a combination of falling costs and state and federal incentives, solar PV installations in the residential sector have experienced steady growth over the last six years. At the end of 2016, there were 1.3 million solar households in the United States, more than twice the number from 2014! California leads all states with a 35 percent share of the solar PV market, but all states have solar homes and tremendous potential to grow.

What’s more, you don’t need to be a homeowner to get in on the renewable energy revolution. Community solar is an exciting and burgeoning option for consumers where investing in a rooftop system may not be a viable option. In addition, anyone can sign up for certified green power either through their utility’s green power pricing program (if they have one) or through a national green power marketer.

Despite President Trump’s misguided actions to undermine climate progress, we must keep pressing forward toward a clean and low-carbon energy future. Thanks to the emergence of wind and solar as affordable and reliable sources of power, we can.

The Importance of Public Funding for Earthquake Hazard Research in Cascadia

UCS Blog - The Equation (text only) -

In 2015, the New Yorker published “The Really Big One”, a story that brought public awareness to the dangers posed by the Cascadia subduction zone. The Cascadia subduction zone is a large fault that lies underwater, just off the coasts of Washington, Oregon, and Northern California. As a scientist and professor who researches this fault and its dangers, I really appreciated the large impact this article had in raising awareness of the importance of preparing for the next large earthquake here, especially among the many residents who live in this region. The New Yorker article, and plenty of ongoing scientific research, suggests that we need to prepare for the possibility of a major earthquake in this region—but we also need more research to help with this preparation.

Weighing the probabilities of earthquakes—room for uncertainty

Loma Prieta Earthquake damage on the Bay Bridge in California, 1989. Credit: Joe Lewis

The Cascadia subduction zone has the capacity for a magnitude 9.0 earthquake, the same size as the devastating Japanese earthquake that occurred in 2011. The 2011 Japan earthquake caused a large tsunami, widespread destruction, and an ongoing nuclear disaster. We expect the next great Cascadia earthquake will have similar effects, hopefully minus the nuclear disaster. This fault directly threatens the urban areas of Seattle, Washington and Portland, Oregon, in addition to the many more residents in rural and suburban areas of California, Oregon, and Washington. In a 2013 report, The Cascadia Region Working Group estimates that if a magnitude 9.0 earthquake were to happen in the near future in this region, “the number of deaths could exceed 10,000”, and “more than 30,000 people could be injured”, with economic losses “upwards of $70 billion”.

It is very difficult to predict when this next great Cascadia earthquake will occur. A recent report published by the U.S. Geological Survey estimates the probability of a magnitude 9.0 earthquake at roughly 10% in the next 50 years. The probability of a somewhat smaller, but still very destructive earthquake in the southern section of Cascadia (located just offshore, stretching from Cape Mendocino, CA to Florence, OR) is roughly 40% over the same timeframe. These probabilities are high enough to be scary—and to indicate the urgency of preparing for a major earthquake disaster in this region.
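
As a rough illustration of what such probabilities imply, a 50-year probability can be converted into an implied average recurrence interval under a memoryless (Poisson) assumption. Real hazard models for Cascadia are time-dependent, so this is only a back-of-the-envelope sketch:

```python
import math

def annual_rate(prob_over_window, years):
    """Annual event rate implied by an occurrence probability over a time
    window, under a memoryless (Poisson) model -- a deliberate
    simplification of real, time-dependent earthquake hazard models."""
    return -math.log(1.0 - prob_over_window) / years

# ~10% chance of a full-margin magnitude 9.0 in the next 50 years
rate_m9 = annual_rate(0.10, 50)
# ~40% chance of a southern-section event over the same timeframe
rate_south = annual_rate(0.40, 50)

print(f"implied M9.0 recurrence: ~{1 / rate_m9:.0f} years")        # ~475 years
print(f"implied southern recurrence: ~{1 / rate_south:.0f} years")  # ~98 years
```

The point of the exercise is scale: under this simplified model, a 10% chance over 50 years corresponds to an event averaging once every several centuries, while a 40% chance corresponds to roughly once a century.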

These probability numbers represent decades of scientific progress and breakthroughs in studies of fault behavior, but they are not as useful as they could be. What the public and emergency managers want to know is “Will a destructive earthquake occur in the next 50 years, or not?”. The best answer we currently have is these probabilities. What that really means is, “we don’t know, so prepare just in case”.

While the New Yorker article raised awareness, over time this fades and people go about their usual lives. It is really difficult to maintain vigilance making sure you are personally prepared for a major earthquake at all times for the next 50 years, especially when there’s a good chance nothing will happen. Therefore, it would be really great to put some more certainty in those probabilities. If we can revise these probabilities closer to 0% (no chance of an earthquake) or 100% (definitely going to be an earthquake) we can reduce uncertainty when planning for the future.

The public depends on earthquake research

EarthScope infrastructure across the United States. Credit: Jeffrey Freymueller

Increased certainty can only come from increased scientific understanding of this fault, and of the mechanics of faults in general, which are at best only partially understood. We are also monitoring this fault for long-term changes that might indicate a large earthquake is imminent.

Making progress improving earthquake forecasts for Cascadia is a multi-disciplinary research problem. Scientists like myself use techniques such as numerical models of friction on faults to study the rupture process, laboratory experiments to study fault behavior, field geology studies to look at the signatures of past earthquakes, and data-driven studies using multiple instruments planted all along the subduction zone.

The vast majority of these studies are publicly funded using federal funding from the U.S. Geological Survey and National Science Foundation. The instruments we use were placed as part of a major scientific initiative called Earthscope, which was featured by Popular Science as the #1 “Most Ambitious Experiment in the Universe Today”. Earthscope is funded completely by the National Science Foundation, and funding is scheduled to end soon. The future of the critical scientific instrumentation in Cascadia is currently uncertain. These instruments have been, and continue to be, vital in improving our understanding of the mechanics of the Cascadia subduction zone and the size and timing of the next large earthquake there.

Budget cuts and uncertainty have a large effect on this field. The U.S. Geological Survey, under the recently released Trump budget blueprint, is going to take a 15% cut. The National Science Foundation is not specifically mentioned in the blueprint, but the working assumption among scientists is a 10% cut. While the cuts certainly hinder our efforts to study the Cascadia subduction zone, even the uncertainty is a hindrance to this science, as funding proposals take 6 months or more to receive an answer because of budget uncertainty. For scientists to do our jobs and give emergency managers and the public the best available information, it is critical that we continue to receive federal research funding.


Noel M. Bartlow is an Assistant Professor in the Department of Geological Sciences at the University of Missouri. She is a geophysicist who studies slow earthquakes and frictional locking in subduction zones. She earned her Ph.D. in Geophysics from Stanford University in 2013, and completed a postdoctoral fellowship at the University of California–San Diego’s Scripps Institution of Oceanography before joining the University of Missouri faculty in 2016.  She is currently the principal investigator for the National Science Foundation funded project, “Collaborative Research: Improving models of interseismic locking and slow slip events in Cascadia and New Zealand.”

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.


Why Senator Lankford’s “BEST Act” Is Really the Worst for Federal Science

UCS Blog - The Equation (text only) -

A few weeks ago, Sen. James Lankford (OK) introduced legislation called the “Better Evaluation of Science and Technology Act,” or “BEST Act” for short. The proposal takes the scientific standards language from the recently updated Toxic Substances Control Act (TSCA) and applies it to the Administrative Procedures Act (which governs all federal rulemaking). Sen. Lankford claims the BEST Act would guarantee that federal agencies use the best available science to protect public health, safety, the environment, and more.

Nice sound bite, right?

In practice, though, this bill would cripple the ability of agencies like the Environmental Protection Agency (EPA) and the Consumer Product Safety Commission (CPSC) to rely on scientific evidence to issue public health and safety safeguards. It’s just as radical as the numerous other bills that would enable politics to trump science, making all of us more vulnerable to public health and environmental threats.

How this works in the real world

How would it do that? It’s simple really. If you look at the bill language carefully, it consists of significant legal jargon and imprecise language that any lawyer worth his or her salt could use to shut down science-based decision making at federal agencies by tying up the rule-making process in endless challenges.

Let’s take a look at lead. The science is clear here. There is no safe blood level of lead. According to the EPA, lead poisoning can cause slowed growth, lower IQ, behavior and learning problems, and more. In the 1970s, it became increasingly clear that lead exposure resulted in negative health effects.

Science is a critical component of policymaking, but Senator Lankford’s BEST Act is a solution in search of a problem.

Rather than accept the growing weight of this scientific evidence, the lead industry started to use manufactured counterfeit science to cast doubt on the impacts of lead exposure and the acceptable amount of lead in blood.

Now if the “BEST Act” had been the law of the land when the federal government began to regulate lead, the lead industry could have used this counterfeit science to challenge EPA regulations on the grounds of “degree of clarity” and “variability and uncertainty” (among other things), forcing the agency into endless litigation over settled science. This could have ultimately prevented the agency from limiting lead exposure, especially among vulnerable populations like children.

Likewise, the tobacco industry would have been able to cast doubt on the link between cigarettes and lung cancer.

The list goes on. Today, you can imagine the fossil fuel industry using the vague language to attack climate science as a justification for slowing down solutions that prevent global warming.

Heavy on problems, light on solutions

The ambiguity of the text should be enough to realize that this legislation is bad news for evidence-based decisionmaking. But there are several other issues with the legislation as well.

One major concern is subsection (h), which would result in an enormous resource drain for agencies at a time when budgets are decreasing. Agencies would be required to divert additional resources to make public a number of documents and information, which, as we know from our fight against the HONEST Act, costs time and money.

Another major issue is the fact that this legislation would freeze science standards the way they are right now, killing the innovation and flexibility that agencies have now to consider new forms of research in their decisionmaking. As agencies begin to regulate new technologies like autonomous vehicles, they need to have the ability to consider the most cutting-edge research out there, which might include new scientific methods and models.

More importantly for human health, as the EPA looks to implement the updated chemical safety law, it needs to have the ability to utilize the best and most up-to-date scientific and technical information without having to worry about being sued, which was the problem with the original TSCA bill. Under the original chemical safety law passed in the 1970s, the EPA could not even regulate asbestos, a known carcinogen, because industry kept suing the agency. If the BEST Act were to become law, we could expect more of the same.

A wasted opportunity

Agencies are already basing their policy decisions on the best available science. They have to. If an agency did not issue a public health protection or a worker safety standard based on strong evidence, then the agency would be challenged in court, and probably forced to vacate the regulation.

Instead of promoting legislation like the BEST Act, what Sen. Lankford could do to improve the use of science in policymaking is ensure that agencies like the EPA, CPSC, Department of Energy, and others, are well funded and have the resources necessary to fulfill their respective science-based missions.

There is no disagreement among anyone (well, almost anyone) that science has an important role to play in federal policymaking and that the decisions made by agencies to implement the Clean Air Act, the Endangered Species Act, the Consumer Product Safety Act, and others, all need to be rooted in the best scientific and technical information that is available. We all want science to help ensure that our health and safety are protected, that the drugs and medical devices we use are safe and effective, that the food we eat is free of disease, that our drinking water is clean, and more.

If anything, the BEST Act would take science out of the hands of scientists and put it into the hands of politicians, lawyers, and judges. Sen. Lankford’s legislation is misguided and simply a solution in search of a problem. While there is always more to learn about a scientific issue, the ideas in this proposal should not be used as an excuse to delay protecting the public from health, safety, and environmental threats.

Leak at the Creek: Davis-Besse-like Cooling Leak Shuts Down Wolf Creek

UCS Blog - All Things Nuclear (text only) -

The Wolf Creek Generating Station near Burlington, Kansas has one Westinghouse four-loop pressurized water reactor that began operating in 1985. In the early morning hours of Friday, September 2, 2016, the reactor was operating at full power. A test completed at 4:08 am indicated that leakage into the containment from unidentified sources was 1.358 gallons per minute (gpm). The maximum regulatory limit for such leakage was 1.0 gpm. If the test results were valid, the reactor had to be shut down within hours. Workers began running the test again to either confirm the excessive leak or determine whether it had been a bad test. The computer collects data over a two-hour period and averages it to avoid false indications caused by momentary instrumentation spikes and other glitches. (It is standard industry practice to question test results suggesting problems but accept without question “good” test results.)

The retest results came in at 6:52 am and showed the unidentified leakage rate to be 0.521 gpm, within the legal limit. Nevertheless, management took the conservative step of entering the response procedure for excessive leakage. At 10 am, the operators began shutting down the reactor. They completed the shutdown by tripping the reactor from 30 percent power at 11:58 am.
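
The two-hour averaging the plant computer performs can be illustrated with a toy sketch. The readings below are hypothetical, and the plant's actual algorithm is not public; the point is only that averaging a long window keeps a single instrumentation spike from producing a false alarm.

```python
def average_leak_rate(samples_gpm, limit_gpm=1.0):
    """Average a window of unidentified-leakage readings (gpm) and compare
    the result to the regulatory limit. A toy version of the two-hour
    averaging described above, not the plant's actual algorithm."""
    avg = sum(samples_gpm) / len(samples_gpm)
    return avg, avg > limit_gpm

# Hypothetical data: two hours of one-minute readings near 0.5 gpm,
# with a single momentary instrumentation spike.
readings = [0.5] * 119 + [6.0]
avg, exceeds = average_leak_rate(readings)
print(f"average {avg:.2f} gpm, exceeds 1.0 gpm limit: {exceeds}")
```

One spike in 120 samples barely moves the average, so a momentary glitch alone does not trip the limit, while a sustained rise in leakage would.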

Wolf Creek has three limits on reactor cooling water leakage. There’s a limit of 10 gpm from known sources, such as a tank that collects water seeping through valve gaskets. The source of such leakage is known and being monitored for protection against further degradation. There’s a stricter limit of 1 gpm from unknown sources. While such leakage is usually found to be from fairly benign sources, not knowing it to be so imposes a tighter limitation. Finally, there’s the strictest limit of zero leakage, not even an occasional drop or two, from the reactor coolant pressure boundary (i.e., leaks through a cracked pipe or reactor vessel weld). Reactor coolant pressure boundary leaks can propagate very quickly into very undesirable dimensions; hence, there’s no tolerance for them. Figure 1 shows that the unknown leakage rate at Wolf Creek held steady around one-tenth (0.10) gallon per minute during July and August 2016 but increased significantly in early September.
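
The three limits can be summarized as a simple check. This is illustrative only; the plant's actual technical specifications are far more detailed.

```python
def leakage_within_limits(identified_gpm, unidentified_gpm, boundary_gpm):
    """Apply the three leakage limits described above: 10 gpm from
    identified sources, 1.0 gpm from unidentified sources, and zero
    leakage through the reactor coolant pressure boundary."""
    return (identified_gpm <= 10.0
            and unidentified_gpm <= 1.0
            and boundary_gpm == 0.0)

print(leakage_within_limits(2.0, 0.10, 0.0))   # typical operation -> True
print(leakage_within_limits(2.0, 1.358, 0.0))  # the 4:08 am result -> False
```

Note the asymmetry: identified leakage gets a generous allowance, unidentified leakage a tight one, and pressure boundary leakage none at all.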

Fig. 1 (Source: Freedom of Information Act response to Greenpeace)

The reactor core at Wolf Creek sits inside the reactor vessel made of metal six or more inches thick (see Figure 2). The reactor vessel sits inside the steel-reinforced concrete containment structure several feet thick. The dome-shaped top, or head, of the reactor vessel is bolted to its lower portion. Dozens of penetrations through the head permit connections between the control rods within the reactor core and their motors housed within a platform mounted on the head. Other penetrations allow temperature instruments inside the reactor vessel to send readings to gauges and computers outside it.

Fig. 2 (Source: Nuclear Regulatory Commission)

Wolf Creek has 78 penetrations through its reactor vessel head, including a small handful of spares. Workers entered containment after the reactor shut down looking for the source(s) of the leakage. They found cooling water spraying from penetration 77 atop the reactor vessel head. The leak sprayed water towards several other penetrations as shown in Figure 3. Penetration 77 allowed a thermocouple within the vessel to send its measurements to instrumentation.

Fig. 3 (Source: Wolf Creek Nuclear Operating Corporation)

The spray slowed and then stopped as the operators cooled the reactor water temperature below the boiling point. Workers performed a closer examination of the leakage source (see Figure 4) and its consequences. The reactor cooling water at Wolf Creek is borated. Boric acid is dissolved in the water to help control the nuclear chain reaction in the core as uranium fuel is consumed. Once water leaked from the vessel evaporated, boric acid crystals remained behind, looking somewhat like frost accumulation.

Fig. 4 (Source: Freedom of Information Act response to Greenpeace)

The spray from leaking Penetration 77 blanketed many neighbors with boric acid as shown in Figure 5. The vertical tubes are made from metal that resists corrosion by boric acid. The reactor vessel (the grayish dome-shaped object on the left side of the picture) is made from metal that is considerably less resistant to boric acid corrosion. The inner surface of the reactor vessel is coated with a thin layer of stainless steel for protection against boric acid. The outer surface is only protected when borated water doesn’t leak onto it.

Fig. 5 (Source: Freedom of Information Act response to Greenpeace)

The white-as-frost blankets coating the penetrations indicated little to no corrosion damage. But rust-colored residue in the Figure 6 pictures is a clear sign of corrosion degradation to the reactor vessel head by the boric acid. It may not be déjà vu all over again, but it’s too much Davis-Besse all over again. Boric acid corroded the Davis-Besse reactor head all the way down to the thin stainless steel liner. The NRC determined Davis-Besse to have come closer to an accident than any other US reactor since the March 1979 meltdown at Three Mile Island.

Fig. 6 (Source: Freedom of Information Act response to Greenpeace)

Fortunately, the degradation appears much worse in the pictures than it actually was. In fact, fortune had an ally at Wolf Creek that was missing at Davis-Besse. Both reactors exhibited signs that reactor cooling water was leaking into containment. The indicated leak rates at both reactors were below regulatory limits, except for one anomalous indication at Wolf Creek. Managers at Davis-Besse opted to dismiss the warning signs and keep the reactor operating. Managers at Wolf Creek heeded the danger signs and shut down the reactor. It’s not that they erred on the side of caution—putting nuclear safety first must never be considered an error. It’s that they avoided making the Davis-Besse mistake of putting production ahead of safety.

Wolf Creek restarted on November 21, 2016, after repairing Penetration 77, removing the boric acid, and verifying no significant damage to other penetrations and the reactor vessel head. But they also conducted refueling activities—already planned to require 55 days—during that 80-day period. The NRC closely monitored the response to the leakage and its repair and found no violations.

Davis-Besse chose production over safety but got neither. The reactor was shut down for over two years, generating no revenue but lots of costly repair bills. The reactor vessel head and other components inside the containment extensively damaged by boric acid corrosion were replaced. Many senior managers at the plant and in the corporate offices were also replaced. And the NRC fined the owner a record $5,450,000 for numerous safety violations.

Nuclear Safety Snapshot

Figure 7 shows the reactor vessel head at Wolf Creek without any boric acid blankets and corrosion. But the image I’ll remember about this event is neither this picture, nor the picture of the hole in Penetration 77, nor the picture of the boric acid blankets on adjacent penetrations, nor the picture of rust-colored residue. It’s the mental picture of operators and managers at Wolf Creek who, when faced with Davis-Besse-like cooling water leak indications, responded unlike their counterparts by shutting the reactor down and fixing the problem rather than rationalizing it away. It’s an easy decision when viewed in hindsight but a tough one at the time it was made.

Davis-Besse made headlines, lots and lots of headlines, for exercising very poor judgment. Wolf Creek may not warrant headlines for using good judgment, but they at least deserve to be on the front page somewhere below the banner headline and feature article about today’s bad guys.

Fig. 7 (Source: Freedom of Information Act response to Greenpeace)

Nuclear Safety Video

Unfortunately, the picture of Wolf Creek responding well to a safety challenge is a snapshot in time that does not assure success in facing tomorrow’s challenges.

Fortunately, the picture of Davis-Besse responding poorly to a safety challenge is also a snapshot in time that does not assure failure in facing future challenges.

Nuclear safety is dynamic, more like a video than a snapshot. That video is more likely to have a happy ending when the lessons of what worked well along with lessons from what didn’t work factor into decision-making. Being pulled away from bad choices is helpful. Being pushed towards good choices is helpful, too. Nuclear safety works best when both forces are applied.

The NRC and the nuclear industry made quite the hullabaloo about Davis-Besse. Why have they been so silent about Wolf Creek? It’s a swell snapshot that could help the video turn out swell, too.

Another Delay of Chemical Safety Rule Is Dangerous and Unwarranted

UCS Blog - The Equation (text only) -

Last week was just chock full of setbacks and assaults on our public protections coming out of Washington. You’ve probably heard about President Trump’s all-out attack on climate policy; EPA Administrator Pruitt got right on it. No surprise there. Then there was EPA’s decision not to ban a pesticide clearly linked to serious and long-term developmental effects on children’s brains and cognitive function. But you may not have noticed another harmful decision coming out of the EPA, this one about its Risk Management Program (RMP) rule.

Maybe you have been fortunate enough NOT to have to worry about an explosion, fire, or leak from the over 12,000 facilities that use or store toxic chemicals in the U.S. But many of our families and communities, especially communities of color and low-income communities, are not so lucky.

In the last decade nearly 60 people died, approximately 17,000 people were injured or sought medical treatment, and almost 500,000 people were evacuated or sheltered-in-place as a result of accidental releases at chemical plants. Over the past 10 years, more than 1,500 incidents were reported causing over $2 billion in property damage.  According to whom? The EPA.  And these data don’t begin to capture the daily worry and anxiety of those living or working close to one of those facilities.

One would think that enhancing safeguards to prevent, prepare for, respond to, and manage risks of chemical accidents and releases from our nation’s most hazardous facilities would be a no-brainer. It’s not like we haven’t seen or read about catastrophic chemical incidents. Like the Chevron Richmond Refinery fire in 2012 that sent 15,000 people to the hospital for emergency treatment. Or the deadly explosion at the West Fertilizer Company in West, Texas in 2013 that killed 15 people and injured 200 more. Or the 2014 chemical spill in West Virginia that left thousands of residents and businesses without clean water.

I suspect the American public assumes our government views reducing the risk of chemical disasters as a critical priority. And it was making some progress.

The good

For years, community groups, environmental organizations, and labor groups had pressed and petitioned the federal government to adopt stronger measures to prevent chemical disasters.  Finally, and in the wake of several high profile incidents, President Obama issued an Executive Order (EO 13650) in 2013 directing the federal agencies to reduce risks associated with such incidents and to enhance the safety of chemical facilities. Updating EPA’s Risk Management Program rule (under the Clean Air Act’s chemical disaster provision) emerged as one of the top priorities for improving the safety of these facilities.  The EPA then embarked on a multi-year and rigorous process of public outreach, stakeholder engagement, formal requests for information, and notice and comment periods.  The outcome: an updated Risk Management Program rule that includes some common-sense provisions for covered facilities.  For example,

  • Investigating incidents that resulted in or could have resulted in a catastrophic release
    (a so-called “near miss”), including a root cause analysis;
  • Coordinating local emergency response plans, roles, and responsibilities, and conducting emergency response exercises;
  • Improving public access to chemical hazard information;
  • Engaging an independent third-party after a reportable accident to audit compliance; and
  • For three industries with the most serious accident records (oil refineries, chemical manufacturers, and pulp and paper mills), conducting a safer technology and alternative analysis to identify and evaluate measures that could prevent disasters.

The enhanced rule was scheduled to go into effect on March 14, 2017, with longer compliance periods for some provisions (as far out as 2022).

The bad

In March, the EPA issued a 90-day administrative stay, delaying implementation of the rule to June 19, 2017. This followed receipt of petitions from industry groups and several states requesting reconsideration of the rule. Who were these petitioners? Some pretty powerful stakeholders. The RMP Coalition, whose members are … wait for it… the American Chemistry Council, the American Forest & Paper Association, the American Fuel & Petrochemical Manufacturers, the American Petroleum Institute, the U.S. Chamber of Commerce, the National Association of Manufacturers, and the Utility Air Regulatory Group. Another petition came in from the Chemical Safety Advocacy Group (CSAG), made up of companies in the refining, oil and gas, chemicals, and general manufacturing sectors. Then came a third petition from 11 states, including Texas and West Virginia.

The ugly

Stay it again. Attentive to these industrial interests, Mr. Pruitt’s EPA last week proposed a further delay of the effective date of the RMP amendments to February 19, 2019. So, having waited years for enhanced chemical safety and security safeguards, and after the already lengthy and extensive public process required for rule-making, communities and families at risk of chemical disasters will now have to wait almost another two years while the agency reviews and reconsiders the Risk Management Program amendments. This delay essentially buys the agency more time to figure out how to redo the rule or repeal it completely. Call me crazy, but I just don’t see the delay resulting in a rule that gets stronger and further strengthens public safety. The regulated community doesn’t want that to happen, and it has a bigger war chest and easier access to regulators and decision makers than the public interest community does.

Unleashing the power of the (little) people

But here’s what we, the people, do have. We have voice. We have votes. We have on-the-ground stories to tell. We also have local leaders, emergency responders, workers, and school teachers who can attest to the dangers and the need.

The EPA is holding a public hearing as part of its reconsideration on April 19, 2017 in Washington, DC.  And it is taking written comments until May 19, 2017.   No comment or story is too short or too unimportant to tell.  And EPA has to consider all comments as it fashions its response. Tell them a further delay is dangerous, unnecessary, and unconscionable.

You can submit written comments electronically to Docket ID No. EPA-HQ-OEM-2015-0725. These written comments can be accompanied by multimedia submissions (i.e., video, audio, photos—like maybe of your kids?). While you’re at it, send a copy of your comments to your federal representatives to let them know that you expect their support for strong chemical safety rules and resistance to any effort to roll back these and other public protections.

One of the facilities in question may be in your neighborhood – or near those you love.  You might not even know. But even if you’re fortunate enough to be some distance away and relatively safe from a chemical explosion, fire, or spill disaster, we all have a stake in public safety and health.  And know that UCS will be there with you.



Trump Administration Claims ‘No Evidence’ Afterschool Programs and Meals Work. Actually, There’s Plenty.

UCS Blog - The Equation (text only) -

When I sat down with Dr. Jacqueline Blakely to talk about her afterschool program at Sampson Webber Academy in Detroit, our conversation was interrupted. A lot. Parents dropped by to talk about their kids, kids dropped in to talk about their days, and the phone rang like clockwork. It didn’t take long for me to understand that there was something really good going on in this classroom.

“The kids get a hot supper, followed by homework help and an academic hour focused on math and science, and then enrichment—that’s when they do projects,” Dr. Blakely explained. “They’re on ‘fun with engineering’ now, but we’ve done a cooking class, learned how to put a car together, and soon we’ll get to do the NASA challenge. That’s when the kids build an underwater robot and send it through an obstacle course.”

With that in mind, maybe you’ll understand why I winced when I heard White House Office of Management and Budget director Mick Mulvaney’s comments to the press about afterschool programs and the meals they provide. “They’re supposed to be educational programs, right? That’s what they’re supposed to do. They’re supposed to help kids who don’t get fed at home get fed so they do better in school,” he said. “Guess what? There’s no demonstrable evidence they’re actually doing that.”

Omia and Orari participate in the afterschool program at Sampson Academy in Detroit. Photo: Jacqueline Blakely

Sampson is a 21st Century Community Learning Center (CCLC), a grant-funded program providing 1.8 million children in high-poverty areas with academic, STEM, and cultural enrichment activities during out-of-school hours, as well as snacks and hot meals. According to the budget blueprint released by the Trump administration last month, funding for these programs is set to be eliminated.

But make no mistake—it’s not because they don’t work for kids.

On the contrary, the most recent national performance data for the 21st CCLC program revealed substantial improvements in both student achievement and behavior. Combined state data indicated that over a third of regular attendees (36.5 percent) achieved higher grades in mathematics through program participation, and a similar number (36.8 percent) achieved higher grades in English. Teachers reported that 21st CCLC students increased homework completion and class participation by nearly 50 percent, and over a third (37.2 percent) demonstrated improvements in behavior. Research from the Global Family Research Project supports the conclusion that sustained participation in afterschool programs can lead to greater academic achievement, improved social skills and self-esteem, decreased behavioral problems, and the development of positive health behaviors.

“Kids are getting experiences that schools like ours don’t have the money to provide,” says Dr. Blakely. “I have kids that walk two and three miles home afterwards because the bus doesn’t stay that late. They do that all winter long—that says a lot about this program.”

And about the meals—I don’t mean to insult anyone’s intelligence, but how much data do you need to prove that proper nutrition is important for learning and development?

From a 2014 report from the Centers for Disease Control and Prevention titled Health and Academic Achievement: “Hunger due to insufficient food intake is associated with lower grades, higher rates of absenteeism, repeating a grade, and an inability to focus among students.” In addition to academic outcomes, food insecurity negatively correlates with measures of health status, emotional wellbeing, productivity, and behavior among school-aged children. There are scores of studies linking nutritional status with academic performance among youth.

Contrary to common assumptions about who is served by federal assistance programs, these issues don’t just affect students in urban areas like Detroit. Food insecurity affects 16 million children across the United States, and of U.S. counties with high child food insecurity rates, a majority (62 percent) are rural. Stripping funding from 21st CCLC programs will be felt deeply in many underserved communities, among them considerable segments of Trump’s own voter base.

I asked Dr. Blakely about her response to the proposed funding cuts. “It upsets me. It further marginalizes kids that are already marginalized, and it makes a bigger gap between the poor and the wealthy.” She paused. “It makes me angry, too. You already acknowledged that they don’t get food at home—so you know they need it. Why would you stop a program that feeds children?”

What this comes down to, regrettably, is yet another display of the administration complacently setting aside the needs of low- and middle-income families, urban and rural alike, to pursue its own agenda.  Afterschool programs may not work for the president’s budget, but there’s no question that they work for kids.

Solar Panels vs. Gromdars—Battle of the Century, or No Contest?

UCS Blog - The Equation (text only) -

Two technologies have been going head-to-head to capture the public’s imagination. Both represent wholly new ways of doing things, and both hold tremendous potential. But what’s the reality behind the headlines? Which one really deserves the limelight?

As with so many issues, facts and data are the way to find out. So, here you have it: Solar Panels versus Gromdars, 2017 edition.

Solar’s costs keep falling; can’t tell about Gromdars.

One obvious point of comparison between products such as solar and Gromdars is the cost trajectory for each.

Solar has made incredible strides in just the last few years. The costs of residential solar systems fell by more than half from 2009 to 2015, and fell another 17% last year. That’s incredibly good news for would-be customers.

Solar’s costs keep coming down (Data source: LBNL 2016).
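Those two declines compound multiplicatively rather than add. Here is a quick back-of-the-envelope sketch in Python, treating “more than half” as exactly 50 percent—an illustrative assumption, not a figure from the LBNL report:

```python
# Compound two sequential price declines multiplicatively.
# "Fell by more than half" is treated as exactly 50% here, which is
# an illustrative assumption rather than the actual LBNL figure.
decline_2009_2015 = 0.50  # fractional decline, 2009-2015
decline_2016 = 0.17       # fractional decline during 2016

# Fraction of the 2009 cost still remaining after both declines
remaining = (1 - decline_2009_2015) * (1 - decline_2016)
total_decline = 1 - remaining

print(f"{total_decline:.1%} total decline since 2009")  # 58.5% total decline since 2009
```

By this rough math, a residential system costs well under half of what it did in 2009, and the true drop is larger, since the first decline was “more than” half.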

In terms of data on Gromdars’ cost trajectory, all we’ve got is this, from the inventor:

‘The question isn’t whether you can afford to buy a Gromdar; the question is whether you can afford not to.’

Actually, the question is how much they cost, and whether they’re getting cheaper. I really think we’re going to need something more specific than that, data-wise, if we’re going to build it into our economy in a meaningful way.

Solar keeps spreading; Gromdars…?

Scale and costs go hand-in-hand. Gromdar, it seems, has a goal to “not only put a Gromdar in each home, but in each room of each home.” The company’s initial surge reportedly saw it “selling thousands of Gromdars.” There are no more recent public numbers, though, to back up any ongoing claims of success.

So solar is apparently way ahead on that score. Last year, another 370,000 solar systems got installed in the US, mostly on homes. The 2016 additions brought the residential total to almost 1.3 million solar households—more than double the total from just two years earlier.

Another home goes solar (Credit: J. Rogers)

And real-world experience bears out that sense of the relative progress of the two technologies. Think about the people you know who have gone solar, the houses you’ve passed by with solar on their roofs, the stories of people feeling empowered by solar. My kids and I make a game of pointing out every solarized home as we drive around areas where it’s taken hold.

Gromdars? Not so much. My kids almost never shout out from the back seat, “Look, Daddy — a Gromdar!”

Everyone loves solar; Gromdars are… working on it.

While there are some indications of demand for Gromdars, there are also indications that they aren’t a slam dunk, as far as the market is concerned:

Responding to concerns about the marketability of home Gromdars, the tech entrepreneur acknowledged that most new products face resistance from the consuming public when they are first introduced.

Solar, on the other hand, I think it’s fair to say, has seldom faced public resistance. That is, people have always loved the idea of a beautifully silent space-age technology calmly churning out electrons for each of us whenever sunlight comes around. It was just a question of affordability.

And now solar is the most popular energy option around: According to the Pew Research Center, 89% of Americans would support getting more of our energy from the sun.

Everybody loves solar! (Source: Pew Research Center 2016)

Pew apparently didn’t include Gromdars in its survey, though we can hope they remedy that oversight.

Solar means jobs; no news on Gromdars.

Employment prospects are another reason that someone might love one or both of these technologies. Alas, no indication of how Gromdar jobs are faring.

Solar, though, is doing amazing things in that department. The 2016 solar jobs census found that solar employment increased by 25 percent last year, to 260,000 people. Solar accounted for a stunning one out of every 50 new jobs in 2016. That’s real progress.

And the winner is…

It’s not always a good idea to compare two such interesting technologies head-to-head like this. But if we do (and we have), we find (and have found) that—sorry, Gromdar—solar wins, hands down.

Don’t you, though, put your hands down. Put them up, in celebration, and cheer solar’s progress.

At a time when many key decisions in Washington about our climate and energy future seem like jokes, not serious policy, we’d be fools not to.

Happy April 1.



Photo: Black Rock Solar/CC BY 2.0, Flickr

Scientific Integrity Policies Do Not Make Agencies the Fact Police

UCS Blog - The Equation (text only) -

Recently, the Sierra Club filed a complaint with the EPA Inspector General alleging that EPA Administrator Scott Pruitt violated the agency’s scientific integrity policy by making false statements about climate change science. Reuters is reporting that the IG has referred the complaint to the agency’s scientific integrity official. But the EPA should proceed carefully in deciding whether to consider this as an issue that is subject to the agency’s scientific integrity standards.

Scientific integrity policies exist to prevent political interference in the process by which science informs decision making. They exist to protect the rights of scientists to communicate about their work and to prevent the manipulation and suppression of scientific evidence throughout the policymaking process.

The policies were not, however, created to fact check every statement made by a public official. They were developed in response to overt political interference in science that became common during the George W. Bush administration. Scientists were censored. Official scientific reports were altered by political appointees. Testing processes were changed to suggest that unsafe products were really safe after all. These are the types of actions that are most deserving of scrutiny.

It is tempting to want to punish public officials for lying about established science. But the scientific integrity policies do not serve this function, and for good reason. If scientific integrity officials were expected to become de facto fact police, they would spend all of their time looking at these kinds of allegations, and have little time left over to investigate actions that can have the most significant effects on science-based decision making.

To be clear, what Scott Pruitt said on CNBC was bananas. It’s unacceptable for the head of the Environmental Protection Agency to make such patently false statements. But attempting to punish the administrator under the scientific integrity policy isn’t the right approach, and could even distract public attention and agency investigative resources away from the real damage that the Trump administration is doing to our collective ability to meet the challenge of climate change and protect public health and safety.

Disregarding Science, Trump Administration Trades Kids’ Brains for Dow Profit

UCS Blog - The Equation (text only) -

At the risk of exhausting you with more evidence of the Trump administration’s contempt for science and the public interest, here’s another assault. After years of study and deliberation by scientists at the US Environmental Protection Agency (EPA) and elsewhere, new EPA head Scott Pruitt announced Wednesday night that he would not ban a pesticide that poses a clear risk to children, farm workers, and rural drinking water users.

In doing so, the administration made a 180-degree turn, handing a win to the pesticide’s maker, Dow AgroSciences (a subsidiary of the Dow Chemical Company) and a loss to pretty much everyone else.

An about-face on the science

Let’s be clear, the EPA doesn’t just regulate chemicals willy-nilly. It usually has to be pushed, sometimes hard. And in this case it was. Tom Philpott at Mother Jones has an excellent rundown of the years-long saga surrounding the nerve-damaging organophosphate insecticide chlorpyrifos at the EPA. Under a court order, EPA proposed in November 2015 to effectively ban this pesticide by revoking the agency’s “tolerances” (legal limits allowed in or on food) for the chemical:

At this time, the agency is unable to conclude that the risk from aggregate exposure from the use of chlorpyrifos meets the safety standard of section 408(b)(2) of the Federal Food, Drug, and Cosmetic Act (FFDCA). Accordingly, EPA is proposing to revoke all tolerances for chlorpyrifos.

When the EPA gets that close to banning a pesticide, you can bet the science is solid. So it’s shocking that, under another court-imposed deadline to finalize its decision this month, the agency’s new science-denier-in-chief abruptly backtracked, suggesting in his statement that the science of chlorpyrifos’s harmful effects isn’t settled.

That claim is disingenuous.

Chlorpyrifos poses a clear-cut risk to children, farmworkers, and rural residents

Chlorpyrifos has been studied extensively, and for years. Once the most commonly used pesticide in US homes, it has been increasingly regulated over the last two decades as scientific evidence of its harm has mounted. Almost all residential uses were eliminated in 2000 based on evidence of developmental neurotoxicity—that is, the chemical’s ability to damage the developing brains of fetuses and young children. Since then, many on-farm uses have also been restricted or banned.

But it’s not enough. The pesticide is still used on corn, soybeans, fruit and nut trees, certain vegetables including Brussels sprouts and broccoli, and other crops. And it’s still harming kids and workers.

Last year, researchers studying mothers and children living in the agricultural Salinas Valley of California documented that just living within a kilometer of farm fields where chlorpyrifos and other neurotoxic pesticides were used lowered IQs by more than two points in 7-year-old children, with corresponding impairment in verbal comprehension. Other studies have found that exposure in the womb is associated with changes in brain structure and function. Farm worker exposure is also a concern, as is exposure of rural residents through drinking water.

Which brings us back to the regulatory battle. Last fall, a coalition of environmental, labor, and health organizations petitioned the EPA to ban all remaining uses of chlorpyrifos, citing unacceptable risks to workers. In November, the EPA inched closer to a ban, revising its human health risk assessment and drinking water exposure assessment for chlorpyrifos. The agency summarized its conclusions this way:

This assessment shows dietary and drinking water risks for the current uses of chlorpyrifos. Based on current labeled uses, the revised analysis indicates that expected residues of chlorpyrifos on food crops exceed the safety standard under the Federal Food, Drug, and Cosmetic Act (FFDCA). In addition, the majority of estimated drinking water exposure from currently registered uses, including water exposure from non-food uses, continues to exceed safe levels, even taking into account more refined drinking water exposure. This assessment also shows risks to workers who mix, load and apply chlorpyrifos pesticide products. (emphasis added)

The proposed ban was supported by independent scientists and a coalition of Latino, labor, and health organizations including the United Farm Workers.

Oh yeah, and it was supported by the science and the federal law meant to protect children from toxic pesticides.

EPA is legally required to ban pesticides that threaten health

In 1993, the National Academy of Sciences released a landmark report titled Pesticides in the Diets of Infants and Children. Based on a five-year study, the report recommended major changes in the way EPA regulated pesticides in order to protect children’s health, noting that children are not “little adults.” Three years later, Congress acted on those scientific recommendations, passing the Food Quality Protection Act of 1996 unanimously (yes, I said unanimously, can you imagine?).

This breakthrough law mandated that the EPA go above and beyond what it had ever done before in considering the developmental susceptibility of infants and children, and their dietary habits, when making regulatory decisions about pesticides. The law built in a 10-fold “safety factor” to be sure kids would be protected.

Of course, children are only protected if the EPA follows the law and the science. And Dow kept the pressure on to ensure they wouldn’t. For now, Pruitt’s announcement represents “final agency action” on chlorpyrifos, and the EPA won’t be required to revisit the question of the pesticide’s safety until 2022. (Sorry, kids.)

How else might the Trump administration undermine science and children’s health?

This latest decision, along with the proposed slashing of EPA’s budget, leaves me wondering just how far the new administration will go in ignoring science and undoing children’s health protections. While the EPA budget cuts are getting a lot of attention, some health scientists are worried about the fate of the National Institute of Environmental Health Sciences (NIEHS) as well. In partnership with EPA, NIEHS operates a national network of research centers studying children’s environmental health and educating the public about risks. If funding for those centers is also cut, who will look out for the health of children?

I spent years back in the late 90s and early aughts pressing Congress and the EPA to tighten the rules on toxic chemicals. We’re still not where we need to be, but we’ve made progress. And now it looks like that progress is very much at risk.


Richard Leeming/Flickr

Survey Shows Abundant Snow, But Will it Stick?

UCS Blog - The Equation (text only) -

Today’s snow survey confirms abundant snow in the Sierra Nevada, an extreme turn from five years of drought. With climate change contributing to warmer winters in the Sierra Nevada, that snow may not stay put for long: an early snowmelt will cause flooding and require reservoirs to spill excess water, which could threaten the safety of California dams in the weeks to come.

As a consequence, Los Angeles Mayor Eric Garcetti recently declared a state of emergency for the small town of Owens Valley, located in the foothills of the Sierra Nevada nearly 300 miles from LA, where much of the city’s water supply originates. The mayor is worried that the melting mountain snowpack could flood the Owens Valley and overwhelm the LA Aqueduct, causing up to $500 million in damage.

Essentially, we have a timing problem. Our increasingly outdated water system relies on high elevation dams designed to fill slowly with snowmelt over the spring and summer, delivering water in the summer and fall when water demands are highest.

Today’s water system is experiencing earlier snowmelt that is filling reservoirs and forcing excess water to be spilled, leading to the type of flooding and infrastructure damage that we witnessed in Oroville and San Jose over the past two months.

This winter may be a sign of what’s to come. The newest climate science research from Dr. Alex Hall of UCLA concludes that peak snowmelt, which has historically occurred in April, will shift to January by the end of this century. Our current system of dams, reservoirs, and levees is not prepared to handle an extreme shift like that and will fail to deliver anywhere near the quantity of water it does today.

That’s why we must develop a more climate-resilient water system – one that makes more effective use of groundwater storage — a critical strategy to adapting to the loss of snowpack and the shift in the timing of water supply.

California’s underground aquifers have three times the storage capacity of all of the state’s above-ground reservoirs. With global warming increasingly causing precipitation to fall as rain rather than snow, it’s time to stop looking up to the mountains for our water supply and start paying attention to the larger storage capacity underneath our feet.

President Trump’s Executive Orders Promise Energy Independence, But Deliver Trouble

UCS Blog - The Equation (text only) -

As President Trump and the Republicans on Capitol Hill are quickly learning, developing real public policy is a lot more complicated than repeating popular slogans to excited fans on the campaign trail.  And this holds true not just for health care, but for taxes, energy, environmental and transportation policy.  Earlier this week President Trump signed an executive order, instructing agency heads to take several steps toward “promoting energy independence and economic growth.”

Energy independence and economic growth sound like good goals—just like everyone wants health care insurance with better coverage, more competition, and lower premiums.  But since the campaign is over and the work of actual policy-making is getting underway, let’s consider how the measures proposed in this executive order and other recent actions stack up against the promises.

Looking for energy independence in the wrong places

My colleague Rachel Cleetus reviewed the broad implications for the planet of President Trump’s All-Out Attack on Climate Policy. I’ll focus on the transportation specific implications.  President Trump’s executive order talks about “energy independence,” but, in reality, the Clean Power Plan that the President’s order seeks to repeal focuses on electricity generation.

Virtually none of the resources used to make electricity — coal, natural gas, nuclear and renewables — are imported.  The United States is a net exporter of coal, and imports a trivial share of its natural gas, mostly from Canada.  Wind and solar energy, meanwhile, are clean, non-depleting domestic resources.  That means that electricity is about 99 percent domestically produced.  The vast majority of our energy imports are oil and the Clean Power Plan has nothing to do with oil.  Eliminating the Clean Power Plan will have no impact on energy independence.

Historical data and projections from the Energy Information Administration’s Annual Energy Outlook 2017 show that the US does not import coal, and imports very little natural gas. Oil has been and is expected to remain the main energy import.

Oil Imports = Oil Use – Oil Production

It’s simple arithmetic, but the Trump administration seems to have forgotten that the amount of energy (mostly oil) the United States imports depends on how much oil Americans use, less the amount the nation produces.  So we can reduce imports either by using less oil or by producing more.  Of the two options, using less oil solves a lot of other problems; producing more causes more problems than it solves.
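
The identity is trivial, but worth stating precisely. A minimal sketch (the function name and the example values are illustrative, not official EIA figures):

```python
def net_oil_imports(use: float, production: float) -> float:
    """Net imports = consumption minus domestic production.

    A negative result means the country is a net exporter.
    """
    return use - production

# Illustrative values only (millions of barrels per day):
print(net_oil_imports(19.0, 12.0))  # a net importer of 7.0 million barrels/day
print(net_oil_imports(10.0, 12.0))  # a net exporter (prints -2.0)
```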

Cutting oil use through efficiency is the smart path to energy security

Cutting demand for oil is a long-term process, because we all have places to go and, on any given day, we don’t have an unlimited set of choices for transportation.  Over the last decade, the United States has made major progress improving the efficiency of the cars we drive.  A decade ago an average new car got about 20 miles per gallon (mpg); that figure is 25 mpg today, and we are on the road to new cars averaging 35 mpg or more a decade from now.  This means that while an average new car used 600 gallons of gas a year in 2005, a new car today uses 480 gallons to drive the same distance, and that figure will fall to less than 350 gallons in 2025.
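
The gallon figures above follow directly from the mileage math. Here is a sketch of that arithmetic, assuming roughly 12,000 miles driven per year (an assumption of mine, not a figure from the post):

```python
ANNUAL_MILES = 12_000  # assumed typical annual mileage, for illustration only

def annual_gallons(mpg: float) -> float:
    """Gallons of gasoline burned per year at a given fuel economy."""
    return ANNUAL_MILES / mpg

for year, mpg in [(2005, 20), (2017, 25), (2025, 35)]:
    print(f"{year}: {mpg} mpg -> {annual_gallons(mpg):.0f} gallons/year")
```

With that mileage assumption, the script reproduces the figures in the text: 600 gallons a year at 20 mpg, 480 at 25 mpg, and under 350 at 35 mpg.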

With similar improvements in the efficiency of big rigs, planes and other vehicles, this adds up to substantial oil savings as the current inefficient fleet is replaced by more efficient cars, trucks and planes.  Efficiency improvements don’t just help reduce oil use, they save drivers billions of dollars and reduce global warming pollution.

But that’s only if our efficiency programs are fully implemented. Instead, the Trump administration has signaled its intention to weaken our federal fuel efficiency and vehicle emission program. Weakening these standards would cost drivers more money, increase our consumption of oil and hurt energy independence, as well as increasing global warming pollution.

Every additional electric vehicle cuts oil use, energy imports, and slows climate change

Replacing a typical 2015 25 mpg car with a 35 mpg car in 2025 saves 130 gallons a year.  But replacing it with a plug-in electric vehicle cuts that car’s oil use to zero.  And since electricity is 99 percent domestic, the impact on energy imports is dramatic.  In addition, electric vehicles are less polluting than gasoline-powered cars, even when electricity generation is included, and they are getting steadily cleaner over time.  The smartest path to energy security, as well as to a low carbon future, is to electrify transportation as quickly as possible.
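
The per-vehicle comparison can be sketched the same way, again assuming about 12,000 miles driven per year (my assumption, not the post’s):

```python
ANNUAL_MILES = 12_000  # assumed annual mileage, for illustration only

def annual_gallons(mpg: float) -> float:
    """Gallons of gasoline burned per year at a given fuel economy."""
    return ANNUAL_MILES / mpg

# Gap between a 25 mpg car and a 35 mpg car, versus an EV that burns no gasoline
efficiency_savings = annual_gallons(25) - annual_gallons(35)
ev_savings = annual_gallons(25)
print(f"35 mpg car saves ~{efficiency_savings:.0f} gal/yr; an EV saves all {ev_savings:.0f} gal/yr")
```

The exact savings depend on miles driven; at 12,000 miles the efficiency gap works out to roughly 140 gallons a year, the same ballpark as the post’s 130-gallon figure, while the EV eliminates all 480 gallons.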

US oil production increased rapidly in the last decade, so what problem are we trying to solve?

The administration claims that Obama-era policies have choked off the oil industry, but this does not square with the facts.  Domestic oil production grew 80 percent between 2005 and 2015, almost entirely because of expanded production of so-called tight oil from fracking, which now accounts for more than half of US oil production.  US oil production fell in 2016 because of low oil prices, and future domestic production depends mainly on the global price of oil, rather than on regulations.  Indeed, oil companies’ financial statements make it clear that recent and proposed environmental regulations have “no material impact” on their business.  What matters is global energy prices.

The Energy Information Administration projects that if oil prices rise enough to bring gasoline retail prices to $5 per gallon, the U.S. may indeed become a net oil exporter as consumption falls and production rises.  But if oil prices are low, imports will rise.  If you are worried about Americans struggling to pay their fuel bills, investments in efficiency will do much more to protect them from volatile oil prices than will weakening regulations that protect the public from oil industry pollution.  And while many factors influence global oil prices, cutting demand for oil by accelerating the progress of efficiency and electrification will certainly help push oil prices down and protect consumers.

The administration’s proposals have nothing to do with responsible energy production

The term “energy independence” is never defined in the executive order, but the emptiness and cynicism with which it is used are clear.  This order, together with other recent energy-related measures the administration is advancing, allows oil and gas producers to waste natural gas instead of collecting it, weakens fuel efficiency standards, and permits construction of a pipeline that will encourage expansion of some of the dirtiest crude in the world.  These measures will harm many people and set back efforts to reduce global warming pollution.  They primarily remove energy producers’ and automakers’ obligations to consider the consequences of their actions on climate change, and they will not reduce US energy imports.

Real energy security means energy that does not harm our security (or health or economy)

Energy in many different forms is essential to our lives, but just because energy is important does not mean that energy companies should not be responsible for minimizing the harms caused by the production and use of their products.  Climate change poses a profound threat to our health, prosperity, and security, so meaningful energy security must include a path to climate stabilization.  Transportation recently surpassed electricity generation as the largest source of CO2 emissions in the United States, and these emissions come overwhelmingly from oil.  Cutting transportation emissions means using less oil, through improved efficiency and rapid electrification of transportation.  Transportation fuel producers also have an important role to play, and oil companies no less than biofuels or electricity producers must reduce the pollution from their operations.

Companies and countries that lead the way towards a low carbon future will have a competitive advantage as the world inevitably moves to grapple with climate change.  The winners will be vehicle manufacturers that produce the most efficient vehicles and lead the way towards electrification, and energy companies that avoid the most polluting fossil fuels and reduce avoidable emissions from their operations.  Smart policies will help American companies lead the way, but the short-sighted regulatory rollback the Trump administration is pursuing will leave American industry uncompetitive.  Responsible industries understand that protecting their customers and the communities in which they operate is key to maintaining their social license.  While the Trump administration is actively facilitating irresponsible behavior, the world is watching.  The future will ultimately and inevitably favor companies that live up to their responsibilities.

EPA Chief Scott Pruitt Ignores the Science on Pesticides, Puts Children at Risk

UCS Blog - The Equation (text only) -

The appointment of Scott Pruitt as EPA administrator in the Trump administration worried a lot of people like me because of his long history of attacking the work of the very agency he is now leading. It has only been a few weeks, but one pattern is already emerging: Mr. Pruitt misstates the scientific evidence while overstating the gaps in the work of the agency’s scientists as an excuse for inaction.

Pruitt refuses to regulate pesticides that impact child development

Yesterday, Mr. Pruitt denied a long-standing petition by public interest groups to restrict the use of pesticides containing chlorpyrifos, a chemical whose health impacts include long-term, irreversible effects on children’s brain development.

Pruitt’s action overturns the decision made by the EPA last year to protect children from developmental delays caused by exposure—in food and in water—to residues of this commonly used pesticide. The analysis of risk to children by agency and academic scientists has been reviewed and re-reviewed and is supported by a wide range of scientists from academia and research institutions.

So what was Administrator Pruitt’s conclusion in one of his first official actions? He uses telling phrases in his press release:

EPA needs to provide “regulatory certainty,” which apparently means do nothing.

And the EPA must return to using “science in decision-making – rather than predetermined results.” Given that the science in this case was well reviewed and that the petition was under consideration for years, it seems the only result that was predetermined is that Pruitt would side with industry groups that have consistently resisted regulations to restrict the use of this pesticide.

The press release incorrectly asserts that there are serious scientific concerns and that the studies on risks were misapplied. But that is not what the scientists said, so apparently Mr. Pruitt is overruling the evidence in making his decision. Maybe “sound science” is Pruitt pseudo-science.

Play it again Scott

It was only a few weeks ago that Administrator Pruitt, speaking on the role of CO2 emissions in changing our climate, had this to say:

“I think that measuring with precision human activity on the climate is something very challenging to do and there’s tremendous disagreement about the degree of impact, so no, I would not agree that it’s a primary contributor to the global warming that we see. But we don’t know that yet, we need to continue the debate we need to continue the review and analysis.”

Once again, Mr. Pruitt attempts to sow seeds of doubt on the scientific consensus of human-caused climate change, just like with pesticides. But as my colleague Brenda Ekwurzel points out, he is just wrong on the science.

Misstating the scientific evidence is just that, falsifying the facts. And it is not an excuse for inaction.

Mr. Pruitt, your job—by law—is to protect the public health and safety. Please do it.

Climate Enters Uncharted Territory—But We Can Prepare for the Risks Global Warming Brings

UCS Blog - The Equation (text only) -

The World Meteorological Organization recently released its State of the Global Climate for 2016. There was a wealth of information in it: a new temperature record (approximately 1.1 °C above the pre-industrial period and 0.06 °C above the previous record set in 2015), new CO2 highs (400.0 ± 0.1 ppm in the atmosphere at the end of 2015), unprecedented global sea-ice lows (more than 4 million km2 below average in November), and record-setting global sea level values in early 2016 (with plenty of coral bleaching and acidification).

To top it all, news also broke that sea ice in both the Arctic and the Antarctic had reached record-low extents (the winter maximum in the Arctic and the summer minimum in Antarctica).


State of the Climate 2016, WMO 2017.

“Truly Uncharted Territory”

All of those facts are eye-catching and concerning enough, but they were not the headline-makers. That honor went to the line “truly uncharted territory.” It describes the situation perfectly: we have no exact grasp of all the potential consequences because we have never been here before. Yes, we know that temperatures will keep going up, but that does not mean we know all of the effects those higher temperatures can have. Still, we could take a few guesses, and we’d probably be right.

Among the more intuitive consequences: more extensive and longer-lasting wildfires (check). More extensive and longer-lasting droughts (check). More frequent and heavier downpours (check). More severe floods (check). Among the not-so-intuitive: an increase in the frequency and intensity of winter storms (check). An increase in the intensity, frequency, and duration of Atlantic hurricanes (check). An increase in health problems related to the rise in ozone that comes with warming (check). There are more, but you get the idea.

We must be prepared for a range of climate-induced risks

So how can we prepare for the possible outcomes of global warming? The answer is actually pretty simple: by preparing for a range of risks. Just as home insurance covers a variety of risks (but, for the record, does not cover flooding), so should climate preparedness. And just as in a home we can do things to reduce risk (sprinklers, handrails, door locks), we can do the same in the case of global warming.

Disaster risk reduction is a concept that has been around for quite a while. Risk reduction and management comes in many ways, such as the update to the federal flood risk management standard, part of Executive Order 13690 signed by former President Obama in 2015, which requires federal agencies to ensure that public infrastructure (including public housing, hospitals, and water treatment plants) is more resilient to flooding.

Why prepare? It makes sound fiscal sense and saves taxpayers money in the long run. The National Institute of Building Sciences found in 2005 that, on average, every dollar invested in hazard mitigation results in $4 saved in recovery costs. Yet from 2005 to 2014, the federal government spent $277.6 billion on disaster assistance, while FEMA designated less than $600 million toward its primary pre-disaster mitigation program. Increasing preparedness investments before disaster strikes is just plain smart: it saves not only money but also disruption to the economy and human lives.
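
To put those numbers side by side, here is a back-of-the-envelope sketch. It uses only the figures quoted above; the comparison itself is mine:

```python
# Figures quoted in the post, in dollars
disaster_assistance_2005_2014 = 277.6e9  # federal disaster assistance, 2005-2014
pre_disaster_mitigation = 600e6          # FEMA pre-disaster mitigation (stated upper bound)
savings_per_dollar = 4                   # NIBS average: $4 saved per $1 invested

# Recovery spending outpaced mitigation funding by more than 400 to 1
spending_ratio = disaster_assistance_2005_2014 / pre_disaster_mitigation
print(f"recovery-to-mitigation spending ratio: more than {spending_ratio:.0f}:1")
print(f"expected recovery savings per mitigation dollar: ${savings_per_dollar}")
```

Since $600 million is an upper bound on the mitigation spending, the true ratio is at least this large.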

Risk reduction requires science, policies, and financial resources

Budget cuts proposed by the Trump administration could seriously undermine disaster mitigation. Not only does preparedness require financial investment; the planning must also be based on solid science. Agencies like FEMA, NOAA, EPA, and HUD – all at risk of diminished budgets – provide the science and evidence-based decision-making that guide disaster risk reduction programs and policies. Without the wealth of scientific information streaming regularly from NOAA and EPA science programs and monitoring, FEMA and HUD could find themselves at a loss when deciding next steps. The American people will pay the price twice: in their pockets and in their everyday lives.

Last but not least, the rollback of climate safeguards through executive order will have significant impacts not only on disaster preparedness and risk reduction but also on the necessary global warming mitigation efforts. The logic is straightforward: if we do not reduce emissions, the rate of warming will be higher, and with more warming comes a broader range of possible risks, which will require more resources and more preparedness. One thing we do not need is to have to figure out even more protections – we are struggling enough as it is.

We can effect change – one step at a time!

No matter how discouraging the news is, we can keep up the fight everywhere we are and everywhere we go. Everyone has a stake in this fight, scientists and non-scientists alike. We can change things, for the people are truly powerful, and they are learning about their power. A movement has started that will not be stopped, and it will hold our nation’s leaders accountable.

Please join us on April 29 for the People’s Climate Movement in Washington DC. You too can – and should – be part of the movement.


President Trump’s New Anti-Climate Executive Order Threatens Our National Security

UCS Blog - The Equation (text only) -

Yesterday, President Trump signed the Presidential Executive Order on “Promoting Energy Independence and Economic Growth” which, as my colleague states, represents an all-out attack on climate solutions.

While policy watchers had been expecting the Administration’s attack on climate policy for some time, what many of us are still amazed at is that President Trump’s anti-climate science and policy stance flies in the face of the American people, who believe global warming is happening (70%), is caused by humans (53%), is harming people (51%), and will harm future generations (70%).

Even more amazing is the Administration’s failure to understand the climate connection when it comes to our national security.

The Military has connected the climate and security dots

During his March 2 visit to Newport News Shipbuilding in Hampton Roads, President Trump was flanked by Defense Secretary Mattis and two Republican congressmen, Reps. Scott Taylor of Virginia Beach and Rob Wittman.

Aboard the USS Gerald R. Ford, he “lauded the Navy and the shipyard’s workforce” and underscored that he will be strong on defense (pledging to increase defense spending).  In a region that is a sea level rise hotspot, where municipalities and military bases alike are taking steps to cope with rising seas, the obvious, glaring omission from his speech was climate change.

While the omission was jaw-dropping, it’s not surprising, as he continues to fail to connect the climate and security dots. He has called climate change a hoax, and as many people have pointed out, even President Trump’s own “Winter White House,” aka Mar-a-Lago, located in what can be argued is a hot spot of sea level rise, hasn’t helped him make this connection. And finally, neither the military’s long history of recognizing climate change as a threat nor President Trump’s own Secretary of Defense, James Mattis, has helped.  In Mattis’ unpublished testimony, he swiftly connected the dots on climate change and national security, stating that climate change is a national security issue, that it requires a whole-of-government approach, and that the DoD needs resources to adequately prepare for these changes.

President Trump’s ‘Energy Independence’ executive order stands in stark contrast to the military’s record on climate change and ties the hands of the Department of Defense (DoD) in ensuring our nation’s readiness in the face of climate change. The executive order revokes the 2016 memorandum on Climate Change and National Security, which established an agency-wide working group to set priorities and recommendations on addressing climate change impacts to our national security.

Alice Hill, who served in the Obama administration as the Special Assistant to the President and Senior Director for Resilience Policy for the National Security Council (and who led the working group) underscores that in addition to the Department of Defense, the Department of Homeland Security, the Department of State, and the National Intelligence Council (NIC) all recognize that climate change is a threat and that we are already feeling the impacts.

The NIC’s September 2016 report, entitled Implications for US National Security of Anticipated Climate Change, finds six key pathways in which climate change will threaten national security:


ONE:  threatens the stability of countries.

TWO:  heightens social and political tensions.

THREE:  adversely affects food prices and availability.

FOUR:  increases risks to human health.

FIVE:  negatively impacts investments and economic competitiveness.

SIX:  increases risks of abrupt climate change or “climate discontinuities and secondary surprises”.


Indeed, this is all too true and we have an abundance of evidence on these six pathways:

  • On the stability of countries: the risk of armed-conflict outbreak is increasing, and social and political tensions are fueling armed conflicts around the world; North and Central Africa and Central Asia are particularly prone (also see the G20 Policy Brief on Climate and Displacement).
  • On heightened social and political tensions: we know that climate change is swelling the numbers of displaced persons (in fact, one person every second is displaced by climate change).
  • On food prices and availability: studies show that climate change is already affecting food prices and scarcity. Take, for instance, northeastern Syria from 2007 to 2010, when the worst drought on record caused crop failures and mass migration, all of which were contributing factors that led to the civil war in 2011 (see this link for more evidence on climate change causing Syrian instability).
  • On risks to human health: the American Public Health Association has dubbed 2017 as the year of climate change and health and the Medical Society Consortium recently released their “MEDICAL ALERT! Climate Change Is Harming Our Health” report indicating among many other facts that children bear a greater burden of climate-associated health impacts.
  • On investments and economic competitiveness: Schroders climate change survey found that climate change represents a significant threat to the global economy in the current century and will have an inflationary impact on the world economy.
  • On abrupt climate change: the journal Atmospheric Chemistry and Physics published a paper by James Hansen and many others, who warn that current climate mitigation targets don’t go far enough and will actually lead to a more dangerous climate, with stronger storms and rising sea levels, among other impacts (click here for a “101” on abrupt climate change).



Our recent study, US Military on the Frontlines of Rising Seas, drives home how our own military installations are at risk of being mostly underwater in the near future due to sea level rise.



Moreover, the Center for Climate and Security’s Military Expert Panel Report: Sea Level Rise and the U.S. Military’s Mission  finds that “sea level rise risks to coastal military installations will present serious risks to military readiness, operations and strategy.”


While President Trump has hit the “Ctrl-Alt-Delete” on climate science and climate policy, he can’t delete the stark evidence of the impacts of climate change that are happening now (more and longer-lasting droughts and wildfires, more frequent and heavier downpours, more severe floods, to name a few) and that billion-dollar disasters are increasing across the nation and worldwide.

Nor can President Trump press the Ctrl-Alt-Delete buttons on the hard-wired role of the military to ensure its readiness to any threat, including climate change.



