Combined UCS Blogs

Protect the Science, Protect the Species

UCS Blog - The Equation (text only) -

As we face irreversible destruction of species and their habitats due to threats from habitat loss and fragmentation, overharvesting, pollution, climate change, and invasive species, lawmakers indicate they intend to attack the Endangered Species Act again. Under the current administration, we’ve already witnessed the introduction of several pieces of legislation intended to weaken the Endangered Species Act or specific species protections. Most recently, Senator Barrasso (R-WY), chair of the Senate Committee on Environment and Public Works, announced interest in introducing legislation sometime this summer to overhaul the Act (here and here), despite the ESA’s history of overwhelming support from voters. These potential modifications would shift the authority to implement the Endangered Species Act from scientists and wildlife managers to politicians.

Science is a constitutive element of the Endangered Species Act, the emergency care program for wildlife. It is the foundation for listing and delisting threatened and endangered species, developing recovery plans for the continued survival of listed species, and taking preventative conservation efforts. This is both a boon and a curse. Since the Endangered Species Act relies on the best available science to make conservation decisions, it is highly successful—over 99% of the species protected under the Act have dodged extinction—yet this reliance on science also makes the law highly susceptible to outside interference from political interests.

Here I am on a nesting beach in Barbuda, monitoring critically endangered hawksbill sea turtles, one of over a thousand species currently listed under the ESA.

The Endangered Species Act has withstood a barrage of politically motivated attacks over the years, from hidden policy riders to blatant editing of scientific content in federal documents. The notoriety of the sage grouse, for example, owes more to its status as one of the most politically contentious species listed under the ESA than to its ostentatious courting rituals. The sage grouse issue illustrates what can happen when decisions to protect a species prioritize politics.

The implications of attacks on the science-based Endangered Species Act reflect broader attacks on science in general. Science should have priority influence on our policy decisions; otherwise regulated industries and politics will decide critical aspects of our everyday lives—like the safety and quality of our food, air, and water, and whether or not our nation’s biodiversity is protected. As scientists, we must continue to advance the role of science in public policy as a whole, and ensure that public health, worker safety, and environmental protections rely on the best independent scientific and technical information available.

My generation has been accused of ruining everything from napkins to handshakes. But we should recognize that we have a responsibility to protect imperiled species from permanent extinction so that future generations can experience animals like the bald eagle in the wild. Ensuring that this responsibility is informed by the best available science provided by biologists and other conservation experts is critical. That’s why as a scientific community, we need to make certain the decisions to protect wildlife at risk of extinction are grounded in science. Scientists, not Congress, should be informing decisions about which species deserve protection under the Endangered Species Act. We don’t need to “fix” something that already works. Please join me in urging Congress not to support any legislation to rewrite or modify the Endangered Species Act—our most successful conservation law.

P.S. If you need additional motivation to sign the letter, just look at this pair of gray wolf pups! Why would someone be against protecting endangered species?

Drowning in a Sea of Sufficient Ozone Research: An Open Letter to EPA Administrator Scott Pruitt

UCS Blog - The Equation (text only) -

Dear Administrator Pruitt,

When you decided this week to delay the 2015 ozone rule by one year, citing “insufficient information,” did you think about the science of ground-level ozone? Did you look at the data showing that ozone pollution is widespread across the country? And importantly, did you look at the detrimental health impacts that ozone pollution has for Americans?

As the law requires, the ozone standard must provide an adequate margin of safety for the most vulnerable populations—including the elderly, children, and those with lung diseases. I know that you are familiar with—actually hostile to—the ozone rule and its basis in the Clean Air Act. In fact, you’ve spent years fighting the (strong) legal and scientific basis for ozone protections and other environmental safeguards.

I’m sure you remember suing the EPA—alongside fossil fuel industry co-parties who gave to your political action committees—over the ozone rule, a challenge you have said was “based, in part, on concerns that EPA has not adequately assessed the available science.” You might even remember vowing, back when the rule had just been proposed, to “challenge the EPA’s misguided and unlawful overreach” in part because the “EPA has not yet articulated how the rule will further improve public health.”

So, Administrator Pruitt, I have to ask: what are your definitions of “insufficient,” “adequate,” and “articulated”? Because I have looked at the science, and I can tell you that we have more than sufficient information to act on it, contrary to what you claim.

As an air quality scientist, I’ve studied the data on ozone and health. I submitted my own opinion on the ozone rule during its official comment period. I can assure you we are standing on solid ground when it comes to the ozone rule. For your quick reference, here’s a rundown of just how incredibly sufficient the science is on the public health threat of ground-level ozone pollution.

1,251 pages of scientific assessment

As part of the update to the ozone standard, EPA conducts the Integrated Science Assessment (ISA). The 1,251-page document is produced by EPA scientists and surveys the current scientific literature on ozone (including one of my own papers). The peer-reviewed document finds several “causal” and “likely causal” relationships between ozone pollution and health effects. Of note, the report identifies “a very large amount of evidence spanning several decades [that] supports a relationship between exposure to O3 and a broad range of respiratory effects.” In addition, the report finds associations between ozone and short-term cardiovascular effects and total mortality, along with long-term respiratory effects.

Science advisers agree

As I’ve written before, the Clean Air Science Advisory Committee (CASAC), or the group of external independent subject-matter experts that EPA uses to provide scientific recommendations for the standard, came to the conclusion that the standard should be tightened. In its letter to the EPA administrator, the science advisors recommended a range of 60-70 ppb for the standard. In addition, the committee concluded that although 70 ppb was included in its recommended range, such a standard would not provide an “adequate margin of safety,” as the Clean Air Act mandates. The committee went on to note that with a 70-ppb standard there is “substantial scientific evidence of adverse effects … including decrease in lung function, increase in respiratory symptoms, and increase in airway inflammation.”

More scientists agree

The Ozone Review Panel was an additional set of external independent experts that worked with CASAC to discuss the state of the science and review the ISA. These experts were brought in to provide additional expertise specific to ozone. This panel also largely concurred with lowering the standard to something in the 60 to 70 ppb range, noting that a standard below 70 ppb would be more protective of public health.

None of this is new

It’s worth reiterating that the above voices recommending a lower standard are joining those from many years previous. In fact, CASAC first proposed that the ozone standard be in the 60 to 70 ppb range back in 2007. States have known—and have been preparing for a tighter ozone standard—for a very long time. Despite your suggestion otherwise, states have had ample opportunity to prepare for this standard that was finalized nearly two years ago.

The bottom line is that the law requires setting the ozone standard based on science and science alone. The administration must set a standard that is protective of public health with an adequate margin of safety and cannot legally consider economic arguments.

Do you feel up to this task, Administrator Pruitt? Are you able to do your job of protecting the public from ozone threats? And importantly, can you carry out the mission of the EPA of protecting the public health and environment? My colleague Andrew Rosenberg raised concerns before you were even appointed and this decision (among others) proves those concerns were well placed.

If you’d like more information, you can read more of my posts on the EPA update to the ozone standard here, here, and here. And I know many air quality scientists who would be happy to tell you more about what they know. I assure you, Administrator Pruitt, we are drowning in a sea of sufficient science on ozone, if only you’ll listen to the scientists.

Sincerely,

Gretchen Goldman

 

Wind Keeps Creating Jobs, Even as We Pull Out of Paris

UCS Blog - The Equation (text only) -

President Trump announced last week that he was pulling the United States out of the Paris Climate Agreement because, he said, it would impose “draconian financial and economic burdens” on the US. This classic fossil fuel industry rhetoric of pitting the economy against the environment (in this case the climate and future of our planet) has been proven time and time again to be a false choice. The latest, impressive US wind industry results show that more clearly than ever.

Numerous cost-effective climate solutions are available that can create jobs and reduce emissions at the same time to help meet the Paris Agreement. In fact, solutions like improving the energy efficiency of our homes, offices, factories and cars, and investing in solar and wind power can take us most of the way there and actually save consumers money.

When you include the public health and environmental benefits of clean energy, the savings and economic benefits are even larger.

Wind power is working for America

For wind power in particular, recent data from the American Wind Energy Association’s (AWEA) 2016 Annual Market Report show how wind is creating high quality jobs and important economic benefits to rural areas, while reducing emissions at the same time.

US wind capacity has more than doubled since 2010, accounting for nearly one-third of all new electric generating capacity since 2007. Wind power surpassed hydropower in 2016 to become the number one source of renewable electric generating capacity in the country. The wind industry installed more than 8,200 megawatts (MW) of new capacity in 2016, bringing the total US installed capacity to 82,000 MW. Wind power generated 5.5 percent of total US electricity generation in 2016, the equivalent of meeting the entire electricity needs of 24 million average American homes.

Wind industry jobs are growing fast. The US wind industry added nearly 15,000 new jobs in 2016, reaching a total of 102,500 full-time equivalent jobs in all 50 states, up from 50,500 jobs in 2013. Wind power technician is the fastest-growing job in the US, according to the Bureau of Labor Statistics. Texas, the national leader in installed wind capacity, also has the most wind-related jobs with more than 22,000, followed by Iowa, Oklahoma, Colorado, and Kansas, each having 5,000 to 9,000 wind jobs (see map).

Source: AWEA annual market report, year-ending 2016.

Domestic wind manufacturing is expanding. Wind power supports 25,000 US manufacturing jobs at more than 500 facilities located in 43 states. US wind manufacturing increased 17 percent in 2016, with 3 new factories opening and 5 existing factories expanding production. Ohio is the leading state for wind manufacturing with more than 60 facilities, followed by Texas (40), Illinois (35), North Carolina (27), Michigan, Pennsylvania and Wisconsin (26 each).

While manufacturing jobs are concentrated in the Rust Belt, Colorado, Iowa, and California are also national leaders manufacturing major wind turbine components, and the Southeast is a major wind manufacturing hub with more than 100 factories. US facilities produced 50-85 percent of the major wind turbine components installed in the United States in 2015, up from 20 percent in 2007, according to Lawrence Berkeley National Lab (LBNL).

Investing in rural communities. The wind industry invested $14.1 billion in the US economy in 2016, and $143 billion over the past decade, with most of this flowing to rural areas where the wind projects are located. Wind energy also provided an estimated $245 million annually in lease payments to farmers, ranchers and other landowners in 2016, with more than $175 million occurring in low-income counties. AWEA estimates that 71 percent of all wind projects installed through 2016 are located in low-income rural counties.

And now for the kicker…

Wind power is providing major economic benefits to President Trump’s base. AWEA estimates that 88 percent of the wind power added in 2016 was built in states that voted for President Trump. In addition, 86 percent of total installed wind capacity in the US and 60 percent of wind-related manufacturing facilities are located in Republican districts.

Source: AWEA annual market report, year-ending 2016.

Wind power is affordable for consumers. The cost of wind power has fallen 66 percent since 2009, making renewable energy more affordable to utilities and consumers. A 2016 NREL and LBNL analysis quantifying the benefits of increasing renewable energy use to meet existing state renewable standards found that the health and environmental benefits from reducing carbon emissions and other air pollutants were about three times higher than the cost of the production tax credit (PTC).

Wind power is reducing emissions. AWEA estimates that existing wind projects avoided nearly 159 million metric tons of carbon dioxide (CO2) emissions in 2016, equivalent to 9 percent of total power sector emissions, as well as 393 million pounds of SO2 and 243 million pounds of NOx emissions.

More wind development, jobs, and emission reductions are on the way

And there’s lots more to come. Wind development will continue over the next few years due to the recent 5-year extension of the federal tax credits, state renewable electricity standards, and continued cost reductions. Studies by NREL, EIA, and UCS project that the tax credit extensions will drive 29,000 to 59,000 MW of additional wind capacity in the US by 2020.

Similarly, a study by Navigant Consulting projected 35,000 MW of new wind capacity will be installed in the US between 2017 and 2020, increasing total wind-related jobs to 248,000 by 2020 and injecting $85 billion into the US economy. They also found that each wind turbine creates 44 years of full-time employment over its lifetime.

When combined with additional deployment of solar, NREL found that the federal tax credit extension would result in a cumulative net reduction of 540 to 1,420 million metric tons (MMT) of CO2 emissions between 2016 and 2030, depending on projected natural gas prices.

Studies by EPA and UCS also show that the Clean Power Plan (CPP)—a key policy for achieving the US Paris commitments—would continue to drive wind and solar development and emission reductions through 2030, with the public health and environmental benefits greatly exceeding the costs.

Backing away from Paris and the CPP could actually hurt the US economy

All these amazing facts show that President Trump is wrong to ignore the economic benefits of wind and other clean energy options for the US, and that’s a real shame.

Market forces and continued cost reductions will drive more clean energy development in the US in the near-term. However, countries like China and India are also making significant investments in renewable energy as a key strategy for reducing emissions under the Paris Agreement.

For America to maintain its leadership position in the global clean energy race, we need strong long-term climate and clean energy policies like the Paris Agreement and the Clean Power Plan. Our country will be stronger for it, not weaker.

Increase in Cancer Risk for Japanese Workers Accidentally Exposed to Plutonium

UCS Blog - All Things Nuclear (text only) -

According to news reports, five workers were accidentally exposed to high levels of radiation at the Oarai nuclear research and development center in Tokai-mura, Japan on June 6th. The Japan Atomic Energy Agency, the operator of the facility, reported that five workers inhaled plutonium and americium that was released from a storage container that the workers had opened. The radioactive materials were contained in two plastic bags, but they had apparently ripped.

We wish to express our sympathy for the victims of this accident.

This incident is a reminder of the extremely hazardous nature of these materials, especially when they are inhaled, and illustrates why they require such stringent procedures when they are stored and processed.

According to the earliest reports, it was estimated that one worker had inhaled 22,000 becquerels (Bq) of plutonium-239, and 220 Bq of americium-241. (One becquerel corresponds to one radioactive decay per second.) The others inhaled between 2,200 and 14,000 Bq of plutonium-239 and quantities of americium-241 similar to that of the first worker.

More recent reports have stated that the amount of plutonium inhaled by the most highly exposed worker is now estimated to be 360,000 Bq, and that the 22,000 Bq measurement in the lungs was made 10 hours after the event occurred. Apparently, the plutonium that remains in the body decreases rapidly during the first hours after exposure, as a fraction of the quantity initially inhaled is expelled through respiration. But there are large uncertainties.

The mass equivalent of 360,000 Bq of Pu-239 is about 150 micrograms. It is commonly heard that plutonium is so radiotoxic that inhaling only one microgram will cause cancer with essentially one hundred percent certainty. This is not far off the mark for certain isotopes of plutonium, like Pu-238, but Pu-239 decays more slowly, so it is less toxic per gram.  The actual level of harm also depends on a number of other factors. Estimating the health impacts of these exposures in the absence of more information is tricky, because those impacts depend on the exact composition of the radioactive materials, their chemical forms, and the sizes of the particles that were inhaled. Smaller particles become more deeply lodged in the lungs and are harder to clear by coughing. And more soluble compounds will dissolve more readily in the bloodstream and be transported from the lungs to other organs, resulting in exposure of more of the body to radiation. However, it is possible to make a rough estimate.
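
A minimal arithmetic sketch of that conversion is below for readers who want to check it. The half-life and molar mass of Pu-239 used here are standard reference values, not figures taken from the reports on this incident, so treat the result as approximate.

```python
# Rough check of the activity-to-mass conversion quoted above.
# Assumed reference values (not from the incident reports):
#   Pu-239 half-life ~24,110 years, molar mass ~239 g/mol.
import math

AVOGADRO = 6.022e23          # atoms per mole
SECONDS_PER_YEAR = 3.156e7
HALF_LIFE_S = 24_110 * SECONDS_PER_YEAR
MOLAR_MASS_G = 239.0         # grams per mole

decay_constant = math.log(2) / HALF_LIFE_S                     # decays per atom per second
specific_activity = decay_constant * AVOGADRO / MOLAR_MASS_G   # Bq per gram, ~2.3e9

intake_bq = 360_000          # revised estimate for the most highly exposed worker
mass_micrograms = intake_bq / specific_activity * 1e6

print(f"Specific activity of Pu-239: {specific_activity:.2e} Bq/g")
print(f"Mass equivalent of {intake_bq:,} Bq: {mass_micrograms:.0f} micrograms")  # ~157, i.e. "about 150"
```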

Using Department of Energy data, the inhalation of 360,000 Bq of Pu-239 would result in a whole-body radiation dose to an average adult, accumulated over a 50-year period, of between 580 rem and nearly 4,300 rem, depending on the solubility of the compounds inhaled. The material was most likely an oxide, which is relatively insoluble, corresponding to the lower bound of the estimate. But without further information on the material form, the best estimate would be around 1,800 rem.
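
The structure of that estimate is simply intake multiplied by a solubility-dependent dose coefficient. In the sketch below the coefficients are backed out of the bounds just quoted (580 to roughly 4,300 rem for 360,000 Bq); they fall in the same range as published inhalation dose coefficients for Pu-239, but they are illustrative stand-ins rather than values read from a particular DOE or ICRP table.

```python
# Committed-dose arithmetic: dose = intake (Bq) x dose coefficient (Sv/Bq), converted to rem.
# The coefficients are illustrative values implied by the 580-4,300 rem range quoted above.
REM_PER_SIEVERT = 100

intake_bq = 360_000

dose_coefficients_sv_per_bq = {
    "relatively insoluble (oxide-like), lower bound": 1.6e-5,
    "intermediate solubility, rough best estimate":   5.0e-5,
    "more soluble forms, upper bound":                1.2e-4,
}

for form, coefficient in dose_coefficients_sv_per_bq.items():
    dose_rem = intake_bq * coefficient * REM_PER_SIEVERT
    print(f"{form}: ~{dose_rem:,.0f} rem committed over 50 years")
```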

What is the health impact of such a dose? For isotopes such as plutonium-239 or americium-241, which emit relatively large, heavy charged particles known as alpha particles, there is a high likelihood that a dose of around 1000 rem will cause a fatal cancer. This is well below the radiation dose that the most highly exposed worker will receive over a 50-year period. This shows how costly a mistake can be when working with plutonium.

The workers are receiving chelation therapy to try to remove some plutonium from their bloodstream. However, the effectiveness of this therapy is limited at best, especially for insoluble forms, like oxides, that tend to be retained in the lungs.

The workers were exposed when they opened up an old storage can that held materials related to production of fuel from fast reactors. The plutonium facilities at Tokai-mura have been used to produce plutonium-uranium mixed-oxide (MOX) fuel for experimental test reactors, including the Joyo fast reactor, as well as the now-shutdown Monju fast reactor. Americium-241 was present as the result of the decay of the isotope plutonium-241.

I had the opportunity to tour some of these facilities about twenty years ago. MOX fuel fabrication at these facilities was primarily done in gloveboxes through manual means, and we were able to stand next to gloveboxes containing MOX pellets. The gloveboxes represented the only barrier between us and the plutonium they contained. In light of the incident this week, that is a sobering memory.

Federal Science Advisory Boards Under Threat: Why Scientists Should Get Involved

UCS Blog - The Equation (text only) -

Serving on a science advisory board for a federal agency is an interesting and in many ways rewarding experience for a scientist. I have been on a research advisory board for the Navy, on advisory boards assessing the impacts of climate change, and on boards for the National Academy of Sciences, as well as on several international boards. I always find the work challenging, I learn a lot, and I feel like I am making a real contribution to both science and policy-making. So it is an honor to serve, even when providing advice on contentious topics.

In many cases, these positions are uncompensated, except for travel costs, and are “extra duties” for busy scientists. Four to six meetings a year are common.  But since we all care deeply about our work and have gone through years of training and dedicated involvement in research to develop expertise, it feels great to bring that expertise to bear to advise agencies on their research programs, or guide them to use the best scientific evidence in achieving their missions.  I have had the opportunity to summarize the scientific evidence on the impacts of climate change on the oceans, describe how the approach of ecosystem-based management may guide agency activities, and summarize how STEM education can connect to ocean education for multiple agencies, among other topics.

Scientific advisory boards are a direct extension of the independent peer review process that has served science well for decades. So the community should not only embrace the use of these boards, but fiercely advocate for their independence from political influence.

Unfortunately, there are worrying signs that the new Administration may be on course to politicize the science advisory process. Recently, several members of the EPA’s Board of Scientific Counselors were not reappointed for the conventional second term (usual practice is to rotate off most boards after two terms). At the time, agency political appointees noted that more scientists employed directly by regulated industries should be appointed instead.

At the same time, meetings of advisory boards at the Department of Interior have been put on hold and their fate is unclear as of this writing.  Science advisory boards are in critical danger of being politicized. That is not necessarily unprecedented, but it hasn’t worked out well before, and will not again if allowed to stand.

So how do these boards work? In general, science advisory boards in any federal agency are under the auspices of the Federal Advisory Committee Act or FACA. That means that the process for nominating members for the board must be public, and the meetings of the board must be open to the public. Conflicts of interest for board members must be disclosed, meetings noticed to the public and opportunities for public comment during meetings provided. In general, boards operate on a consensus model, though that isn’t required.

In my time on various boards I have served with academic scientists, some from research institutes, industry and other non-governmental organizations. All the members were appointed for their scientific expertise, and most particularly, not to “represent” any particular interest organization or viewpoint. We always had open discussions about potential real or perceived conflicts of interest and got to know each other so that we could exchange views freely in the best tradition of science. We shared the (often substantial) work, including the writing of reports. Most importantly, the agencies seemed to directly utilize our work. I never felt that the time was wasted or the agency was just going through the motions of peer review just to tick a box on a form.

In the current political rhetoric, the call for “more scientists from industry” and “greater geographical representation” might make for a political talking point but it makes little sense if you want the best advice. The boards shouldn’t be for representing interest groups! States, tribes, industry, non-governmental organizations, and the public have other opportunities for input into the policy process. Those opportunities can be improved, extended, made more balanced (regulated industry plays the dominant role by far), but not by corrupting the process of obtaining science advice.

So what should the science community—and individual scientists—do? First and foremost, get involved! As noted above, nominations to boards, including self-nominations, are a public process. At the moment, there is a call for nominations for nine positions on the EPA Board of Scientific Counselors, and in September there is likely to be a call for as many as 15 positions on the EPA Science Advisory Board. Then there is the EPA Clean Air Science Advisory Committee and many other boards in other agencies across government. The Interior Department needs experts in ecology, endangered species protections, wildlife and fisheries management, climate impacts in diverse habitats, and many more disciplines. Advisors at the Departments of Energy, Defense, Agriculture, and Commerce play a role in shaping those agencies’ large science programs.

Do you think you might be willing and able to take on one of these important tasks? Do you know some other great scientists that could really have an impact? Then put forward your name and curriculum vitae. If scientists don’t stand up, then it will be all the easier to pack the boards with special interests.

So Stand Up For Science—and let’s make sure that science advisory boards really do advise our agencies with independent science.

 

Trump Budget Bares Wholesale Disregard for Environmental Justice Communities, But a New Bill Gives Hope

UCS Blog - The Equation (text only) -

Many low-income communities and communities of color in the U.S. have not always enjoyed the environmental and public health benefits of environmental safeguards. These communities and their advocates have long demanded redress of the unequal environmental burdens they experience, and had been hopeful that the progress made under President Obama’s EPA would continue and improve under a new administration.

But those hopes were recently dashed by President Trump’s proposed EPA budget, which allocates exactly zero dollars and zero cents to Environmental Justice programs within the agency. Where will the EPA’s environmental justice work be housed? There are few details, but the proposed budget says it will be incorporated into the Integrated Environmental Strategy (IES) program, which has provided assistance for environmental and public health initiatives in developing countries and has been supported by USAID. This clearly signals the incoherent position that the Trump administration sees environmental justice as a matter of foreign aid and development policy! Furthermore, because the IES program is housed under the EPA Office of the Administrator, the move brings environmental justice matters closer to the sorts of political manipulation and disregard for science we have seen in Scott Pruitt’s EPA.

The callousness with which the Trump administration treats vulnerable communities on the frontlines of environmental contamination is not surprising, however. In a previous post, I warned of Pruitt’s appointment as ominous for environmental justice communities, and others called out the need for an EPA leader who would protect and strengthen the existing human and environmental health protections. The EPA’s long-time Environmental Justice assistant administrator, Mustafa Ali, recently resigned from his post at the agency after being dismayed at the deep cuts to grants and programs to safeguard our most vulnerable populations.

But among all of these recent actions that do not bode well for the environmental justice community, I am relieved to see that Representative Pramila Jayapal is standing up for vulnerable communities. Rep. Jayapal has just introduced a bill to establish an office of Environmental Justice at the EPA, and to create a small grants program.

How can communities benefit from an Office of Environmental Justice? Small grants to environmental justice communities foster the development of community-based partnerships that can improve environmental conditions in those communities, in areas like clean water and air, land revitalization, and environmental health. For example, a $25,000 grant was awarded in 2010 to a community learning center in Hawaii to educate the community on the public health and climate change challenges affecting the local community. This kind of program is essential to increase the capacity of low-income and minority communities to “create and implement local solutions to environmental justice concerns where they live,” and to reduce the environmental burdens to which they are disproportionately exposed.

We thank Rep. Jayapal for introducing legislation that will continue to benefit climate-vulnerable communities. The EPA itself knows of the benefits of this kind of program, as it keeps track of federal collaboration with environmental justice communities. That is great testimony to the multitude of fruitful partnerships that environmental justice communities and the agency had engaged in until recently. We need an EPA that values the health of people and the environment we all live in. Rep. Jayapal’s bill is a step in making sure our government fulfills its constitutional obligation to protect us all.

An Insider’s View on the Value of Federal Research

UCS Blog - The Equation (text only) -

Not long after receiving my doctorate in biochemistry I took a research position with the Agricultural Research Service (ARS), the main research arm of the U.S. Department of Agriculture (USDA). Prior to retiring in 2014 I had spent my entire career, 33 years, with ARS. I had a chance to see federal research from within the system. Contrary to what you may have heard, it’s been my experience that federal research is solutions-oriented, transparent, and nonpolitical.  

Mike and his technician, Karen Wagner, developing a new method for biodiesel synthesis. Photo: Agricultural Research Service, USDA.

Among the key aspects of that system were the following, which I believe pertain to federal research in general (exclusive, in some cases, of defense-related work):

  • It was problem-solving in nature, with research goals based in the country’s needs. These goals, for example, could be safer or more nutritious food, improved soil health, new uses for crops produced in excess of current needs, or any of a myriad of other topics.
  • There was daylight everywhere: Programs, goals, and outcomes were clearly published and publicized.
  • The work was not conducted to advance the sales of any commercial product, as some work in the private sector might be. It was problem, not profit, oriented.
  • The process had integrity and autonomy: Our results and conclusions were not dictated to us by management or the Administration. During my career I published over 120 articles, including approximately 80 research papers in peer-reviewed scientific journals, a dozen book chapters, and half a dozen U.S. patents. I gave over 100 oral presentations describing the work. Not one word that I authored was dictated to me by management. I am not aware of any colleague for whom that was not also true.
  • We were allowed and encouraged to patent any invention that we made. Patents were licensable under terms that were designed to aid the flow of technology to the private sector rather than to generate large sums of money for the inventor or the government.
  • We had full professional autonomy and were encouraged to interact with all parties (other than individuals and organizations from state sponsors of terrorism) as necessary to advance the work and disseminate its results. Among our partners were citizens as well as peers in academic, private sector or federal research, be they domestic or international, large or small firms. Large companies often have their own dedicated research and development teams, serving their interests. I came to see that in many ways we were the Research and Development team for the smaller firms and young industries – startups or small operations lacking the funds and staff to do dedicated research.  We collaborated with all comers, irrespective of size.
  • Research programs were up to 5 years in length, and continued beyond that if such could be justified. This led to the kind of long term, higher risk type of work that is in some cases needed and in many cases rare these days.
  • In cases of ‘crisis’ – some incident that needed a rapid research response (e.g., an outbreak of a new plant disease or a food poisoning incident) – researchers were detailed into that area to assist in quickly developing appropriate responses to the threat.

Aerial shot of the Eastern Regional Research Center, USDA, near Philadelphia. Photo: Agricultural Research Service, USDA.

I spent my career at the Eastern Regional Research Center near Philadelphia, one of the ARS ‘Utilization labs’ that were built in the late 1930s as part of a major effort to develop new uses for the crops produced by America’s farmers. Out of this work have come thousands of research publications and patents, which developed or assisted in developing a host of new products and processes including dehydrated mashed potatoes (and hence Pringles!); soy ink; permanent press cotton fabric; frozen foods with increased retention of flavor, color and texture; Lact-Aid; and more efficient processes for the production of biofuels.

Filling up a truck on biodiesel. Photo: Spencer Thomas/CC BY 2.0 (Flickr)

The increased market share for biodiesel alone is a success for federal research. Beginning in the early 1990s, the desire to promote energy independence in this country and to provide new markets for our crops led researchers to begin exploring the production of what became known as ‘biodiesel’. Made from vegetable oils and animal fats, biodiesel can replace petroleum-derived diesel fuel while burning cleaner and thus reducing the emission of pollutants.  It was an obvious new outlet for U.S. lipids, and so my group and others in ARS began investigating various aspects related to its production and use.  Today biodiesel is an accepted fuel used throughout the country (and world), powering vehicles and generators and heating homes.  It is a true success story, one in which my lab, other USDA labs, and many other researchers played a part.

Based on my experiences, I see federal research as extremely valuable. As I have outlined above, it is dedicated to improving the quality of life of all Americans, and is conducted within a framework designed to maximize its integrity, reliability, impact and availability. It is also very efficacious: I am aware of two studies conducted during my career that assessed the economic impact of ARS research. These analyses determined that every dollar invested yielded between 14 and 20 dollars in benefits for the country. That’s a strong statement of the value of the work, a measure of what will be lost to all of us if programs are dropped, and a return on investment that I’ll sign up for any day.

 

Following receipt of a B.S. in Biochemistry from the University of Minnesota and a Ph.D. in Biochemistry from the U. of Wisconsin, Mike Haas went on to a career with the Agricultural Research Service of the U. S. Department of Agriculture.  During his over 30 years with ARS-USDA his research ranged from sophisticated studies of applied enzymology to the development of the simplest of methods for the production of biodiesel, a renewable fuel produced from U.S. farm products that both replaces and burns cleaner than petroleum diesel fuel.   During his research career Mike also served as an officer in relevant professional societies and as Associate Editor of a scientific journal.   Now retired, he serves as a student mentor with the National Biodiesel Board and, after 40 years in labs and offices, enjoys a multitude of outdoor activities.    

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Does Trump’s Paris Decision Make Him the Most Dangerous Man in the World?

UCS Blog - The Equation (text only) -

In case there was any doubt, President Trump’s June 1 announcement that he would take steps to withdraw the United States from participation in the Paris Accord confirms that our president is the most dangerous man in the world.  It isn’t just the substance of his decision (which will put the US into an exceedingly small club of nations—currently consisting only of Syria and Nicaragua—who have not signed on to the Accord) that is dangerous. It’s his disregard of the established science, his misunderstanding of the Accord’s structure, his misrepresentation of the consequences of both the US’s participation in the Accord and its withdrawal, and his lack of appreciation for the significance of America’s moral authority as the foundation of true greatness, that confirms this ignoble distinction.

Given the Accord’s reliance on voluntary national commitments, any of the President’s domestic policy objectives, however dubious and short sighted, could have been pursued within the Accord’s framework, making his decision to withdraw a gratuitous assault on the global order.

Unlike other possible contenders for the “Most Dangerous Man” title—such as Vladimir Putin, Kim Jong-Un, or Bashar al Assad—whose motives and methods are as transparent as they are reprehensible, it is extremely difficult to ascertain any rational basis for Mr. Trump’s actions. Without articulating a “reality-based” justification for withdrawing from the Paris Accord, Mr. Trump has ignored the wishes of a large majority of Americans (including “his generals”, the overwhelming majority of respected business leaders and even, reportedly, his own daughter) and risked causing incalculable but predictable harm to the health and safety of millions of Americans as well as the political and economic stability of the world.

Who will benefit from the president’s decision? For starters, Vladimir Putin and Kim Jong-Un will benefit from the weakening of the bonds between the U.S. and its historic allies, which (among other things) will make it more difficult to maintain unified sanctions against their violations of international law. Russia, Saudi Arabia, Iran and other petroleum-dependent countries, as well as a handful of American investors in coal assets, may enjoy a near-term benefit from the perception that the inevitable transition to a low-carbon global economy will be slower as a result.

China’s president, Xi Jinping, has already begun to fill the void in global leadership on the world’s biggest challenge that Mr. Trump’s abdication of responsibility has created, and Chinese workers, investors, and business leaders may prove to be the biggest beneficiaries of the estimated $1 trillion in new investments in renewable energy infrastructure predicted over the next five years as the 194 signatories to the Accord implement their pollution reduction targets.

Even before Mr. Trump’s announcement, China had committed to invest $120 billion/year for the next three years in renewable energy, which it believes will create 13 million new jobs and position China to capitalize on the global scientific and political consensus. Rather than China “stealing American jobs”, President Trump is effectively handing millions of jobs in the fastest-growing sector of the energy economy to China.

Many have noted that Mr. Trump’s budget proposal and initiatives on health care and taxes are designed to benefit America’s wealthiest families at the expense of middle-class families, older Americans, low-wage workers, and communities suffering from the consequences of technological change and globalization, including many Trump voters. Unlike those policies, which at least benefit virtually all of America’s most advantaged, withdrawal from the Paris Accord is likely to benefit only a small handful of American investors in coal assets, and given the strong market preference for natural gas and renewable energy alternatives to coal, even those benefits will be short-lived. Contrary to Mr. Trump’s dubious claim that his action will save “millions of US jobs,” the absolute number of coal mining jobs in the US (currently less than 70,000) is unlikely to increase meaningfully, as automation in that industry will undoubtedly continue.

Who is likely to be harmed in the short term?  The 2.6 million men and women working in America’s burgeoning clean energy sector, including 1.2 million people in states that voted for Mr. Trump. Solar jobs alone are growing 17 times faster than the overall U.S. economy. To the extent Mr. Trump’s unnecessary and reckless decision impairs growth of the renewable energy sector in the US, the effect will be to hurt, rather than help, the US economy. And of course, any interruption in the momentum to curb global warming will have enormous adverse consequences on global health and stability over the long term.

Hopefully, the strong opposition to the President’s decision by America’s most successful and respected business leaders and informed voices among state and local governments and academia, when combined with the steadfast assurance by leaders of other global powers of their continued commitment to abide by the Paris Accord, will suffice to mitigate the worst consequences of his dangerously ill-considered act.  While it is impossible to quickly repair the damage to the reputation and moral authority of the US inflicted by the President’s decision, the prompt and virtually universal condemnation of his decision represents an encouraging sign that the global “immune system” is awakening to contain the risks posed by the fact that the Most Dangerous Man in the World has three and a half years left in his current term, and it’s impossible to predict what he might do next.

John H. Steed, J.D., retired as Managing Partner with the international law firm, Paul Hastings LLP. He has 31 years of experience in private practice after earning his degree from Harvard Law School in 1977. He has specialized in corporate law, financial services, and real estate transactions with private law firms in Orlando, Salt Lake City, Atlanta, and Tokyo. He continues to investigate opportunities for collaboration between US and Japanese companies in the fields of renewable energy generation and power storage.

The views expressed in Guest Commentary posts are those of the author alone.

Palo Verde: Running Without a Backup Power Supply

UCS Blog - All Things Nuclear (text only) -

The Arizona Public Service Company’s Palo Verde Generating Station, about 60 miles west of Phoenix, has three Combustion Engineering pressurized water reactors that began operating in the mid-1980s. In the early morning hours of Thursday, December 15, 2016, workers started one of two emergency diesel generators (EDGs) on the Unit 3 reactor for a routine test. The EDGs are the third tier of electrical power to emergency equipment for Unit 3.

When the unit is operating, the source of power is the electricity produced by the main generator (labeled A in Figure 1.) The electricity flows through the Main Transformer to the switchyard and offsite power grid and also flows through the Unit Auxiliary Transformer to in-plant equipment. If the unit is not operating, electrical power flows from the offsite power grid through the Startup Transformer (B) to in-plant equipment. When the main generator is offline and power from the offsite power grid is unavailable, the EDGs (C) step in to provide electrical power to a subset of in-plant equipment—the emergency equipment needed to protect the reactor core and minimize release of radioactivity to the environment. An additional backup power source exists at Palo Verde in the form of gas turbine generators (D) that can supply power to any of the three units.

Fig. 1 (Source: Arizona Public Service Company)

I toured the Palo Verde site on May 11, 2016. The tour included one of the EDG rooms on Unit 2, as shown in Figure 2. Each unit at Palo Verde has two EDGs. The EDG being tested on December 15, 2016, was manufactured in 1981 and was a Cooper Bessemer 20-cylinder V-type turbocharged engine. The engine operated at 600 revolutions per minute with a rated output of 5,500,000 watts.

Fig. 2 (Source: Arizona Public Service Company)

Assuming one of the two EDGs for a unit fails and there are no additional equipment failures, the remaining EDG and the equipment powered by it are sufficient to mitigate any design basis accident (including a loss of coolant accident caused by a broken pipe connected to the reactor vessel) and protect workers and the public from excessive exposure to radiation. Figure 3 shows the major components powered by the Unit 3 EDGs—a High Pressure Safety Injection (HPSI) train, a Low Pressure Safety Injection (LPSI) train, a Containment Spray train, an Essential Cooling Water Pump, an Auxiliary Feedwater Pump, and so on.

Fig. 3 (Source: Arizona Public Service Company Individual Plant Examination)

Because the EDGs are normally in standby mode, the operating license for each unit requires that they be periodically tested to verify they remain ready to save the day should that need arise. At 3:02 am on December 15, 2016, workers started EDG 3B. Workers increased the loading on EDG 3B to about 2,700,000 watts, roughly half load, at 3:46 am per the test procedure.

Ten minutes later, alarms sounded and flashed in the Unit 3 Control Room alerting operators that EDG 3B had automatically stopped running due to low lube oil pressure. A worker in the area notified the control room operators about a large amount of smoke as well as oil on the floor of the EDG room. The operators contacted the onsite fire department, which arrived in the EDG room at 4:06 am. There was no fire ongoing when they arrived, but they remained on scene for about 90 minutes to assist in the response to the event.

Operators declared an Alert, the third most serious in the NRC’s four emergency classifications, at 4:10 am due to a fire or explosion resulting in control room indication of degraded safety system performance. The emergency declaration was terminated at 6:36 am.

Seven weeks later, after the fire had long been out, the oil on the floor long since wiped up, the sharp-edged metal fragments long gone, and any toxic smoke long dissipated, the Nuclear Regulatory Commission (NRC) dispatched a special inspection team to investigate the event and its cause. The NRC dispatched its special inspection team more than a month after it authorized Unit 3 to continue operating for up to 62 days while its blown-up backup power source was repaired. The Unit 3 operating license originally allowed the reactor to operate for only 10 days with one of two EDGs out of service.

Workers at Palo Verde determined that EDG 3B failed because the connecting rod on cylinder 9R failed. It was the fifth time that an EDG of that type at a US nuclear power plant had experienced a connecting rod failure, and the second failure of cylinder 9R on EDG 3B at Palo Verde, which had also failed during a test in 1986.

Examinations in 2017 following the most recent failure traced its root cause back to the first failure. The forces resulting from that failure caused misalignment of the main engine crankshaft. (In this engine, the crankshaft rotates. The crankshaft causes the connecting rods to rise and fall with each rotation, in turn driving the pistons in and out of the cylinders.) The misalignment was very minor—the tolerances are on the order of thousandths of an inch. But this minor misalignment, over hundreds of hours of EDG operation across the ensuing three decades, resulted in high-cycle fatigue failure of the connecting rod.

Workers installed a new crankshaft aligned within the tight tolerances established by the vendor. Workers also installed new connecting rods and repaired the crankcase. After testing the repairs, EDG 3B was returned to service.

NRC Sanctions

The NRC’s special inspection team did not identify any violations contributing to the cause of the EDG failure, in the response to the failure, or in the corrective actions undertaken to remedy the failure.

UCS Perspective

The NRC’s timeline for this event isn’t comforting.

The operating licenses issued by the NRC for the three reactors at Palo Verde allow each unit to continue running for up to 10 days when one of two EDGs is out of service. The Unit 3 EDG that was blown apart on December 15 could not be repaired within 10 days. So, the owner applied to the NRC for permission to operate Unit 3 for up to 21 days with only one EDG. But the EDG could not be repaired within 21 days. So, the owner applied to the NRC for permission to operate Unit 3 for up to 62 days with only one EDG.

The NRC approved both requests, the second on January 4, 2017. More than a month later, on February 6, 2017, the NRC special inspection team arrived onsite to examine what happened and why it happened.

Wouldn’t a prudent safety regulator have asked and answered those questions before allowing a reactor to continue operating for six times as long as permitted by its operating license?

Wouldn’t a prudent safety regulator have made sure that whatever caused EDG 3B to blow itself apart could not also cause EDG 3A to blow itself apart, before allowing a reactor to continue operating for two months with a potential explosion in waiting?

Whether the answers are yes or no, could that prudent regulator please call the NRC and share some of that prudency? The NRC may be many things, but it’ll seldom be accused and never be convicted of excessive prudency.

Where’s a prudent regulator when America needs one?

The Ugly: Post #3 on the NNSA’s FY2018 Budget Request

UCS Blog - All Things Nuclear (text only) -

On Tuesday, May 23, the Trump administration released its Fiscal Year 2018 (FY2018) budget request. I am doing a three-part analysis of the National Nuclear Security Administration’s budget. That agency, a part of the Department of Energy, is responsible for developing and maintaining US nuclear weapons. Previously we focused on The Good and The Bad, and today we have The Ugly.

The Ugly: NNSA’s “New” Warhead a Sign of Things to Come?

The NNSA’s FY2018 budget request includes what might seem to be a relatively innocuous statement:

In February 2017, DOD and NNSA representatives agreed to use the term “IW1” rather than “W78/88-1 LEP” to reflect that IW1 replaces capability rather than extending the life of current stockpile systems.

In other words, rather than extending the life of the W78 and W88 warheads via a life extension program (or LEP), the NNSA will develop the IW1 to “replace” those warheads.

To my mind, that is an admission that the IW1—short for Interoperable Warhead One—is a new nuclear weapon, as UCS has been saying for quite some time.

The Obama administration was loath to admit as much, arguing that the proposed system—combining a primary based on one from an existing warhead and a secondary from another warhead—was not a “new” warhead. That reluctance stemmed from the administration’s declaration in its 2010 Nuclear Posture Review (NPR) that the United States would not develop new nuclear warheads or new military capabilities or new missions for nuclear weapons. Declaring the IW1 a new warhead would destroy that pledge.

That semantic sleight of hand by the Obama team was somewhat ugly: the IW1 is a new warhead. (For a lot more detail on the IW1 and the misguided “3+2 plan” of which it is part, see our report Bad Math on New Nuclear Weapons.)

However, what might be coming from the Trump administration is truly ugly.

The fact that the FY2018 NNSA budget admits the IW1 is a new warhead may be a signal that the Trump team—which is doing its own NPR—will eliminate the Obama pledge not to develop new weapons or pursue new military capabilities and missions.

That change would send a clear message to the rest of the world that the United States believes it needs new types of nuclear weapons and new nuclear capabilities for its security. This would further damage the Nuclear Non-Proliferation Treaty (NPT), which is already fraying because the weapon states are not living up to their commitment to eliminate their nuclear weapons. Deep frustration on the part of the non-nuclear weapon states has led to the current negotiations on a treaty to ban nuclear weapons. New US weapons could also damage our efforts to halt North Korea’s nuclear program and undermine the agreement with Iran that has massively reduced their program to produce fissile materials for nuclear weapons.

Moreover, a likely corollary of withdrawing that pledge would be to pursue a new type of nuclear weapon, or a new capability. Some options have already been suggested:

  1. The Defense Science Board recommended developing weapons with “lower-yield, primary-only options” (because the B61 bomb and the air-launched cruise missile already have low-yield options, this was presumably for missile warheads, though the report does not specify).
  2. The author of the Obama NPR—Jim Miller—and Admiral Sandy Winnefeld (USN, retired) have proposed reviving the submarine-launched nuclear-armed cruise missile that was retired in the Obama NPR.

Those options are contrary to US security interests. Nuclear weapons are the only threat to the survival of the United States. Given that, and because there will not be a winner in a nuclear war, the US goal must be to reduce the role that these weapons play in security policy until they no longer are a threat to our survival. Continuing to invest in new types of nuclear weapons convinces the rest of the world that the United States will never give up its nuclear weapons, and encourages other nuclear-weapon states to respond in ways that will continue to threaten the United States.

Make no mistake, the United States already has incredibly powerful and reliable nuclear weapons that would deter any nuclear attack on it or its allies, and it will for the foreseeable future.

So the idea that the United States should pursue new types of weapons? That is truly ugly.

There Are 68.4 Million Better Places for Solar Panels Than Mr. Trump’s Wall

UCS Blog - The Equation (text only) -

Yesterday President Trump suggested putting solar panels on his infamous border wall to help pay for it (since Mexico certainly won’t). While there are more things wrong with that proposal than I can cover in this space, it’s great to see that President Trump has finally figured out solar panels are cost-effective energy investments, paying for themselves even if you ignore the many environmental benefits. But here are more than 68.4 million better places for President Trump to invest in solar to pay dividends for the American people.

Solar on the roof

The US National Renewable Energy Laboratory (NREL) last year published a fine study of the potential of America’s rooftops to host solar. The researchers analyzed how much solar photovoltaic (PV) capacity we could get overhead, from existing buildings (considering roof orientation, tilt, and shading), and calculated what it would add up to.

One conclusion of that analysis was that “83% of small buildings have a suitable PV installation location,” and that more than a quarter of the total roof area of those buildings could work. NREL is careful to say that that’s the technical potential, not necessarily what would make sense in other regards. But if you take that 83% and apply it to the number of stand-alone, single-family houses, you end up with 68,380,764 (give or take a few million) places to put solar.
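
For a rough sense of where that 68.4 million figure comes from, here is a back-of-the-envelope sketch. The housing count is an assumption implied by the post’s own numbers (68,380,764 suitable rooftops at 83% suitability implies roughly 82 million detached single-family houses); NREL’s report works from its own building-stock data.

    # Back-of-the-envelope check on the 68.4 million rooftops figure.
    # Assumption: about 82.4 million detached single-family houses in the US,
    # a number implied by the post's own figures, not taken from the NREL report.
    single_family_houses = 82_400_000
    suitable_fraction = 0.83  # NREL: 83% of small buildings have a suitable PV location

    suitable_rooftops = single_family_houses * suitable_fraction
    print(f"About {suitable_rooftops:,.0f} single-family rooftops could host solar")
    # Prints roughly 68.4 million, in the neighborhood of the 68,380,764 cited above.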

As it happens, a lot of those sunny rooftops are near our beautiful southern border: in most Texas zip codes, for example, more than 90% of the small buildings might work for solar.

Source: Gagnon et al. 2016

From a technical potential point of view, residential rooftops across the country could meet a big chunk of household electricity needs in a lot of states: More than 90% in a dozen states, and at least 70% in 27 states.

Source: Gagnon et al. 2016

Solar on more roofs

But wait, there’s still more: Note that NREL’s “small buildings” doesn’t just mean detached single-family homes. If we add in duplexes and small apartment buildings, that would mean millions more rooftops for solarizing.

And then there’s plenty of roof space beyond small buildings: commercial, industrial, and institutional roofs. NREL found that “more than 99% of large and medium buildings” have some place that would work for solar (“at least one qualifying roof plane”). And the total rooftop area that would work is much higher than for small buildings (very few trees shading the middle of a big-box store roof…). Their calculations suggest potential on around half of the total roof area of medium buildings, and two-thirds of large ones.

If you take the rooftop potential across the various size buildings (which, unlike walls in the middle of deserts, are already connected to the electricity grid), and compare it even to the total electricity needs in each state, you find that it really adds up (particularly in states and cities that are serious about energy efficiency).

Source: Gagnon et al. 2016

Solar on the ground

Plus, roofs are definitely not the only place suitable for solar. The latest solar stats show that the progress of large-scale solar, done by utilities and others, has been even more impressive than that of the residential and commercial (“non-residential”) sectors.

Source: GTM-SEIA Solar Market Insight, 2016 Year in Review

And large, ground-mounted solar arrays don’t just make sense in fields, farmlands, and deserts. Old landfills or “brownfields”—lands that have been degraded by past industrial activity—can be a great fit for new solar capacity. (The same could be true for solar at old power plant sites, where the plants have shut down but the infrastructure and grid connection are still there.)

Larger arrays can also be the foundation of community solar systems, a way of making solar work for people who can’t or don’t want to do it on the roof.

Solar in reality

So enough of the frivolous flights of folly in trying to use solar’s overwhelming popularity to make a wildly unpopular project slightly less unpopular. A border wall might need solar, but solar certainly doesn’t need a border wall.

Solar is real, and it makes sense. And we already have plenty of places to put it, if President Trump would just put his office and budget to good use for moving American energy forward.

How the Size of Your iPhone Relates to Sea Level Rise

UCS Blog - The Equation (text only) -

Got your phone handy? Over the last month, coastal residents from Hawaii to Rhode Island wielded their smartphones and snapped dozens of shocking photos at high tide showing neighborhoods, parking lots, and public parks underwater. Meanwhile, scientists have published a spate of sobering sea level rise studies. We spend hours cradling our phones in our hands…let’s put them to use for a moment (screens off!) to put the latest sea level rise science into perspective.

How fast is sea level rising?

Hold your phone flat at eye level. If you’ve got an iPhone 6 or 7, it’s about 6 mm thick (Androids are a little thicker, at 7-9 mm). The latest research published in PNAS by Sönke Dangendorf and others, based on both tide gauge measurements and satellite altimeter data, shows that, averaged over the globe, the sea level is rising by just over 6 mm every two years. That amounts to 3.1 mm/yr.

There are a lot of wiggly lines here, but focus in on the solid black line labeled GMSL [this study, all corrections] in panel A. Over the course of the 20th century, the slope of that line–which is essentially what’s shown in panel B–increases. That’s the recent acceleration in the pace of sea level rise. Source: Dangendorf et al. 2017

Is sea level rising faster than it used to?

Yes. Take a look at the home button on your iPhone. Dangendorf’s study shows that over the course of the 20th century, sea level rose by an average of 1.1 mm/year. So every decade during the 20th century, sea level rose by the width of the home button on your phone (1.1 cm wide). At that pace, it would have taken about 6 years for sea level to rise by the thickness of your phone, compared to just two years at today’s rate.
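
If it helps to see the arithmetic behind those phone comparisons, here is a minimal sketch (the phone thickness is approximate, as noted above):

    # How long it takes sea level to rise by one phone thickness at each rate.
    phone_thickness_mm = 6.0   # iPhone 6/7, roughly; Android phones run 7-9 mm
    rate_20th_century = 1.1    # mm per year (Dangendorf et al. 2017)
    rate_today = 3.1           # mm per year

    print(f"20th century: one phone every {phone_thickness_mm / rate_20th_century:.1f} years")
    print(f"Today:        one phone every {phone_thickness_mm / rate_today:.1f} years")
    # Prints about 5.5 years versus about 1.9 years, i.e. roughly 6 vs. 2 years.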

So the 20th century average sea level rise rate was 1.1 mm/yr, but now sea level is rising at 3.1 mm/yr. This means that, in the last 25 years or so, sea level rise has accelerated dramatically. Our appreciation of just how dramatic this acceleration is has been growing over the last few years.

As of the writing of the last IPCC report in 2013, the widely quoted 20th century sea level rise rate was 1.7 mm/yr. Compared to the present day rate of 3.1 mm/year, that implied some recent acceleration. But the latest estimates of 20th century sea level rise are significantly lower, which makes the difference between then (1.1 mm/yr) and now (3.1 mm/yr) starker.

Estimating 20th century sea level rise rates has long been a challenge for earth scientists. For one thing, sea level does not change uniformly around the globe. And tide gauges, which were the primary basis for sea level rise measurements until the early 1990s, aren’t evenly distributed on all coastlines. The farther back in time you go, the more these problems compound because there are fewer tide gauge records to rely on.

Locations of tide gauges around the world. Source: The Global Sea Level Observing System

So over the years, scientists have used a variety of methods to try to account for the spotty nature of tide gauge observations to come up with a single global average. Dangendorf’s new estimates are well aligned with those published by Carling Hay and others in 2015 despite using very different methodologies, which suggests that we’re homing in on the right number for the 20th century, and that we’re experiencing a much faster rise in sea levels than our parents and grandparents did.

Even faster sea level rise in store

The rapid sea level rise we’ve been experiencing for the last quarter century was the impetus for NOAA to revise its baseline estimate for future sea level rise through the year 2100. Back in January, NOAA released a new set of sea level rise projections that are being used for the Fourth National Climate Assessment.

This suite of projections was developed by NOAA scientists for use in the forthcoming Fourth National Climate Assessment. Source: Sweet et al. 2017

So pull out that phone again! The new lowest sea level rise scenario from NOAA projects 0.3 m (1 ft) of rise above 2000 levels by 2100. That’s just over the height of 2 iPhones stacked up end to end. And it represents a 10 cm increase compared to the projections used for the Third NCA report in 2014.

Even if global greenhouse gas emissions were to peak before 2020 and decline thereafter, NOAA scientists report that there’s a 94% chance of sea level exceeding that 0.3 m rise.

Because the most recent research coming out of Antarctica points to a potentially large contribution of Antarctic ice to sea level rise this century, NOAA has also added an extreme sea level rise scenario that projects 2.5 m (about 8 ft) of rise. That’s 18 iPhones stacked up end-to-end, which would reach from the floor to the ceiling in an average room.
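
To check the stacking arithmetic, here is a minimal sketch assuming an iPhone 6/7 length of about 138 mm (the post gives the comparison, not the phone’s length):

    # NOAA's 2100 scenarios expressed in iPhone lengths.
    phone_length_m = 0.138      # assumed iPhone 6/7 length, about 138 mm
    low_scenario_m = 0.3        # NOAA low scenario: 0.3 m (~1 ft) above 2000 levels
    extreme_scenario_m = 2.5    # NOAA extreme scenario: 2.5 m (~8 ft)

    print(f"Low scenario:     {low_scenario_m / phone_length_m:.1f} phones end to end")
    print(f"Extreme scenario: {extreme_scenario_m / phone_length_m:.1f} phones end to end")
    # Prints about 2.2 phones and about 18.1 phones, matching the comparisons above.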

What that all spells is more coastal flooding

Several studies have shown that sea level rise in the coming decades will increase the frequency of “sunny day” flooding of the sort that much of the US experienced last weekend during king tides. But recent research by Sean Vitousek and others highlights just how little sea level rise it takes to cause drastic changes in coastal flooding in other parts of the world.

With just 10 cm of sea level rise—less than the length of your iPhone—coastal flood frequency in the tropics would double. If we were lucky enough to continue on the sea level rise trajectory we’ve been on, that 10 cm rise would take place in about 30 years’ time. But most projections suggest a continued acceleration of sea level rise such that we could reach that 10 cm mark much sooner, and cities like Annapolis, Maryland, and Charleston, South Carolina are already starting to prepare.
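
That “about 30 years” figure is a simple linear extrapolation of the current rate; a quick sketch:

    # Years to reach 10 cm of additional rise if the current rate simply held steady.
    # (Most projections expect the rate to keep accelerating, so this is closer to an
    # upper bound on the time than a forecast.)
    rise_target_mm = 100   # 10 cm
    current_rate = 3.1     # mm per year

    print(f"At today's rate: about {rise_target_mm / current_rate:.0f} years to reach 10 cm")
    # Prints about 32 years, consistent with the roughly 30 years mentioned above.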

The yellow and red areas of the map show places where a 10 cm or less increase in sea level–less than one iPhone length–would double the frequency of coastal flooding. Source: Vitousek et al. 2017

While relatively well-heeled places like Miami Beach have been investing heavily to reduce the recurrent flooding that has long plagued the city, many low-lying tropical island communities have fewer resources to invest in flood mitigation measures. With the US exiting the Paris Agreement and reneging on its pledge of $3 billion to the Green Climate Fund, developing countries will have even fewer resources available to protect themselves from the floods to come. And with the budgets for FEMA, the EPA, and the Department of the Interior in the crosshairs, communities here in the US could be left operating with limited resources as well, as they watch the water rise.

Some politicians are taking note of the US’s coastal flooding problems and introducing legislation that would help.

Sources: Dangendorf et al. 2017; The Global Sea Level Observing System; Sweet et al. 2017; Vitousek et al. 2017

The Ill-logic of Alternative Facts (sic)

UCS Blog - The Equation (text only) -

Philosophers of science are always on the lookout for the logic underlying the successful practices of the scientific community.  For us, that is a window into epistemology more generally, how humans manage to acquire knowledge of nature. The recent surge of “alternative facts,” “fake news,” and claims that accepted science is a “hoax” propagated inside some conspiracy is not just disturbing, but threatens to undermine the hard-won authority of scientific facts. What’s going on, logically speaking, beneath the surface of these attacks?

Webinar Today: Scientific Facts vs. Alternative Facts

How can we understand and respond to “alternative facts” when they are presented as having the same standing as scientific facts? The UCS Center for Science and Democracy joins with the Philosophy of Science Association to invite you to participate in a webinar investigating the differences between scientific facts and so-called alternative facts.


The phrase “alternative facts” was introduced by Kellyanne Conway to describe false claims by Sean Spicer about the number of people who attended Trump’s inauguration. While we might agree with Chuck Todd that “alternative facts are lies,” for a philosopher it is important to understand how they work in order to know how to respond to the challenge they present to legitimate facts. Appeals to “alternative facts” reveal a pattern of reasoning that is in stark contrast to the ways in which scientific facts are supported. What’s the difference?

Science comprises a set of practices that generate our most accurate views of what nature is like. That is why we appeal to scientific results to guide our choices of what materials to use to build a bridge or what drugs to take to treat a disease. Humans have their limitations: our first impressions are often wrong, and our in-house perceptual and cognitive abilities are not as acute or unbiased as what we can get by outsourcing to computers or to microscopes, telescopes, spectroscopes etc. The natural conditions we initially confront may obscure causes and confounding influences, and so science crafts experiments that strip away the clutter to expose the main effects, the most relevant variables, the predictive features.

The justification of the results of science is a community affair, founded on critical examination by replication, peer review, and multiple forms of checking structured by the assumption that any fact, data, explanation, hypothesis or theory might well be false or only an approximation of the truth. Science works because it is rigorous in these ways, and that’s what warrants its authority to speak truth to power (or to wishful thinking, or to non-empirically supported beliefs).

The rigorous practices of the scientific community are founded on the most objective procedures humans can implement. The life history of a scientific fact might begin with a hypothesis, or a hunch, or a new application of a well-accepted theory, but to mature into a fact it must pass through the gauntlet of experiment, replication, critical challenge and scientific community skepticism. The logic of accepting a scientific fact goes as follows: If there is good, reliable evidence for it, then it will be accepted (as long as there is not better evidence for a different claim). Its persistence as an accepted fact is not guaranteed, however, as new challenges must be survived when new data, new ideas, or new technologies suggest refinements or adjustments.

Alternative facts follow a different course. They might also begin as a yet-unsupported hypothesis of how things are—how large a crowd might be, how humans might not be causing climate change. But then the life history looks very different. Rather than appealing to objective means of determining whether the world matches the hypothesis, purveyors of alternative facts instead consult their ideological, economic, or political interests. Non-objective procedures kick in to cherry-pick data, appealing only to what supports the hypothesis and ignoring or debunking data that contradicts it. The critical scrutiny of the scientific community is replaced by the sycophantic agreement of those who share the same ideological, economic, or political interests (e.g., “people are saying…”).

The ill-logic of accepting an alternative fact (sic) goes like this. If the hypothesis conforms to one’s interests, accept it as a fact and barricade it from any impugning evidence. If there is some isolated evidence that supports it, treat that evidence as definitively confirming. If there is evidence that contradicts it, ignore, debunk, or deny that evidence. If others who share the same interests voice support for the hypothesis, treat that community as a justifying consensus that the world is the way that group wants it to be.

In short, alternative-fact logic replaces evidence of how nature is with personal preferences for how I want the world to be. Data from experiment or observation, and survival of critical challenges by replication, meta-analysis and peer review, are replaced by what “fact” would be best to increase profits (smoking isn’t addictive), or reduce the need for regulation (CO2 is not a major cause of climate change), or bolster some ideology (most Syrian refugees are young men).

By misappropriating the language of “fact,” this practice undermines the authority of science to speak for nature. Policies that should answer to the facts are no longer constrained by the non-partisan procedures of testing and critical challenge. Instead they are guided purely by partisan interests.  The acceptance of scientific facts is not determined by how we want the world to be. The acceptance of alternative facts is determined exclusively by those preferences.

The consequences of treating “alternative facts” on a par with scientific facts can be dire. The claim that the measles, mumps, rubella (MMR) vaccine can cause autism was proposed in 1998 by Andrew Wakefield, a UK doctor, reportedly based on faulty analysis and a financial conflict of interest. Wakefield had developed his own measles vaccine and was funded by those suing the producers of MMR. His paper was later retracted and his medical license revoked, but his “alternative fact” continues to be promoted and believed.

Dozens of scientific studies have shown no relationship between MMR and autism, but do show that the vaccine is 93%-97% effective at preventing measles. In the decade prior to the introduction of the vaccine in the US in 1963, millions contracted the disease, and an estimated 400 to 500 people died from measles each year. By 2000, measles was no longer endemic in the US. One study estimates that between 1994 and 2013, 70 million cases of measles and 57,000 deaths were prevented by the vaccine. In recent years there has been a rise in measles in the US, with the majority of cases occurring in unvaccinated individuals. In 2014, 85% of those who got measles had declined vaccination due to religious, philosophical, or personal objections.

People may choose what they want to believe, but they do not get to choose the consequences of those beliefs. Because measles is so highly contagious, it takes 90-95% of a population to be immune to protect those who are vulnerable (too young or medically compromised to be vaccinated). Relying on “alternative facts” about measles vaccines by even a small percentage in a community can have harmful effects on those who cannot choose.
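
The 90-95% figure follows from the standard herd immunity threshold, 1 - 1/R0, where R0 is the number of people one case infects in a fully susceptible population. A minimal sketch, assuming measles’ commonly cited R0 of roughly 12 to 18 (a value not given in the post):

    # Herd immunity threshold: the share of a population that must be immune so that
    # each case infects, on average, fewer than one new person. Threshold = 1 - 1/R0.
    # Assumption: measles R0 of roughly 12-18 (commonly cited; not given in the post).
    for r0 in (12, 18):
        threshold = 1 - 1 / r0
        print(f"R0 = {r0}: about {threshold:.0%} of the population must be immune")
    # Prints about 92% and about 94%, consistent with the 90-95% range cited above.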

By exposing the underlying logic of defenses of “alternative facts,” we can move beyond the standoff (that’s your fact, this is my fact) to a conversation about what counts as evidence and how it contributes to what we should believe about nature. Do you really want the pill you take for hypertension to be the one that most increases profits, rather than the one that is most effective and has the fewest side effects?


Sandra D. Mitchell is professor and chair of the Department of History and Philosophy of Science at the University of Pittsburgh and is the President of the Philosophy of Science Association. She teaches courses on philosophy of biology, the epistemology of experimental practices, morality and medicine, and practices of modeling in science. Her research has focused on the implications of scientific explanations of complex systems for our assumptions about nature, knowledge, and the ways to use knowledge in policy. She is the author of Unsimple Truths: Science, Complexity and Policy (2009).

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

7 States Give Pruitt an “F” in Science, Challenge EPA Over Pesticide That Harms Children

UCS Blog - The Equation (text only) -

Back in March, EPA Administrator and science skeptic Scott Pruitt ignored his agency’s own science when he canceled a planned ban on chlorpyrifos, a well-studied pesticide that has been shown to damage children’s developing brains and make farmworkers sick. But the fight to protect kids and workers from this toxic pesticide isn’t over. In a welcome new twist, the Attorney General of New York and his counterparts in six other states announced today that they have filed an objection with the EPA for its inaction.

Joining New York Attorney General Eric Schneiderman in the legal challenge are the Attorneys General of California, Maine, Maryland, Massachusetts, Washington, and Vermont. They charge that the EPA “failed to make a key safety finding needed to continue to allow levels of chlorpyrifos, a common agricultural pesticide, on fruits and vegetables consumed by the public.  The federal Food, Drug, and Cosmetic Act (Food Act) requires EPA to revoke allowable levels—or ‘tolerances’—for pesticide residues on foods if the Agency is unable to determine that the levels are safe.”

Chlorpyrifos has been studied for decades and increasingly regulated, but it’s still used on a variety of fruits and vegetables—including apples and broccoli—that millions of American moms and dads feed their kids every day. The EPA was all set to ban those last uses due to the pesticide’s ability to damage children’s developing brains, when Pruitt abruptly changed course.

The announcement of the states’ lawsuit comes as the saga of this pesticide continues to grow. Chlorpyrifos reportedly poisoned nearly 50 California farmworkers in an incident near Bakersfield in May.

And in another troubling development last month, Pruitt also put the kibosh on a planned proposal to ensure that pesticides including chlorpyrifos are safely applied. That proposal was supposed to regulate “restricted use pesticides,” defined by the EPA as having the “potential to cause unreasonable adverse effects to the environment and injury to applicators or bystanders without added restrictions.” It would have required workers handling such pesticides—including chlorpyrifos—to be at least 18 years old and to have regular safety training.
