Combined UCS Blogs

Lessons for Fighting the Trump Administration’s Attacks on Science

UCS Blog - The Equation (text only) -

With all the recent headlines about the Trump administration’s attacks on the government scientific enterprise—from dismissing scientists from advisory committees, to hiring untrained or conflicted heads of agencies, to blatant misinformation from administration officials—it can be difficult to think about the solutions. But we must. My new paper, out this week in Conservation Biology, does just that. 

The red-legged frog was one of several species that got caught up in politics during the George W. Bush administration in the US. Several administrations in the US, Canada, and Australia have had issues with political interference in science policymaking. Photo: USFWS

While many of the Trump administration’s attacks on science seem unprecedented, we can draw many lessons from past administrations’ hostility toward science—both in the United States and outside of it.

In “Defending Scientific Integrity in Conservation Policy Processes: Lessons from Canada, Australia, and the United States,” my coauthors and I lay out lessons for advancing scientific integrity in government science policy, with a focus on conservation policy processes.

The paper is being released just ahead of next month’s Ecological Society of America meeting, where I’ll be moderating a panel on current attacks on the Endangered Species Act and how scientists can engage in policy decisions.

Defending scientific integrity in conservation policy

Here are some of the paper’s key findings for what governments should do to advance scientific integrity in decision-making around conservation:

  • Strengthen the policies that grant government scientists the right to speak
  • Guarantee public access to scientific information
  • Strengthen agency culture of scientific integrity
  • Broaden the scope of independent scientific reviews
  • Enhance transparency around conflicts of interest in scientific advice
  • Proactively engage with scientific societies
The “political interference in science” playbook

While many of President Trump’s recent moves have raised concerns about the future, when it comes to the administration’s treatment of science, we must remember that in many ways we’ve been here before—both in the US and outside of it.

In terms of conservation policy, we don’t have to look too far to find past examples of interference in government decision making. Many of you may remember Julie MacDonald, the political appointee from the George W. Bush administration who (among other offenses) tampered with a scientific document supporting an endangered species listing for the Gunnison sage grouse.

This was just one of several examples of political interference in US endangered species policies. Other species that got tied up in politics at the time were the bull trout, right whale, marbled murrelet, trumpeter swan, polar bear, and red-legged frog. Canada, in the Harper administration, and Australia, under the Howard administration, also experienced political interference in government science.

Many of these cases demonstrate the same thing: that political forces can exploit weaknesses in the policy process in order to sideline inconvenient science. A lack of transparency in the process, inappropriate access to scientific documents by political officials, lack of access to government scientists, and lack of collection of or adherence to science advice, for example, can create conditions that make it easier for politics to intrude in what should be science-based decision making.

Learning lessons, finding solutions

During and following the many and varied attacks on government science under the George W. Bush administration, the Union of Concerned Scientists and others got to work developing policy solutions. It was clear that the system had vulnerabilities that allowed such interference to happen. What kinds of policies could have prevented such blatant intrusions of politics into science policy making?

There were lots of issues to address. But we learned a thing or two. Canada and Australia, too, learned from their former leaders who were hostile to science.

The paper lays out some of the solutions found to be common among the three countries. The lessons teach us that while damage can certainly be done when science is silenced, sidelined, or manipulated, there are ways to move forward and policies that can be put in place to prevent such future abuses of power.

The path forward: keeping science in conservation policy

While we might fear what the Trump administration will do when it comes to conservation policy, in many ways we know the playbook. We know what to watch for and where the vulnerabilities are. We also have new protections and a federal workforce that intends to do its job. So let’s continue to think about and advocate for solutions. It is our only hope.

Rising Seas Erode Homes and History in Alaska—Let’s Talk Relocation


Every sourdough tastes unique.

Sure, they all share the same foundational ingredients – water, flour, and sugar – but the wild yeasts and bacteria that ferment the sugar to create that tart flavor are place-specific. A sourdough baguette from San Francisco, with the Pacific Ocean’s salty breeze sneaking into pantries, tastes different from a boule baked at the high altitude of a Denver bakery.

The best sourdough I’ve ever tasted comes from an unassuming bucket tucked underneath Cliff Weyiouanna’s kitchen sink in Shishmaref, Alaska.

The dough is nearly a century old (96 years to be exact), and has been passed down from one generation to the next. It’s rich and tangy in flapjack form, and fills the house with a fresh yeasty smell as Cliff flips them over the gas stove. Cliff has cooked pancakes for hundreds of visitors to Shishmaref – he has the guest books to prove it. Scientists from Japan, hunters from Texas, journalists from Norway, you name it and they’ve sat at Cliff’s breakfast table for a tall stack of sourdough jacks and a strong cup of coffee.

I sat at that kitchen table on a brisk August morning in 2016 with my research partner, Cliff’s girlfriend, and Cliff. The Summer Olympics were playing on a small TV in the corner and freshly picked buckets of berries crowded the floor space waiting to be frozen for winter as we tucked into steaming plates of pancakes.

They were delicious, but we weren’t there for the sourdough jacks. We had come to Shishmaref as part of a year-long research project to understand how erosion and sea level rise are affecting communities across the United States and US Territories.

Through interviewing hundreds of Americans from Alaska to American Samoa, our aim was to identify what is needed at the national level to support towns in need of relocation away from America’s eroding edges.

A few days before Cliff invited us for breakfast, the village of Shishmaref decided in a 94 to 78 vote to relocate in full from its current site onto the Alaska mainland five miles away. Shishmaref sits on a narrow barrier island in the Chukchi Sea. At points, the island measures barely a quarter mile wide.

Shishmaref has been losing land to the sea from natural erosion trends for hundreds of years. But with climate change, that natural trend is getting a lot worse.

The U.S. Army Corps of Engineers constructed a rip rap sea wall to protect much of Shishmaref from 2005 to 2009. The project is the latest in a number of sea walls constructed to try to slow the rate of erosion on Sarichef Island. Photo: Eli Keene

Relocation is now

Climate change is amplified in the Arctic, with air and sea temperatures warming twice as fast as in most other places on Earth, and Alaska is no exception. A primary reason for the amplification is the surface albedo feedback: the melting of snow and ice turns white, sunlight-reflecting surfaces into darker, heat-absorbing ones.

All that heat is melting ice and thawing permafrost (frozen ground) at an unprecedented pace. In normal years in Shishmaref, an icepack usually develops around the island in the fall months. This ice has always acted as a buffer against severe storm surges, forcing waves to break miles offshore instead of against the village.

As the ice disappears, so too does this natural defense.

This loss of ice, combined with the effects of thawing permafrost softening the very land the village is built upon, has resulted in a loss of three to five feet of shoreline per year, with a single severe storm able to wash away 50 feet of land. Storms caused such severe erosion in 1997 and 2002 that some homes fell into the ocean and several more had to be moved.

Talking to Cliff about the recent vote, it’s clear that he’s had this conversation many times before. At 74, he’s witnessed a lot of talk about relocation, including an effort by the community to relocate in 2002 which was later abandoned due to lack of funding.

“When people asked how did I vote, I say, ‘I know we’re not going to get funding from the state or government so I voted to stay,’” he tells us. “I don’t know [if they’ll find funding]. It’s gonna take a long time. They have to go back and test the soil. Last time they had someone from the government drill test every mile. There was three feet of soil and two feet of ice there, so that wasn’t stable. Now they have to work on building the road. It ain’t going to be easy.”

It’s easy to fault Cliff’s vote to protect in place rather than move Shishmaref. An Army Corps of Engineers Alaska Village Erosion Technical Assistance program assessment in April 2006 estimated that Shishmaref had 10 to 15 years before its current site would be lost to erosion. And while a recently built gabion seawall will buy the village time, the threat of inundation is imminent and inevitable.

But it’s been 10 months since I was last in Shishmaref, 10 months since their vote to relocate, and no money has materialized to help them move.

Acting City Clerk, Tiffany Magby, reads out the final votes cast in Shishmaref’s 2016 referendum on relocation. Photo: Eli Keene

Lacking federal support for Alaska  

The cost of relocating Shishmaref in full is estimated at $180 million. And Shishmaref isn’t alone. In Alaska, 31 villages have been identified by the Army Corps as facing an imminent threat of becoming uninhabitable. In Louisiana, Washington, Virginia, and Florida, coastal communities are already having difficult conversations about when managed retreat inland should become their climate change adaptation strategy.

At present, there are at least 13 towns and villages in America that have decided to relocate in part or in full due to the effects of climate change.

These towns may be the first to relocate from rising tides – but they won’t be the last.

UCS’s recent report When Rising Tides Hit Home calculates that within 45 years, by 2060, more than 270 coastal US communities – including many that seldom or never experience tidal flooding today – will be chronically inundated given moderate sea level rise. By the end of the century, that increases to 490 communities, including 40 percent of all East and Gulf Coast oceanfront communities.

At the onset of our project, the aim was to pinpoint particular policy and funding solutions to encourage federally supported, locally implemented climate-induced relocations that would feed into work already being supported by the White House.

We had hoped to provide our findings to an interagency working group on community-led managed retreat and voluntary relocation that President Obama established to develop a framework and action plan for managed retreat.

I wish that this intended goal was still possible. Unfortunately, it is not.

It is clear that the Trump Administration is not interested in protecting the American citizens in Shishmaref, or in any other coastal town across our country, from the effects of climate change we can no longer avoid. His proposed budget plan eliminates key programs for coastal adaptation research and capacity building like the National Sea Grant College Program; zeros out the budget for the Denali Commission, the independent federal agency mandated to facilitate climate-induced relocation in Alaska; and cuts dozens of EPA programs, including infrastructure assistance to Alaska Native villages.

While these actions can be demoralizing, we cannot give up on pressuring this Administration to act on climate adaptation and relocation. President Trump may not believe in protecting American citizens from the impacts of a warming world. But there are hundreds of civil servants and scientists who are still dedicated to helping those in need.

Civil servants like Joel Clement, former director of the Office of Policy Analysis at the U.S. Interior Department. Joel has worked for seven years to help endangered communities in Alaska like Shishmaref prepare for and adapt to climate change.

But last week, Mr. Clement was reassigned to an ill-fitted role in the Office of Natural Resources Revenue as retaliation for speaking out publicly about the dangers that climate change poses to Alaska Native communities. As he says in a recent op-ed in the Washington Post, “During the months preceding my reassignment, I raised the issue with White House officials, senior Interior officials and the international community, most recently at a U.N. conference in June.”

Federal scientists like Joel need our support and advocacy now more than ever before.

We must stand up for science and hold the Trump Administration accountable for silencing civil servants and keeping them from doing their jobs. That means calling on our elected officials to join together in supporting empowered communities like Shishmaref as they rise above the tides of climate change.

Shishmaref, as seen from just south of the sea wall. Photo: Eli Keene

Solutions for Alaska

Republican Senator Lisa Murkowski of Alaska recently spoke in Juneau, the state’s capital, about the need for climate change action “because we see it here in this state and it is real and I think we’ve got an obligation to help address it.” And last year, Senator Murkowski supported President Obama’s request for $400 million “to cover the unique circumstances confronting vulnerable Alaskan communities, including relocation expenses for Alaska Native villages threatened by rising seas, coastal erosion, and storm surges” in his final budget request to Congress.

Senator Murkowski’s proposed Offshore Production and Energizing National Security Act of 2017, or the OPENS Alaska Act of 2017, primarily aims to increase offshore oil production. But the bill would also direct 12.5 percent of the revenues from offshore development to a newly established Tribal Resilience Program, promoting resilient communities through investments in energy systems and in critical infrastructure to combat erosion and improve health and safety.

Alaska Native communities on the frontlines of climate change need Senator Murkowski to be much more of a leader on this issue. They need her to educate her senate colleagues on the impacts they are already facing, and champion a strong, well-funded national climate relocation framework.

We also need to begin a conversation about non-governmental solutions to supplement federal and state action on climate-induced relocation in America. We must call on all sectors – private, non-profit, volunteer, philanthropic – to join together in supporting empowered communities like Shishmaref as they rise above the tides of climate change.

For example, the Rockefeller Foundation’s 100 Resilient Cities initiative, launched in 2013, aims to help cities around the world become more resilient to the physical, social, and economic challenges that are a growing part of the 21st century.

Another example is Community Engineering Corps, a partnership between Engineers without Borders, the American Society of Civil Engineers, and the American Water Works Association to bring underserved communities and volunteer engineers together to advance local infrastructure solutions in the United States.

NGOs with legal expertise or pro bono divisions of large law firms could provide partnerships to help towns navigate the legal challenges of retreating inland. And places like the National Trust for Historic Preservation and the Society for American Archeology could help to ensure that cultural heritage, historic sites, and local traditions are included in the relocation road map, effectively protecting them, or documenting them with dignity when saving them is not possible.

Edwin Weyiouanna, an award-winning carver from Shishmaref, works on a piece of moose antler. Photo: Eli Keene

Cultural diversity for climate resilience

Ultimately, when I think of the future of coastal communities as seas rise, I don’t think of DC; I think of Cliff and the fresh yeasty smell of sourdough flapjacks for breakfast.

Maybe it’s strange that sourdough is the first thing that comes to mind when I think of the impacts of climate change on America.

At first glance, the biggest losses from climate change are the ones we can see: the house fallen into the ocean, the flooded streets after the hurricane, the disappearing edges of America on the maps of our country. These tangible images are what we recall when we think of what stands to be lost as the seas rise.

But there are some things that can’t be rebuilt – the place where you learned to plant tomatoes with your grandfather, its soil now too salty to grow vegetables. The historic buildings that have stood for centuries, now under water. The identity of your town as a seaside community and the close-knit bonds within it that let you ask your neighbor to water your plants when you go on vacation.

The unique taste of sourdough that’s been living on Shishmaref for 96 years.

Residents learn to make traditional handicrafts from seal skin at a free workshop. Photo: Eli Keene

This – these local cultures and heritage – is what the hundreds of people I’ve interviewed spoke about when asked what they are afraid of losing to encroaching seas. And it’s what I think about when President Trump slashes all funding support for protecting American communities from climate change.

The mass loss of history and cultural diversity may seem less important than the billions of dollars of infrastructure damage climate change will cause. But losing our cultures and histories isn’t just about losing part of who we are. It’s also about losing part of our ability to adapt to a warmer world.

Just as the biodiversity of plants and animals improves the resilience of ecosystems, cultural diversity offers a resilient knowledge base for adapting to and counteracting the effects of climate change.

Learning from traditions and history has always been an important part of envisioning a better future. Using cultural practices that have been passed down from generation to generation to adapt to a changing climate is no different.

Cliff’s sourdough passed down from his parents may not help in Shishmaref’s adaptation, but the lessons they passed down about reading the safety of ice conditions will.

Coastal communities across America already have the vision and multigenerational knowledge to adapt to the effects of climate change. What they do not have is time to waste on an inactive government. They need the financial support and technical tools to implement their vision. Those of us in privileged positions need to pressure the Trump Administration and Congress to take the issue of relocation seriously before it’s too late.

Victoria Herrmann is the principal investigator for America’s Eroding Edges, a research and storytelling project on the impacts of climate change on coastal communities’ livelihoods and cultures. She is also the President & Managing Director of The Arctic Institute and a Gates Scholar at the Scott Polar Research Institute at Cambridge University.

Electric Cars Are Critical to a Clean Future


Electric vehicles (EVs) are an important part of how we will reduce climate-changing emissions, air pollution, and petroleum consumption. Are they the only way we will cut pollution from personal transportation? Of course not. EVs are critical, but we’ll also need to be smart about using urban design, transit, and shared mobility to reduce the amount of driving from all vehicles. However, a recent U.S. News & World Report article puts EVs in a false competition with these other strategies, while also repeating myths about the environmental impacts of EVs.

EVs reduce emissions now

On average, EVs on the road today produce less global warming emissions than the average new gasoline car.

The emissions do depend on where in the U.S. an EV is used, because electric power generation comes from different sources depending on the region. Because many EVs have been sold in regions with cleaner power (like California), the EVs being used today are, on average, responsible for fewer emissions than any gasoline-powered car.

Based on sales through 2016, the average EV is responsible for global warming emissions equal to those of a 73 MPG gasoline car.
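As a rough illustration of how an MPG-equivalent figure like this is derived, consider a back-of-the-envelope calculation (the numbers below are illustrative assumptions, not the values used in the UCS analysis):

```python
# Back-of-the-envelope MPG-equivalent for an EV, given a grid's emission
# intensity. All values are illustrative assumptions, not the figures
# from the UCS analysis.

GASOLINE_G_CO2_PER_GALLON = 11_000  # assumed lifecycle emissions per gallon
EV_KWH_PER_MILE = 0.30              # assumed EV efficiency

def mpg_equivalent(grid_g_co2_per_kwh: float) -> float:
    """MPG a gasoline car would need to match the EV's per-mile emissions."""
    ev_g_co2_per_mile = grid_g_co2_per_kwh * EV_KWH_PER_MILE
    return GASOLINE_G_CO2_PER_GALLON / ev_g_co2_per_mile

# A cleaner grid yields a higher MPG-equivalent.
print(round(mpg_equivalent(500)))  # relatively dirty grid -> 73
print(round(mpg_equivalent(250)))  # cleaner grid -> 147
```

The takeaway is the one the article makes: the same EV gets "cleaner" as the grid it charges from does.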

EVs are still responsible for fewer global warming emissions even when you consider the additional energy and materials needed to manufacture the batteries that power them. We found that these extra emissions are quickly offset by savings during use, on average within 6 to 18 months.

There are also other concerns mentioned in passing in the U.S. News article, such as the impact of mining for battery raw materials. But the negative impacts from raw material extraction are largely due to lax regulations and can be addressed through better policy and corporate responsibility.

For components like cobalt and rare earth metals, all high-tech consumer product companies need to ensure that they have environmentally responsible supply chains that also protect the rights and health of those impacted by mining. This is as true for Apple and Samsung as it is for EV manufacturers.

There have been positive developments from battery suppliers and technology companies, but they can and should do more to ensure responsible battery production.

At the same time we also need to consider the negative impacts of gasoline production, from human rights abuses to massive environmental disasters during oil extraction, to the unavoidable air pollution damage from refining and burning gasoline in our cars. All our personal transportation fuels – gasoline, diesel, biofuels, or electricity – can be cleaner if fuel producers are held accountable to reduce their pollution.

Moving to EVs faster will help to reduce emissions even more

Another attack on EVs in the U.S. News article is that EVs only make up a small fraction of the vehicles on the country’s roads today. This is true, but is not a reason to turn back. The first mass-market EVs only went on sale at the end of 2010. From those two models (Chevrolet Volt and Nissan LEAF), the market has now grown to some 30 EV models available today. However, many of these EVs are not sold nationwide and are not marketed effectively.

In one notable case, Fiat Chrysler has decided not to even let customers outside of California know that its new minivan comes in a plug-in version.

Still, EV sales are increasing and hitting new milestones, especially in places with strong regulations and incentive programs like California, where manufacturers have also put much more effort into selling EVs than in the rest of the U.S.

In the first quarter of 2017, EV sales in California were nearly 5 percent of all new car sales and for some manufacturers were much higher. For example, General Motors’ Chevrolet brand had plug-in cars make up over 15 percent of all new sales in the first 3 months of 2017.

Having more options for new car buyers to pick a plug-in car will only help make the market grow. And it’s important for the market to grow as quickly as possible. Because cars often stay on the road more than a decade, it’s critical to speed up the transition from petroleum to electricity.

The future is electric, but also needs shared transportation

The future of driving is electric. It’s not just our opinion at UCS; both car companies and governments realize that EVs are the future. CEOs of Ford and VW have gone on record with predictions of high-volume EV sales. And France, Norway, and India are among the countries that have set impressive goals to transition to EVs.

But EVs alone aren’t enough to meet our climate goals. It’s important to also reduce the impact from transportation by reducing the number of miles we drive, even from electric cars. Shared transportation, whether via transit, carpools, or new ridesharing services, will also be important to make significant reductions in pollution. But this is in no way in competition with EVs. Instead, EVs are complementary to many of these shared transportation options.




Seize the Day: RGGI Leadership More Important Than Ever


A pioneering program to reduce power plant emissions in the Northeast is poised to enter a new phase. Here’s why the nine states of the Regional Greenhouse Gas Initiative need to make as bold a step forward as possible—and how they can make it happen.

The RGGI states in the Northeast and Mid-Atlantic—Connecticut, Delaware, Maine, Maryland, Massachusetts, New Hampshire, New York, Rhode Island, and Vermont—were right to lead the nation in addressing carbon pollution from power plants when they launched the program in 2009. And they were right to strengthen RGGI when they conducted the first program review in 2012.

RGGI has yielded clear results, as shown in analyses by the program itself and by outside analysts.

Now the states are nearing the finish line of the second RGGI program review. And the need for stronger action by this important collection of states is even clearer.

Why it’s important

This is a time of incredible momentum in clean energy, with cities, states, utilities, companies, individuals, and more embracing so much of what’s possible. Technologies, policies, and actions are leading us to new heights on energy efficiency, renewable energy, and clean cars.

This is also a time of incredible need. With the Trump administration abdicating on climate leadership in pulling out of the Paris climate agreement, and working to kill the first-ever federal regulations on power plant pollution, the Clean Power Plan, regional leadership on climate and energy is more important than ever.

Technologies mean RGGI states can make much more progress, and quickly. (Credit: J. Rogers)

What RGGI state leaders must do

So this program review represents an incredible opportunity to strengthen RGGI. The Union of Concerned Scientists and its allies are calling for strengthening in several key areas:

A stronger target – RGGI’s defining characteristic is its declining regional cap on power plant carbon emissions. The region has a history of defining the cap much higher than circumstances—emissions and trajectories—warrant. And the emissions targets as currently set will not get the states the long-term emissions reductions that many have set via legislation, and that many of the region’s governors have committed to collectively.

In a recent letter to the RGGI governors, our coalition called on them to have the cap levels:

…reflect actual emissions levels and trends at its start, including emissions reductions that have outpaced earlier projections, and align with state and regional [greenhouse gas] targets in 2030.

What that means is something considerably stronger than the 2.5% annual tightening of the cap that has been in place since the program’s launch—at least 3.5% seems warranted—and an extension of the cap through at least 2030. The RGGI states, their utilities, innovators, and customers have proven their ability to cut emissions cost-effectively, and the program review is a chance to harness the power of that collective action.
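To see why the annual rate matters so much, here is a quick compounding comparison (the ten-year window and starting cap are hypothetical; the actual cap trajectory is set by the program review):

```python
# Illustrative comparison of cumulative cap reductions under different
# annual tightening rates. The ten-year window is a hypothetical
# 2021-2030 period, not the official RGGI schedule.

def remaining_cap_fraction(annual_decline: float, years: int) -> float:
    """Fraction of the starting cap left after compounding annual declines."""
    return (1.0 - annual_decline) ** years

for rate in (0.025, 0.035):
    left = remaining_cap_fraction(rate, 10)
    print(f"{rate:.1%} per year -> {1 - left:.0%} total reduction")
# 2.5% per year -> 22% total reduction
# 3.5% per year -> 30% total reduction
```

One percentage point per year compounds into a substantially deeper cut by the end of the decade, which is the coalition’s argument for at least 3.5%.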

Stronger complementary components – The program review is also an important chance for the RGGI states to strengthen other pieces of the program to help it live up to its true potential. That includes dealing with the surplus of emission allowances now in the hands of power plant owners; making sure that the protection against too-high prices is truly used for emergencies, not little price bumps; taking advantage of low allowance prices to move ahead more quickly; and making sure allowance prices don’t go too low. (See here for more details about each of these opportunities.)

Potential expansion – The successful RGGI framework could be expanded to other states (New Jersey, you’re welcome back anytime), or to other sectors such as transportation.

Stronger commitment to environmental justice – Last, but far from least, the program review offers an important opportunity not just to strengthen the program, but to ensure that RGGI’s benefits reach those who need them most—through consultation and through smart allocation of allowance revenues, for example. As our coalition letter says:

…the RGGI states must ensure that communities on the frontlines of the impacts of pollution and climate change have a say in how RGGI is implemented and how funds are distributed to ensure broad and equal opportunities to experience RGGI benefits. Strengthening the RGGI program in communities that bear the biggest burden of pollution is critical.

Bolder, stronger, further, fairer

The decision makers in the nine RGGI states have a golden opportunity to show real leadership on climate and energy, to put in place the stronger policies and targets that this time in history demands, and to seize all the rewards that come from boldness in climate action.

It’s up to you, RGGI leaders. Seize the day.

Remembering Herb Needleman—The Hero Who Got Lead Out of Gasoline


Dr. Herb Needleman, a Pittsburgh pediatrician whose pioneering research into the toxic effects of lead on children led to the removal of lead from gasoline and other products, died last week at the age of 89. He was a tireless advocate for children’s health in the face of persistent attacks on his work and integrity from the lead industry. A decade ago, he showed up in my life in a pretty unexpected way.

In 2006, UCS brought scientists to Washington, DC to talk to legislators about the manipulation and suppression of science, and the consequences that has for public health and the environment. We recruited scientists from our Sound Science Initiative, the precursor to the UCS Science Network, and from the list of experts who had signed the scientist statement on scientific integrity that called on the Bush administration to restore scientific integrity to federal policymaking.

Dr. Needleman’s research transformed our understanding of the impact of lead on children’s developing brains, and led to the removal of lead from gasoline. Photo: NIEHS

Dr. Needleman was on that list of signers. Yet we clearly hadn’t done our research. We had no idea who he was. I only knew him as a pediatrician with a low voice and large glasses.

There was a lot behind those glasses. Others have written extensively about his experiences, including in The Lead Wars and this extensive interview, better than I ever could. But briefly, Dr. Needleman conducted extremely novel research in which he collected and measured lead levels in children’s primary teeth, demonstrating a correlation between intellectual development and exposure to lead, even at low levels.

A 1979 study he published in the New England Journal of Medicine transformed the field, set the stage for restrictions on lead in gasoline, and put a target on his back.

He should have been celebrated. He was not. “He was attacked by the lead industry, hounded by columnists, snooped after by hired investigators, had his files endlessly combed over by high priced consultants, and was indifferently supported by many of his colleagues at his university,” remembered Richard Jackson, former director of the CDC’s National Center for Environmental Health.  

For years, the lead industry and affiliated individuals did their best to tarnish his reputation and take him down through unfounded allegations of scientific misconduct. At times, he could not even depend on his university to play fair. Via The Lead Wars:

“The attacks on Needleman’s research and on his scientific integrity culminated in 1991 when Claire Ernhart and Sandra Scarr filed charges with the Office of Scientific Integrity at the National Institutes of Health, alleging that Needleman had engaged in scientific misconduct…They demanded, and received, another inquiry, which led Needleman’s own university, the University of Pittsburgh, to begin another investigation of his research. It was a ‘horrible’ period in his life, Needleman recalls. The university refused to allow him to bring in outside experts, though it called on others who had previous professional relationships with his accusers. The university initially refused to open the hearings to the public; it took a petition campaign from scientists around the country to persuade the campus officials otherwise. He was shunned by colleagues who worried about being associated with him. But in 1992, despite some methodological criticisms, Needleman was vindicated of wrongdoing by the university and, similarly, three years later by the Office of Research Integrity of the Department of Health and Human Services.”

Dr. Needleman, ever determined and courageous, emerged from these battles with his professional reputation intact. And eventually, he ended up in a UCS conference room. Here was a man who had testified before Congress, been attacked repeatedly in multiple venues, and seen the way that science is politicized at the expense of children. A titan. He could have trained us. Yet he came to Washington DC and patiently listened to our basic explanations of how the Bush administration was manipulating and suppressing science, and how we needed to educate Congress about the importance of providing oversight.

Dr. Needleman at a 1991 congressional hearing. Photo: C-SPAN

It turned out that I was assigned to accompany Dr. Needleman and other scientists to meet with lawmakers. In each meeting, when it was his turn, he spoke softly but firmly about how as a doctor he relied on the government to provide guidance about the safety and efficacy of drugs. He talked about the importance of protecting children from harmful contaminants, and about how he had witnessed the consequences of politics getting in the way of protecting children from lead poisoning in his earlier years.

Not once did he discuss his pivotal role in keeping millions of children—including me—safe. He didn’t want to take over the show. He wanted the story to be about public health and the environment.

It was months later when I came across his name while doing some research. The more I read, the more foolish I felt for not recognizing the legend among us. Yet I also felt honored to have had the opportunity to cross paths. And I felt thankful that such an accomplished man was still committed to fighting for what was right.

I often refer to Dr. Needleman in talks to help make the point that political and industry pressure on scientists is not new, and that many have pushed alternative facts for years to further narrow agendas at the public’s expense. I hope that people will continue to study his life and remember that we can persevere in the face of extremely powerful interests to make the lives of others better.

Six Months into the Trump Administration: Science and Public Health Under Siege

UCS Blog - The Equation (text only) -

My colleagues at the Union of Concerned Scientists (UCS) have released a report on how science—and public health—have been sidelined during the first six months of the Trump administration. The report documents a deliberate and familiar set of strategies that undermines the role of science, facts, and evidence in public policy and decision-making.

From a public health perspective, the short- and long-term impacts are truly frightening. The Trump administration—aided and abetted by a willing Congress—is actively pushing an ideological, anti-science agenda that will profoundly affect the health, safety, and security of children, families, and communities today, tomorrow, and for decades to come.

They claim their approach is pro-business, but on closer look, that isn’t true. It harms the many good business people who want to play by the rules and make a profit without harming the public or their workers. How? By giving an unfair advantage to unscrupulous businesses that will put profit ahead of public and worker safety and health.

Control, Alt, Delay: Public health protections on the chopping block

Mercury, lead, arsenic, ozone, beryllium, silica, chlorpyrifos. These substances all have several things in common:

  1. They have all been found to contaminate our air, water, soil, and/or food, as well as some of our workplaces and community environments.
  2. Robust and often long-standing science has proven that exposure to them can cause serious health effects, including death.
  3. Government agencies charged with protecting our health and safety have established rules and standards to prevent or minimize our exposure to them. (Note: these and other public health safeguards are increasingly denigrated as unnecessary regulations by the Trump administration and some in Congress.)
  4. Exposure standards established years ago have been found to be insufficiently protective.
  5. The Trump administration has taken steps to weaken, delay, and subvert recent science-based safeguards that enhance public protection from these toxic substances.

Make no mistake. There is an all-out assault on the agencies charged with using independent, unconflicted science to protect our nation’s public health—and on the critical resources and infrastructure they need to do just that.

The proposed draconian cuts to budgets, staffing levels, and programs at agencies like the EPA, CDC, FEMA,  NOAA, USDA, and OSHA speak for themselves. (And don’t even get me started on how current congressional efforts to reform health care will impact the health of our most vulnerable populations.)

But the real issue isn’t about protecting agency budgets or staff levels, essential as they are. It’s about protecting all of us from known (and emerging and future) threats to our health, safety, and well-being. What follows is just a snapshot of this administration’s siege on public health.

Children and women first? Not so much

Informed by a wealth of scientific evidence, and putting the health of children first, the EPA banned the indoor use of the pesticide chlorpyrifos back in 2000.

Chlorpyrifos, a potent neurotoxin, is known to affect brain development and cause developmental delays in children exposed in their homes, through their diets, and through their mothers in utero. It has also been shown to sicken farmworkers who apply it, and to contaminate drinking water supplies in farming communities.

In 2015, the EPA announced that it would revoke all tolerances for its use on or in food—essentially banning its use—noting that it was unable to find a safe level of exposure. On March 29, 2017, less than three months into the Trump administration and 20 days after meeting with the CEO of Dow Chemical, EPA Administrator Scott Pruitt rejected the advice of his  agency’s own chemical safety experts and reversed this decision—ironically noting that “By reversing the previous administration’s steps to ban one of the most widely used pesticides in the world, we are returning to using sound science in decision-making—rather than predetermined results.”

The American Academy of Pediatrics denounced the decision. In its June 27, 2017, letter to Mr. Pruitt, the medical association called the risk of chlorpyrifos to infant and children’s health and development “unambiguous” and urged the EPA to listen to its own scientists and go forward with the proposed rule to end its uses on food.

The next time EPA is required to review the safety of the chemical is five years out—2022 to be exact. Until then, children will be eating peaches, pears, broccoli, and other foods grown with chlorpyrifos, as will pregnant women who are unknowingly putting their babies at risk.  Farmworkers and their families will also continue to be exposed—but hey, Dow Chemical, and the smaller manufacturers, will have “regulatory certainty.”

A West Virginia coal miner sprays rock dust 900 feet underground. OSHA has estimated that about 2.3 million workers are exposed to respirable crystalline silica. Photo: courtesy of NIOSH

Protecting our nation’s workforce: Not in the cards

The assault on worker health is one that has particular meaning to me, having spent decades of my life specifically focused on occupational health and safety. And when I think about how long (decades) it takes to promulgate standards to protect worker health—even when their dangers have been known for eons—I am truly astounded by what the current administration and Congress have done.

Take silica. Its devastating health effects have been known for centuries. As far back as 1556, in his Treatise on Mining, Agricola described a pulmonary disease afflicting stone cutters and miners. Bernardino Ramazzini, known as the father of occupational medicine, wrote about respiratory symptoms and sand-like substances in the lungs of stone cutters in his seminal 1700 work De Morbis Artificum Diatriba (Diseases of Workers). The 1936 Hawks Nest Tunnel Disaster at Gauley Bridge in West Virginia—one of the worst industrial disasters in US history—is as harrowing a tale of occupational exposure to silica as one would ever want to read about.

Silica causes lung cancer and silicosis, a disabling, non-reversible, and sometimes fatal lung disease, which can develop or progress even after exposure has ceased. OSHA has estimated that about 2.3 million workers are exposed to respirable crystalline silica, including 2 million construction workers who drill, cut, crush, or grind silica-containing materials such as concrete and stone, and 300,000 workers in general industry operations such as brick manufacturing, foundries, and hydraulic fracturing. Like most occupational illnesses and injuries, silicosis is preventable.

In keeping with scientific evidence, in 2011 OSHA sent a proposed tightening of its 40-year-old silica standard to the Office of Management and Budget. In 2013, OMB gave OSHA the green light to actually propose the new rule, which OSHA promulgated as a final rule in March 2016, with an effective date of June 23, 2016. Industries were given different timelines to comply with most requirements—one year for construction, two years for general industry, and five years for hydraulic fracturing. In its fact sheet on the final rule, OSHA noted that “Many employers are already implementing the necessary measures to protect their workers from silica exposure. The technology for most employers to meet the new standards is widely available and affordable.”

So call me gobsmacked when in April 2017 OSHA decided to delay enforcement in the construction industry by another three months. That may not sound like a lot, but tell that to construction workers who may already have been breathing dangerous levels of silica dust for years. And call me crazy, but my confidence in OSHA sticking with even that schedule is somewhat shaken.

You’ll see why when you see how the administration has turned a two-month delay in implementing its new protective standards for workers exposed to beryllium into a proposal to “modify” (read “weaken”) protections for workers exposed to beryllium in construction and shipyards. I won’t rehash it here, as I’ve already covered it here and here.

The EPA gives states a breather on ozone. People will suffer the consequences.

The EPA has been regulating ozone as a criteria pollutant since National Ambient Air Quality Standards were established in the Clean Air Act in 1970—and with considerable success.

Levels of ground-level ozone, the main component of smog, have declined over time. Though many people still live in areas with unhealthy levels of ozone pollution, the decline is evidence that air pollution requirements are working.

Good thing, as the scientific evidence that ground-level ozone has serious health impacts is beyond dispute. It increases the frequency of asthma attacks, can cause chronic obstructive pulmonary disease (COPD), and worsens bronchitis and emphysema. It can increase risk of lung infections. And it has been associated with early deaths from cardiovascular disease. Children, the elderly, and those with respiratory and cardiovascular disease are especially vulnerable.

For some current context:

  • The American Lung Association reports that more than one-third (36 percent) of the people in the United States live in areas with unhealthy levels of ozone pollution. Approximately 116.5 million people live in 161 counties that earned an F for ozone in this year’s report.
  • CDC national and state surveillance systems estimate that 8.4% of children under the age of 18 have asthma and that 4.7% of children between 0 and 4 years of age currently have asthma. And the number of reported missed school days among children with asthma was 12.4 million in 2003, 10.4 million in 2008, and 13.8 million in 2013.

Nearly five years ago, in 2013, scientists on the agency’s Clean Air Scientific Advisory Board recommended that the EPA tighten its ozone standard, which was 75 parts per billion (ppb) at the time. The experts recommended a range of 60-70 ppb, while noting that the upper end of that range was unlikely to provide the adequate margin of safety required by the Clean Air Act; that is, a 70 ppb standard was likely not protective enough of public health.

The politics involved in lowering the standard were fraught (no surprise here), with courts eventually weighing in. After years of foot dragging, the EPA issued a final rule (with a 70 ppb standard) in late 2015. In June 2017 the EPA decided to give the states another year to comply with the long-awaited standard, citing the increased regulatory burden and increased costs to business. But it’s not like states and businesses have not seen this coming. It’s been years in the making. Indeed, the EPA’s Clean Air Scientific Advisory Board first recommended a range of 60-70 ppb back in 2007!

In the meantime, it’s people that will take the hit. Too many kids and others will continue to break out their inhalers, visit emergency rooms, and lose time at school and at work. The delay may work for some interests, but certainly not for public health.

And the latest: On July 18, the House of Representatives approved a bill that would delay enforcement of the EPA ozone standard until the middle of the next decade, giving companies years of additional reprieve on complying with the new ground-level ozone (a.k.a. smog) health standards. The bill also permanently alters the Clean Air Act’s timetable for updating air quality safeguards. It will likely face opposition in the Senate, but it is just the latest signal that our air quality and public health protections are under attack. (A similar provision has also been included in the House Interior spending bill as an ideological rider.)

Heavy metal:  Nothing to sing about

The serious and permanent consequences of lead exposure on children’s brains, development, and behavior have been known for decades. Its effects cannot be corrected; they will affect kids’ lives forever. Photo credit: CDC

It doesn’t take a toxicologist to know that heavy metals—like mercury and lead, and metalloids like arsenic—are not good for your health. Their ill effects have been known for hundreds of years.

In the 1800s, mercury exposure caused mad hatter’s disease in workers making felt hats. Industrial wastewater containing methylmercury was the source of two devastating outbreaks of neurological disease in Minamata, Japan, in the 1950s and 1960s. And today, it’s common knowledge that eating fish and shellfish from mercury-contaminated waters poses dangerous risks to the developing fetus.

When it comes to lead, its serious and permanent effects on children’s brains, development, and behavior have been known for decades; even low levels have been shown to affect IQ, ability to concentrate and pay attention, language and communication fluency, and academic achievement.  And effects of lead exposure cannot be corrected; they will affect kids’ lives forever. (Lead toxicity has been known since ancient times; it was described by the Greek physician Nicander as far back as the second century B.C.)

Government regulations eliminated lead in household paint in 1978 and in gasoline in the late 1980s. The ongoing crisis in Flint brings lead exposure to the present day. The CDC estimated that more than a million US children had lead poisoning when, in keeping with the science, it lowered its definition of poisoning to 5 micrograms of lead per deciliter of blood in 2012.

Federal and state agencies provide lead screening, surveillance, and prevention programs, and continue to identify lead contamination in homes and children with unacceptable levels of lead in their blood. A quick web search will identify sources of state and even local data.  See, for example, here, here and here.

And about that metalloid (and human carcinogen) arsenic. In establishing its enforceable maximum contaminant level for arsenic in drinking water in 2001, the EPA reported that in the US, approximately 13 million individuals lived in areas where the concentration of inorganic arsenic in the public water supply exceeded its concentration limit of 10 micrograms per liter (μg/L).

You would think the established science about the hazards, and the current knowledge about populations at risk, would have the Trump administration actively pursuing public health protections in its desire to make America great again. Instead, we are seeing delays, repeals, and rollbacks.

In November 2016, Obama’s EPA issued a final regulation under the Clean Water Act to limit the amount of these and other toxic metals that power plants can release into public waterways. The EPA noted that due to their close proximity to these waterways and their relatively high consumption of fish, some minority and low-income communities face greater risk. According to the Waterkeeper Alliance, nearly 35% of coal plants discharge toxic pollution within five miles of a downstream community’s drinking water intake, and 81% of coal plants discharge within miles of a public drinking water well.

Flying in the face of solid scientific evidence on the need to protect the public and public waterways from these toxic pollutants and siding with the polluting industry, in April 2017 Trump’s EPA announced it would delay and reconsider this regulation. Mr. Pruitt actually said that the delay would be in the public interest. I’m still trying to get my head around that one.

In a related but somewhat earlier action, Congress beat the EPA to the punch in striking a blow related to these toxic materials. In February, using the Congressional Review Act, our elected representatives repealed the 2016 Department of Interior Stream Protection Rule that would safeguard streams and provide communities with basic information about water pollution related to mountain top removal coal mining.

Read enough?

Sorry that this “snapshot” of a blog has turned into quite a large collection of worrisome examples. Our new report covers even more Trump administration actions that will impact public and worker health—and not in a good way. It also suggests steps we and others can take to hold the administration accountable when it prioritizes bad actors in the private sector over public health.

Most important is that we not lose hope or become inured to or exhausted by what is likely to be an ongoing assault on science, public health, and some of the fundamental elements of our democracy. It will be important to call out and speak out against such actions (and inactions) when you see them. We will do our best to track them, but we also need and value your help. So when you see something, let us know.

The Union of Concerned Scientists is in it for the long haul.

Photo: Petra Bensted/CC BY (Flickr)

Six Selfish Reasons to Communicate Science

UCS Blog - The Equation (text only) -

First, a confession: I never meant to be a science communicator.

I’m an aerospace engineer specializing in fluid dynamics, the physics of how liquids and gases (and granular materials and pretty much anything that’s not a solid) move. As an undergraduate, I fell in love with the subject in part because of the incredible photos my professors used to help us see and understand how fluids behave. As a PhD student, I was frustrated by how little information there was online for the public to learn about this subject that impacts our daily lives.

From that frustration, my website FYFD was born as a place where I could share the beauty of my subject with the world at large.

An example of flow visualization, a technique physicists and engineers use to understand flows. Here fluorescent dye is painted on a model placed in a wind tunnel to reveal flow patterns. (Photo by NASA.)

Like many scientists, I began communicating science for selfless and altruistic reasons. But along the way, I learned there’s a lot to be gained for the communicator as well. So I’d like to share a few of the selfish reasons to communicate science.

The first one may seem a bit obvious, but engaging in science communication is a great way to hone your communication skills. Whatever path your career leads you down, those skills are key. Communicating science to the public, whether online or through local means, is generally a low-risk operation, but it’s an opportunity to practice and improve your skills so that when it really matters you can nail that job interview or research proposal.

Communicating science regularly can hone your skills for when the big moment arrives. (Comic by Jorge Cham/PhD Comics)

Participating in science communication regularly is also a great way to develop expertise in your subject area. When I started writing FYFD, it seemed like spending part of every day reading journal articles that had nothing to do with my research might be a waste of time. After all, learning the latest on how droplets splash was not going to help my work on high-speed aerodynamics. But toward the end of my PhD—after a few years of writing FYFD—I noticed that when professors and other students had questions that reached beyond our own area, the first resource they turned to was not Google Scholar—it was me.

The first time a professor asked me if I knew anything about the unexpected behavior they were seeing in an experiment, it was a revelation for me. I had unwittingly turned myself into an expert, not simply on the subject of my own research but on fluid dynamics in general. That broad familiarity with the field continues to be valuable today. It allows me to see connections between disparate studies and subjects, a skill that’s key to discovering new avenues for research.

If you choose to use science communication to raise awareness of your own work, it can help you gain exposure. A recent study showed that social media use can help increase a scholar’s scientific impact. It can also help you gain the notice of journalists, and there is evidence that media coverage of papers leads to more citations. Personally, my science communication efforts have almost exclusively highlighted the work of other researchers, but I have nevertheless benefited in terms of networking and new opportunities within my field.

A communicator’s excitement for a subject can galvanize their audience, as seen here when a post about unionid bivalves by the Brain Scoop’s Emily Graslie inspired Tumblr user artsyandnerdy to draw unionid fanart. (Image by artsyandnerdy, used with permission.)

Of course, setting up a Twitter account or a blog is no guarantee that you’ll start seeing your papers in The New York Times. Fortunately, that kind of audience isn’t necessary to see some personal benefits. One of my favorite aspects of science communication—especially in-person—is witnessing a positive-feedback loop of enthusiasm. When you’re genuinely excited about a subject, whether it’s fluid dynamics or unionid bivalves, that enthusiasm impacts your audience and can get them excited. Seeing that excitement in others simply reinforces your own enthusiasm.

Maintaining that reserve of enthusiasm for your subject is vital for motivating yourself when things are going poorly. As an experimentalist in graduate school, I faced a series of setbacks in my research, including spending half of the last year of my PhD rebuilding lab infrastructure instead of gathering data. We all periodically face moments when we ask ourselves: why the heck am I doing this? For me, spending a part of every day searching for a piece of my subject to share with the world was a chance to remind myself of what I love about fluid dynamics. Communicating science is an opportunity to see your field anew and renew your motivation to carry on in spite of the daily frustrations.

As you can see, there’s a lot to be gained, both personally and professionally, from engaging in science communication. If you’d like some resources or guidance on how to begin, UCS is a great place to start. AAAS also offers resources for scientists and your professional society may as well. For guidance to better online science communication, I recommend Science Blogging.

Good luck and remember to have fun!

Nicole Sharp is the creator and editor of FYFD, a fluid dynamics blog with a quarter of a million followers that has been featured by Wired magazine, The New York Times, The Guardian, Science, and others. Nicole earned her M.S. in aerospace engineering from Cornell University and her Ph.D. from Texas A&M University with experiments on the effects of surface roughness on airflow near a surface moving at Mach 6. She currently lives in Denver, Colorado, where she enjoys hiking, cycling, and skiing. You can find her online at @fyfluiddynamics.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Nuclear Plant Cyber Security

UCS Blog - All Things Nuclear (text only) -

There has been considerable media coverage recently about alleged hacking into computer systems at or for U.S. nuclear power plants. The good news is that the Nuclear Regulatory Commission (NRC) and the nuclear industry are not merely reacting to this news and playing catch-up to the cyber threat. The NRC included cyber security protective measures among the regulatory requirements it imposed on the nuclear industry in the wake of 9/11. The hacking reported to date seems to have involved non-critical systems at nuclear plants as explained below.

The bad news is that there are bad people out there trying to do bad things to good people. We are better protected against cyber attacks than we were 15 years ago, but are not invulnerable to them.

Nuclear Plant Cyber Security History

The NRC has long had regulations in place requiring that nuclear plant owners take steps to protect their facilities from sabotage by a small group of intruders and/or an insider. After 9/11, the NRC issued a series of orders mandating upgrades to the security requirements. An order issued in February 2002 included measures intended to address cyber security vulnerabilities. An order issued in April 2003 established cyber attack characteristics that the NRC required owners to protect against.

The orders imposed regulatory requirements for cyber security on nuclear plant owners. To help the owners better understand the agency’s expectations for what it took to comply with the requirements, the NRC issued NUREG/CR-6847, “Cyber Security Self-Assessment Method for U.S. Nuclear Power Plants,” in October 2004; Regulatory Guide 5.71, “Cyber Security Programs for Nuclear Facilities,” in January 2010; NUREG/CR-7117, “Secure Network Design,” in June 2012; and NUREG/CR-7141, “The U.S. Nuclear Regulatory Commission’s Cyber Security Regulatory Framework for Nuclear Power Reactors,” in November 2014. In parallel, the Nuclear Energy Institute developed NEI-08-09, “Cyber Security Plan for Nuclear Power Reactors,” in April 2010 that the NRC formally endorsed as an acceptable means for conforming to the cyber security regulatory requirements.

First Step: NANA

Anyone who has read more than one report about the U.S. nuclear power industry will appreciate that NANA was a key step in the road to cyber security regulations—Need A New Acronym. The nuclear industry and its regulator need to be able to talk in public without any chance of the public following the conversation, so acronyms are essential elements of the nukespeak. Many FTEs (full-time equivalents, or NRC person-hours) went into the search for the new acronym, but the effort yielded CDA—Critical Digital Assets. It was a perfect choice. Even if one decoded the acronym, the words don’t give away much about what the heck it means.

Finding CDA Among the NCDA, CAA, and NCAA

Armed with the perfect acronym, the next step involved distinguishing CDA from non-critical digital assets (NCDA), critical analog assets (CAA), and non-critical analog assets (NCAA, sorry college sports enthusiasts). Doing so is an easy three-step process.

Step 1: Inventory the Plant’s Digital Assets

The NRC bins the digital assets at a nuclear power plant into the six categories shown in Figure 1. Security systems include the computers that control access to vital areas within the plant, sensors that detect unauthorized entries, and cameras that monitor restricted areas. Business systems include the computers that enable workers to access PDFs of procedures, manuals, and engineering reports. Emergency preparedness systems include the digital equipment used to notify offsite officials of conditions at the plant. Data acquisition systems include sensors monitoring plant parameters and the equipment relaying that information to gauges and indicators in the control room as well as to the plant process computer. Safety systems could include the equipment that detects high temperatures or smoke and automatically initiates fire suppression systems. Control systems include process controllers that govern the operation of the main turbine or regulate the rate of feedwater flow to the steam generators (pressurized water reactors) or reactor pressure vessels (boiling water reactors). The first step has owners inventorying the digital assets at their nuclear power plants.

Fig.1 (Source: Nuclear Regulatory Commission)

Step 2: Screen Out the Non-Critical Systems, Screen in the Critical Systems

Figure 2 illustrates the evaluations performed for the inventory of digital assets assembled in Step 1 to determine which systems are critical. The first decision involves whether the digital asset performs a safety, security, or emergency preparedness (SSEP) function. If not, the evaluation then determines whether the digital asset affects, supports, or protects a critical system. If the answer to any question is yes, the digital asset is a critical system. If all the answers are no, the digital asset is a non-critical system.

Fig. 2 (Source: Nuclear Regulatory Commission)

Step 3: Screen Out the NCDA, Screen in the CDA

Figure 3 illustrates the evaluations performed for the inventory of critical systems identified in Step 2 to determine which are critical digital assets. The first decision involves whether the critical system performs a safety, security, or emergency preparedness (SSEP) function. If not, the evaluation determines whether the critical system affects, supports, or protects a critical asset. If the answer to any question is yes, the critical system is a critical digital asset. If all the answers are no, the critical system is a non-critical digital asset.

Fig. 3 (Source: Nuclear Regulatory Commission)

Remaining Steps

Once the CDAs are identified, the NRC requires that owners use defense-in-depth strategies to protect workers and the public from harm caused by a cyber-based attack. The defense-in-depth protective layers are:

  • Prompt detection and response to a cyber-based attack
  • Mitigating the adverse consequences of a cyber-based attack
  • Restoring CDAs affected by a cyber-based attack
  • Correcting vulnerabilities exploited by a cyber-based attack

The Power of One (Bad Person)

The NRC instituted cyber security regulatory requirements many years ago, and its inspectors have assessed how effectively the measures undertaken by plant owners conform to those requirements. Thus, the U.S. nuclear industry does not have to scramble to develop protections against cyber attacks in response to recent reports of hacking. The job instead is to ensure that the required protections remain in place and remain effective.

Unfortunately, digital technology can also broaden the potential harm caused by an insider. The NRC’s security regulations have long recognized that an insider might attempt sabotage alone or in conjunction with unauthorized intruders. In what the military terms a “force multiplier,” digital technology could enable the insider to attack multiple CDAs. The insider could also supply passwords to the outside bad guys, saving them the trouble of hacking and the risk of detection.

The hacking of computer systems by outsiders made news. The misuse of CDAs by an insider could make for far grimmer headlines.

We Fact-Checked a Bogus “Study” on Global Temperature That’s Misleading Readers

UCS Blog - The Equation (text only) -

Independent peer review of scientific research by qualified experts lies at the heart of progress in our understanding of how the natural world works. And posting proposed new scientific findings on the internet without peer review can lead to some wildly incorrect conclusions being promoted as true. Consider, for example, the document posted on the internet last month by James Wallace III, Joseph D’Aleo and Craig Idso.

In On the Validity of NOAA, NASA and Hadley CRU Global Average Surface Temperature Data, the authors purport to show flaws in the major adjusted global datasets used to track the recent historic increases in Earth’s global average surface temperature, arguing that “it is impossible to conclude from the three published GAST data sets that recent years have been the warmest ever – despite current claims of record setting warming.”

Wallace, D’Aleo and Idso offer up to the internet a document that points almost exclusively to dataset sources, ignoring virtually all peer-reviewed studies that examine the issues that they raise.

The authors also take several graphs out of context in an attempt to discredit these well-regarded global average surface temperature datasets that have been assiduously reviewed by multiple independent teams of technical experts and consistently found to be robust.

Normally, it is best to avoid overreacting to unsubstantiated claims about science posted on the internet.

But these are not normal times.

Leading officials of the Trump administration are making false claims about climate science and solutions and using cherry-picked “evidence” to justify their efforts to undermine climate policy.

I asked Rachel Licker, UCS senior climate scientist, to offer a technical review.  Here’s what she wrote:

There were so many egregious errors and unsubstantiated claims in this document that I cringed with discomfort that it could be mistaken for a peer-reviewed scientific study.

Licker offered a few examples of these errors that, taken out of context, could be confusing:

Embarrassing Error # 1:

The authors erroneously claim that the NASA, NOAA, and Hadley CRU global average surface temperature records all produce the same results simply because they use many of the same land-based weather stations as sources.

These datasets incorporate information from thousands of individual weather stations, ocean measurements, and satellite data. Each of these datasets incorporates as many high-quality temperature data sources as possible, including many in common. Then, each dataset is constructed and analyzed using different methods. Why? Because this is what scientists do to be confident about their results. Scientists test and re-test datasets to see if – using different methods and approaches – they get the same results as their colleagues working independently. I would not want to fly in a plane that had only been inspected once – would you?

Embarrassing Error # 2:

The authors falsely claim that the NASA, NOAA and Hadley CRU GAST records do not properly take into account factors such as urban heat islands and changes in the technologies used to measure land and ocean temperatures over time. They also falsely claim that each of the datasets has selectively biased results in order to exaggerate an upward trend in temperature.

In fact, it is well-established that these datasets do account for these and other factors needed to ensure consistent, comparable and accurate results. Researchers have repeatedly found that the methods used to account for these issues do not affect GAST records to any substantial extent. The size of global surface temperature increases swamps the noise associated with these known and well-studied factors.

Embarrassing Error #3:

The authors cherry-pick a few examples from the US as “evidence” to try to refute the well-documented increase in the global average temperature.

Of course, the NOAA, NASA, and CRU datasets include these regional variations. Bottom line: there is a pronounced increase in the global average surface temperature since pre-industrial times and such regional variations are to be expected.

Global Average Surface Temperature (GAST) trend, 1880-2016, annual record. Temperature change is reported relative to the long-term annual average (1910-2000). Source: NOAA

The near-complete lack of references to other scientific studies that examine their spurious claims and the extent to which the authors of this document take information out of context is, quite frankly, embarrassing.

Elected officials have a responsibility to reject such poor quality work, and instead rely on their own cornerstone institutions like the National Academy of Sciences, established by Abraham Lincoln and Congress in 1863, to provide “independent, objective advice to the nation on matters related to science and technology.”

As Ben Santer and colleagues pointed out in a recent Washington Post op-ed, “Only the most robust findings survive peer review and form the basis of today’s scientific consensus.” The peer review process exists to prevent such documents as the one at hand here from making their way into the public domain and being confused with real science.




An Administration Defined by Its Conflicts (and What That Means for Science and Policy)

UCS Blog - The Equation (text only) -

The first six months of President Trump’s time in office have consisted of a whirlwind of questionable governing decisions. From the outset, the Center for Science and Democracy established a baseline of the types of protections for science within the federal government that should be maintained by the Executive Office of the President; to say that the Trump Administration is not up to the mark would be a gross understatement.

Our report released this week, Sidelining Science Since Day One: How the Trump Administration Has Harmed Public Health and Safety in its First Six Months, illustrates the ways in which President Trump and his administration have actively weakened the ability of federal scientists to conduct critical research and issue science-based safeguards; restricted public access to scientific data and information; and rolled back important environmental and public health policies designed to protect clean water, safe workplaces, and wildlife.

What democracy is and isn’t

Since the Trump Administration took office, the balance between public and private control of government has been skewed in favor of industry interests, at the expense of the protections that keep us safe and healthy. Is this means of governance characteristic of a functional democracy?

A democracy at its roots is a government by the people and for the people. The US government was designed to support the public good and the will of all, not the will of a select few.

This conflict between serving all and serving the powerful few is one that all US presidents have faced during their tenure. Franklin D. Roosevelt spoke to Congress in April 1938 about the importance of curbing monopolies, saying:

“The first truth is that the liberty of a democracy is not safe if the people tolerate the growth of private power to a point where it becomes stronger than their democratic state itself. The second truth is that the liberty of a democracy is not safe if its business system does not provide employment and produce and distribute goods in such a way as to sustain an acceptable standard of living.”

He continued:

“We believe in a way of living in which political democracy and free private enterprise for profit should serve and protect each other—to ensure a maximum of human liberty not for a few but for all.”

Photo: UCS/John Finch

Strong science-based public protections and a thriving business community are not mutually exclusive. Yet, under the Trump Administration, important safeguards are being recklessly cut in the name of efficiency. By employing strategic counsel from former industry lobbyists and choosing to roll back, delay, or otherwise weaken rules that curb these companies’ ability to pollute and harm us, the Trump Administration is undermining the role of evidence, reason, and the will of the public within the democracy we have fought so hard to institute and maintain.

What ever happened to draining the swamp?

The Director of the Office of Government Ethics (OGE), Walter Shaub, resigned earlier this month. In his resignation letter, he wrote, “The great privilege and honor of my career has been to lead OGE’s staff and the community of ethics officials in the federal executive branch. They are committed to protecting the principle that public service is a public trust, requiring employees to place loyalty to the Constitution, the laws, and ethical principles above private gain.”

As its name suggests, the Office of Government Ethics is responsible for preventing conflicts of interest within the federal government. Shaub told the New York Times that there wasn’t much more he could accomplish in the office “given the current situation,” likely a response to the way President Trump has taken office with an unprecedented list of personal financial conflicts and a cabinet with extensive industry ties, despite campaigning on a promise to drain the proverbial DC swamp of corporate lobbyists.

In President Trump’s first six months, several individuals with strong industry ties were nominated and confirmed for key agency leadership positions. The President has issued legally questionable executive orders, one of which requires that agencies repeal two regulations for every one they issue (with the intent of freezing regulation and allowing industry to carry on business as usual); created deregulation teams at each federal agency, many staffed by people with deep industry ties; signed into law several bills nullifying Obama-era regulations that would have prevented industry misconduct; and delayed a handful of science-based rules.

In sum, the Trump administration is disproportionately considering the policy agenda of the private sector, while neglecting the real-life implications of allowing corporations to operate without proper checks.

The Trump administration’s decisions are impacting people’s lives

Despite a wealth of scientific evidence linking chlorpyrifos with neurotoxic impacts on children and adults, the pesticide is still used on corn, soybeans, fruit and nut trees, certain vegetables including Brussels sprouts and broccoli, and other crops. Photo: USDA/Flickr

The clear conflicts of interest of the President and his cabinet, combined with the lack of transparency that has thus far been characteristic of the administration, have created large vulnerabilities for science-based policy in this administration. While industry awaits its chance to profit from regulatory rollbacks both here and abroad, a regulatory freeze, and industry-friendly cabinet appointments, the rest of us are missing out on unrealized health and safety benefits.

As the Environmental Protection Agency (EPA) stalls in making a decision to ban the use of the pesticide chlorpyrifos, farmworkers in California have been getting sick from acute exposure to the chemical even when it is used according to its label instructions.

As the EPA delays implementation of the Risk Management Plan (RMP) rule, communities like those surrounding Houston, Texas will have to wait even longer for much-needed protections from chemical facilities.

Yudith Nieto from Manchester, Texas was among the members of the public who made a public comment at EPA’s hearing in June regarding the proposed delay of the rule. She told agency officials:

“We have over a thousand refineries in the Houston area, as you may know, and I think you’ve probably heard throughout the day, I am sure, of people that reported the work that they have been doing, the research work, the legal work that they are doing to support communities like the community of Manchester, all around the country. And we depend on these kinds of rules and regulations to try to protect our communities, our children, people who live right next to refineries, pipelines, tank farms, and other exposures. With everything that’s happening right now in the country, we fear more for our lives, we fear for our livelihoods, we fear for our health, because we see that a lot of these rules are being attacked.”

As the Department of Labor stalls in implementing stricter exposure standards for silica and beryllium, workers in metal foundries and on construction sites will continue to be exposed to levels of those substances that could result in silicosis or chronic beryllium disease.

Eddie Mallon worked in New York City tunnel construction for over forty years before being diagnosed with silicosis. When asked how the disease has changed his life at a 2014 hearing for the rule, he told OSHA staff:

“Well, yeah, I loved playing sports, but I can’t do that with the silicosis. It’s a problem. I like working around the house. It affects me. Playing with my grandchildren affects me. It has all different ways to affect you. I mean, you know, I’m 70. I may look a little younger than what I look, but my lungs, I’m sure, are a lot older than that. It’s a hidden disease, silicosis.”

He worries about the next generation without a stronger exposure standard:

“I strongly believe OSHA needs to implement strong silica standards. We need them and especially in my industry. I do believe that 50 years down the line, you will never see a 70-year-old man sitting up here talking about silica if things don’t change. These young kids today, what they’re going to face down the hole is nothing [like] what I did. For the first 30 years of working the tunnels, it was bad with the silica, but nowhere near what it’s like today.”

And as the Food and Drug Administration gives food manufacturers more time before enforcing revised nutrition facts labels with a line for added sugars, parents will remain in the dark on how much sugar is added to their children’s food. Alec Bourgeois, a DC dad, shares what the delays mean for his family:

“You can think that you are making reasonable, healthy decisions and realize later that you are absolutely eating junk and it’s incredibly frustrating…So much of the things that didn’t used to have sugar in them now have sugar in them. Like tomato sauce for example. There’s going to be some sugar in the tomato but they are also adding five or six tablespoons of sugar on top of that.”

While we’ve had some victories in stopping the Administration from making ill-advised policy decisions, we need to continue to monitor the horizon and fight back when science is attacked, misused, or ignored by our government.

Sign up here to join the vanguard of watchdogs poised to help us with this mighty task. To read more about how the Administration has sidelined science so that powerful interests can push forward their own agendas, check out our new report.

Economist to Team Trump: More Trade Won’t Avert a Farm Crisis

UCS Blog - The Equation (text only) -

Whether or not you think “Made In America Week” is a hypocritical joke, it seems like a good time to assess the Trump administration’s plan to sell more home-grown farm products abroad. Earlier this month, senior administration officials tucked into prime rib in Beijing to celebrate the re-opening of China to US beef after 14 years. Despite a rather incoherent overall trade strategy, when it comes to exporting corn and beef, the administration has been bullish (pun intended). Secretary of Agriculture Sonny Perdue has said we can “sell our way out” of a crisis with exports, and he recently reorganized the USDA to add an Undersecretary for Trade who will “wake up each and every day” thinking about how to unload more US farm products overseas. (The president just this week nominated Indiana Agriculture Director Ted McKinney to fill that post.)

On the face of it, new and expanded global markets might seem like a way out for US farmers suffering from low commodity prices and declining farm incomes. But will it work? I walked down the hall to ask an expert.

Kranti Mulik is our resident agricultural economist here at the Union of Concerned Scientists, and her bio includes stints researching global agricultural trade issues at Iowa State University, IHS Global Insight, North Dakota State University, and Kansas State University. Following is my recent Q&A session with Kranti to better understand how much we can expect US farmers’ prospects to improve through increasing exports of major farm commodities.

KPS: For starters, just how bad is the economic picture for US farmers right now anyway?

KM: Farmers are in a bind—commodity prices are low and we have an oversupply issue. Grain storage bins are full and farmers are trying to find a way to get rid of their supply. In addition, net farm incomes are expected to decline for the fourth consecutive year and farm debt is also expected to rise by over 5 percent. And the entire northern hemisphere is dealing with overproduction. As a result, global farm commodity prices are low and the latest forecasts indicate that they will remain low compared to previous highs. In addition, demand for agricultural commodities is also expected to slow down as countries like China slow their spending.

In response to all of this, our friends at the National Farmers Union—who represent 200,000 family farmers across the United States—are sounding the alarm about a full-blown farm crisis.

KPS: That’s bad. But we’ve heard that as China’s middle class grows, it will be hungry for more meat and poultry products from the United States. Is that a given?

KM: It’s true that China’s economy and its middle class are growing, and that more affluent consumers demand more meat. However, GDP growth in China has leveled off (as shown in this graph), and we won’t see growth rates there or in other developing countries like we saw 10 years ago. In addition, Chinese pork demand has peaked, and is now declining as diets get healthier. Pork sales hit a three-year low last year at 40.85 million tons (down from 42.49 million tons in 2014), and it is predicted that they will fall slightly this year as well. New Chinese government dietary guidelines also aim to reduce that country’s meat consumption by 50 percent. And as I mentioned before, China is also expected to slow its spending.

So is Chinese meat demand really expected to grow that much going forward? Maybe not.

Also, trade decisions really depend on a combination of factors, not just GDP growth but also exchange rates and consumer preferences for particular products. The US dollar is strong now (too strong, according to President Trump), which makes our exports more expensive abroad. Moreover, South American countries now have an advantage over the US in terms of lower production costs, and those countries are expanding their cattle inventories. So it’s harder for US farmers to compete with cheaper beef coming from Brazil and Uruguay, for example. And to some degree, beef is beef for Chinese consumers, and the government is likely to import the cheaper product.

Overall, I think it’s irresponsible to suggest to farmers that increasing exports is some kind of silver bullet.

KPS: Assuming for the moment that US exports of farm commodities could be increased significantly, would that necessarily benefit the average American farmer?

KM: Exports are already a big part of total US farm income, and increased trade isn’t inherently bad. But right now our exports are dominated by grains and oilseeds (think corn, soybeans, and cottonseed), along with meat produced from animals fed those products in industrial facilities. That’s what Secretary Perdue is suggesting we sell more of abroad. But that means doubling down on a large-scale vertically-integrated production system that is already failing most US farmers. This system primarily benefits big agribusiness companies like Cargill and Tyson. And global trade in these kinds of undifferentiated commodities rarely helps the little guy very much.

And today, beginning farmers operate 20 percent of US farms. These mostly younger farmers want to move away from the industrial model, and they won’t be helped much by Secretary Perdue’s trade strategy.

KPS: But isn’t a free market and more trade good for everyone?

KM: Not necessarily. The US export strategy has long been a matter of using world markets to sell commodities—like corn—that we’re producing too much of already. And a major reason our farmers are producing too much in the first place is that a handful of commodities have long been subsidized by federal farm policies. Farm subsidies between 1995 and 2014 totaled $322 billion, and most of this has supported five commodity crops—corn, soybeans, wheat, cotton, and rice—with corn alone receiving more than $94 billion in subsidies over that period.

So when we sell a lot of subsidized commodities, corn for example (or corn-fed beef or pork) globally, it drives down world food prices. And farmers in other countries—where they don’t get big subsidies—can’t compete. That’s really bad for farmers in developing countries. And when farmers in those countries suffer, the whole population suffers. We’re talking about some of the poorest people in the world here.

KPS: What other effects might we see from boosting production of US farm commodities for export?

KM: It would almost certainly worsen the environmental impact of US agriculture, which is already a big problem. My recent research has shown how our policies that subsidize a few farm commodities have contributed to a massive water pollution problem. By encouraging Midwestern farmers, for example, to maximize production of corn and soybeans, these policies have created a vast monoculture with lots of added nitrogen fertilizer and bare soil in between crops, leading to erosion and runoff that has harmful downstream impacts. The national price tag for nitrogen pollution from farms is already $157 billion a year—more than double the value of the entire 2011 US corn harvest. Expanding corn production, especially if it’s done in less-productive, more-erodible areas, will make this problem worse.

More industrial agriculture probably won’t be good for public health, either. A 2014 Harvard study found that ammonia emissions from livestock and commodity crops raised for export resulted in public healthcare costs of $36 billion and 5,100 premature deaths.

KPS: Based on your research, what are the most promising strategies—trade or otherwise—for helping American farmers and rural communities out of their current economic slump?

KM: As I said, there’s nothing inherently wrong with trade in farm products. But it’s interesting that while the United States has always been an agricultural exporter, our food and farm trade balance has been declining. We are now importing more than we’re exporting. Of course, farmers in the Midwest can’t meet local consumer demand for bananas and coffee, but there are plenty of crops they could grow.

For example, we now import oats from Canada and Sweden, and my most recent report shows we could grow more of those in the Midwest, and that would benefit the region’s farmers and its environment. Rather than continuing to focus on single-commodity exports, we should use public policies and incentives to encourage farmers to grow a wider variety of higher-value, differentiated products that would find ready markets both abroad and right here at home.

I mentioned beginning farmers earlier. They’re the future of agriculture, but they face various challenges—the primary one being access to land, which has become increasingly difficult. In addition, the major focus of government subsidies and research is still on commodity crops, making it challenging for farmers who want to grow so-called specialty crops (fruits and vegetables). With 30 percent of principal farm operators over the age of 65, we need more beginning and young farmers to enter the workforce, and we need to support these farmers, as they are critical to our rural economies.

I’ve studied the economic benefits that would come from farmers in states like Iowa growing fewer commodities and more fruits, vegetables, and other foods for local consumption. I found that if public policies did more to connect small and midsize farmers there with large-scale local food buyers such as supermarkets and hospitals, it would create jobs, revitalize rural communities, and improve access to healthy food all at once.

40% growth? The Latest Electric Vehicle Sales Numbers Look Good

UCS Blog - The Equation (text only) -

US electric vehicle (EV) sales are up 45% for the twelve-month period from July 2016 through June 2017, compared to the prior twelve-month period. What does that mean for the future?

As I’ve noted previously, the US EV market saw 32% annual growth over 2012-2016. This rate would, if continued, result in EVs being 10% of all new car sales in 2025.

For perspective on this target: according to UCS analysis, California’s Zero-Emission Vehicle (ZEV) program would result in about 8% of California’s vehicles being zero-emissions (mostly electric) by 2025. California leads the nation in EV market penetration by quite a bit. According to the International Council on Clean Transportation, nearly 4% of California’s light-duty vehicle sales in 2016 were EVs, compared to less than 1% for the country as a whole. And this was without major automakers Honda and Toyota offering a plug-in vehicle in that year. Sixteen cities in the state already see EVs exceeding 10% of vehicle sales.

California has achieved this through a mixture of policy, infrastructure, consumer awareness and interest (although the Northeast is not far behind on that count), and automaker efforts. Seen in that light, the entire country reaching 10% EV sales in 2025 would be pretty good.

But what if the market were actually hitting a “tipping point” such that this recent growth could continue? If a 40% growth rate could be sustained for the next six years, then we would see EVs reach 10% of US vehicle sales in 2023, and possibly near 20% by 2025. Cost reductions from technology improvements and economies of scale would help sustain these growth rates, as would expanded charging infrastructure.
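These projections are straightforward compound growth. A quick sketch (the ~1% starting share is an assumption based on the national figure cited above, so the outputs are illustrative, not the article's exact numbers):

```python
def project_share(start_share, annual_growth, years):
    """Project market share under a constant annual growth rate."""
    return start_share * (1 + annual_growth) ** years

# Assuming EVs were roughly 1% of US new-vehicle sales in 2016:
print(project_share(0.01, 0.32, 9))  # 32% growth, 2016-2025: about 0.12
print(project_share(0.01, 0.40, 7))  # 40% growth, 2016-2023: about 0.105
```

Under these assumptions, sustained 32% growth lands a bit above the 10%-by-2025 figure, and sustained 40% growth reaches roughly 10% by 2023, consistent with the article's framing.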

What are people buying?

The Tesla Model S was the top seller both in June and year-to-date. This is an all-electric vehicle with a range of 249-335 miles, depending on the configuration (the 60 kWh versions, with ranges of 210-218 miles, were recently discontinued).

Figure 1: Tesla Model S.

Plug-in hybrids are proving quite popular, as the #2 vehicle year-to-date is the Chevy Volt, and the #3 is the Prius Prime.

Figure 2: Chevy Volt.

The Volt, with a 53-mile all-electric range in the 2017 model, is a well-established mainstay by the standards of this young market. It has been a consistent top seller since its introduction in December 2010.

Figure 3: Toyota Prius Prime.

The Prius Prime is a new market entrant that was the May sales champion. It has a 25-mile electric-only range, so it could likely do most daily driving in all-electric mode if workplace charging were available (even a standard wall outlet would replenish the battery in 8 hours). Plug-in hybrids have a gasoline engine if needed for longer drives, but I’ve heard that drivers of these vehicles tend to keep their batteries topped off to do as much driving in electric mode as possible. If you don’t yet drive an EV, you might not realize the extent of the existing charging infrastructure, but it’s out there; Plugshare is a great resource.
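The 8-hour wall-outlet figure is easy to sanity-check with a back-of-the-envelope calculation (the pack size, outlet power, and charging efficiency below are my assumptions, not figures from the article):

```python
def charge_hours(battery_kwh, charger_kw, efficiency=0.85):
    """Rough full-charge time, allowing for charging losses."""
    return battery_kwh / (charger_kw * efficiency)

# Assumed: ~8.8 kWh pack, ~1.3 kW from a standard 120 V outlet
print(round(charge_hours(8.8, 1.3), 1))  # roughly 8 hours
```

With those assumptions the arithmetic lands right around the 8-hour figure cited for a standard outlet.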

Tesla’s Model X crossover SUV is the #4 vehicle year-to-date, while Chevy’s new all-electric Bolt, with its 238-mile range, rounds out the top 5 (the Nissan LEAF is just behind the Bolt). The top five models make up just over half the market, with a long list of other products also selling in the United States.

What’s missing?

Given the market strength of the newcomer Prius Prime, what other new vehicles might take a turn at the top of the sales charts in the months ahead?

Well, there are a number of other new models from Kia, Chrysler, Cadillac, Volkswagen, and others. Certainly, the Tesla Model 3, with its first vehicles shipped in July, looks to be a contender. There are over 400,000 reservations for the vehicles worldwide, so it could easily become the sales champion if Tesla can ramp up production quickly enough. But in years to come, we might see something very different.

There is one category notably lacking among US EVs sales: the pickup truck. The best-selling light-duty vehicle in the US has for 35 years been the Ford F-series, with 820,799 units sold in 2016 (this is more than double the sales of the top-selling car in 2016, the Toyota Camry).

Figure 4: Ford F-150.

Some companies perform aftermarket conversions to turn trucks into plug-in hybrids, and others have announced plans to build brand-new electric pickup trucks (such as Tesla, Via, Havelaar, and Workhorse). Trucks have a wide range of needs and duty cycles, and not all applications would be suited to electrification at present. There are definitely engineering challenges to resolve.

Still, a plug-in version of the F-150 could serve the needs of many owners, and could propel Ford to the top of the EV sales charts. This is not in Ford’s plans at the moment (although a basic hybrid F-150 is), but what if the company experiences positive results from its other electric and plug-in products? Might we see an electric F-150? Or would the Chevy Silverado or Dodge Ram (the #2 and #3 selling vehicles in 2016) have plug-in versions first?

The pickup truck market is too big to ignore. As battery technology continues to improve, it should become easier to make electrification work for at least part of this segment.

What’s next?

Typically, the second half of the year sees higher sales volume, with December being the biggest month. It should be particularly interesting to watch the growth of Tesla’s Model 3 production over the next six months. News items such as the new study from Bloomberg, Volkswagen’s investments in charging infrastructure, and other developments may heighten public interest in EVs generally.

The most effective means of raising consumer awareness of and interest in EVs are ride-and-drive events. If you haven’t tried one out yet, look for an event near you during Drive Electric Week!

The Trump Administration’s Record on Science Six Months after Inauguration

UCS Blog - The Equation (text only) -

To address unsolved questions, scientists develop experiments, collect data and then look for patterns. Our predictions of natural phenomena become more powerful over time as evidence builds within the scientific community that the same pattern appears over and over again. So, when the 2016 presidential candidates began speaking out about their positions on science policy, the scientific community was listening, collecting data, and looking for patterns.

In particular, candidate Donald Trump’s positions on space exploration, climate change science, and vaccines sent a chilling and frightening signal to the scientific community of what science policy might look like under a President Trump. We no longer have to wonder if candidate Trump’s positions on science policy would be indicative of President Trump’s positions, as we now have six months of data on the Trump administration’s science policy decisions.

Today, we release a report on President Trump’s six-month record on science. In this report, we present evidence of patterns the President is using to systematically diminish the role of science in government decision making and in people’s lives. In its first six months, the Trump administration has sidelined independent science advice, placed profits over public protections, and reduced public access to government science and scientists.


Sidelining independent science advice

In the first six months of the Trump administration, senior level officials have misrepresented or disregarded scientific evidence even when such evidence has been pertinent to policy decisions. For example, EPA Administrator Scott Pruitt refused to ban the use of the pesticide chlorpyrifos even though the science provides evidence that this chemical affects neurological development in children. The administration also has circumvented advice from scientific experts outside the agency by dismissing experts from agency science advisory boards. For example, in April, Attorney General Jeff Sessions ended support for the Department of Justice’s National Commission on Forensic Science. The administration also has clearly dismissed years of research showing that climate change is primarily caused by humans and is affecting public health now. Additionally, President Trump has left many scientific leadership positions in the federal government vacant. Where President Trump has appointed someone to a science leadership position, those individuals have largely come from the industries they are now in charge of regulating.

Placing profits over public protections

When science is disregarded in decisions where scientific evidence is vital, one can rightly question the basis of those decisions, and the resulting void can be filled by inappropriate influences. The Trump administration, aided and abetted by Congress, is displaying a clear pattern of disregarding science to benefit powerful interests at the expense of the public’s health and safety. To accomplish this, the Trump administration quickly turned to a rarely used congressional tool, the Congressional Review Act (CRA). The CRA allows Congress to overturn recently finalized regulations within a window of roughly 60 legislative days. Since its enactment in 1996, the tool had been used only once—the Trump administration has used it 14 times!

One of the regulations nullified, the stream protection rule, was intended to keep communities’ drinking water clean where mountaintop coal mining occurs. The Department of the Interior had put the rule in place based on scientific evidence linking mountaintop coal mining to higher rates of birth defects, cancer, and cardiovascular and respiratory diseases in nearby communities. As my colleague and co-author of the report Genna Reed revealed, two representatives who sponsored this CRA legislation, Bill Johnson of Ohio and Evan Jenkins of West Virginia, received over $1 million in political contributions from the mining industry and echoed talking points from the National Mining Association and Murray Energy Company in their statements supporting the rule’s repeal. The CEO of Murray Energy Company was also invited to watch President Trump sign the CRA resolution into law.

Countless other examples like this exist under this administration regarding the rollback of policies related to climate change, vehicle fuel economy standards, ozone pollution, and chemical safety, to name a few. In fact, the White House is boasting about rolling back many of these regulations. Apparently, removing protections that safeguard children from harmful neurological effects and shield disadvantaged communities from cancer is something this administration now applauds.

Reducing public access to government science and scientists

While there are valid reasons why the government keeps some information sensitive or classified, usually there is no such valid reason why science cannot be communicated openly. Yet, the Trump administration has been actively working to reduce public access to scientists and their work. For example, many government webpages have now been altered or removed, particularly those that focus on climate change. The Trump administration also has retracted questions from surveys intended to support disadvantaged communities.

Additionally, scientists in federal agencies have been restricted from communicating their work to anyone outside of the agency, and also have been barred from attending and presenting at scientific conferences. Yesterday, Joel Clement, former Director of the Office of Policy Analysis at the Interior Department, blew the whistle on the Trump administration for its attempts to silence his work helping endangered communities in Alaska prepare for climate change by reassigning him to a position in accounting. As Clement rightfully points out, removing a scientist from their area of expertise and placing them in a position where their experience is not relevant is “a colossal waste of taxpayer money.” The public has the right to access government science and to hear from the scientists that produce it.

The attacks on science keep rolling in

The examples that I’ve highlighted in this blog entry are merely a smattering of the attacks on science discussed in our report. All of these attacks are happening at the same time that the President has proposed deep cuts to scientific agencies and funding for basic research, sending a signal to scientists that their work is not valued. Senator Bill Nelson of Florida recently took to the floor to call for an end to the “blatant, coordinated effort by some elected officials to muzzle the scientific community.” It is becoming difficult to deny that a war on science exists when the evidence keeps piling up, and that evidence suggests the Trump administration intends to silence science and scientists wherever and whenever possible.

We cannot retreat from the progress that science-informed decision making makes possible: more children living healthy lives without asthma, lives spared by vaccinations, the protection of America’s endangered wildlife. Scientists and science supporters are already speaking up and taking to the streets to march, to advocate for the use of science in decision making. We can resist the Trump administration’s attacks on science—our democracy gives us the right to do so.

Environmental Injustice in the Early Days of the Trump Administration

UCS Blog - The Equation (text only) -

When the EPA was established in 1970 by Richard Nixon, there was no mandate to examine why toxic landfills were more often placed near low-income, Black, Latino, immigrant, and Native American communities than in more affluent, white neighborhoods. Nor was there much recognition that communities closer to toxic landfills, refineries, and industrial plants often experienced higher rates of toxics-related illnesses, like cancer and asthma.

Yet these phenomena were palpable to those living in affected communities. In the 1970s and 80s, local anti-toxics campaigns joined forces with seasoned activists from the civil rights movement, labor unions, and with public health professionals and scientists, drawing attention to the unevenly distributed impacts of toxic pollution, and forming what we now recognize as the environmental justice movement.

The new administration has mounted a swift and concerted attack on the federal capacity and duty to research, monitor, and regulate harmful pollutants that disproportionately affect women, children, low-income communities, and communities of color.  Two examples demonstrate the potential consequences: overturning the ban on chlorpyrifos, and a variety of actions that reduce collection of and public access to the data on which environmental justice claims depend.

Overturning the ban on chlorpyrifos

EPA Administrator Scott Pruitt overturned the chlorpyrifos ban, despite the fact that EPA scientists recommended that the pesticide be banned because of the risks it posed to children’s developing brains. Photo: Zeynel Cebeci/CC BY-SA 4.0 (Wikimedia Commons)

Chlorpyrifos is a commonly used pesticide. EPA scientists found a link between neurological disorders, memory decline and learning disabilities in children exposed to chlorpyrifos through diet, and recommended in 2015 that the pesticide be banned from agricultural use because of the risks it posed to children’s developing brains.

Over 62% of farmworkers in the U.S. work with vegetables, fruits and nuts, and other specialty crops on which chlorpyrifos is often used. These agricultural workers are predominantly immigrants from Mexico and Central America, living under the poverty line and in close proximity to the fields they tend. A series of studies in the 1990s and 2000s found that concentrations of chlorpyrifos were elevated in agricultural workers’ homes even more than ¼ mile from farmland, and chlorpyrifos residues were detected on the work boots and hands of many agricultural worker families but not on those of nearby non-agricultural families.

In March 2017, EPA Administrator Scott Pruitt publicly rejected the scientific findings from his agency’s own scientists and overturned the chlorpyrifos ban, demonstrating the Trump administration’s disregard for the wellbeing of immigrant and minority populations. Farmworker families could be impacted for generations through exposure to these and other harmful pesticides.

Limiting collection of and access to environmental data

Because inequitable risk to systematically disadvantaged communities must be empirically proven, publicly available data on toxic emissions and health issues are crucial to environmental justice work. The Trump administration has already taken a number of actions that limit the collection and accessibility of data necessary to make arguments about environmental injustices that persist through time in particular communities.

Houston has a number of chemical plants in close proximity to low-income neighborhoods. Photo: Roy Luck/CC BY 2.0 (Flickr)

Workers, especially those laboring in facilities that refine, store or manufacture with toxic chemicals, bear inequitable risk. The Trump administration has sought to curb requirements and publicity about workplace risks, injuries and deaths. For example, President Trump signed off on a congressional repeal of the Fair Pay and Safe Workplaces rule, which required applicants for governmental contracts to disclose violations of labor laws, including those protecting safety and health. Without the data provided by this rule, federal funds can now support companies with the worst worker rights and protection records. President Trump also approved the congressional repeal of a rule formalizing the Occupational Safety and Health Administration’s (OSHA) long-standing practice of requiring businesses to keep a minimum of five years of records on occupational injuries and accidents.  While five years of record-keeping had illuminated persistent patterns of danger and pointed to more effective solutions, now only six months of records are required. This change makes it nearly impossible for OSHA to effectively identify ongoing workplace conditions that are unsafe or even life-threatening.

Another example is the administration’s proposed elimination of the Integrated Risk Information System, or IRIS, a program that provides toxicological assessments of environmental contaminants. The IRIS database provides important information for communities located near plants and industrial sites that produce toxic waste, both to promote awareness of the issues and safety procedures and as a basis for advocacy. These communities, such as Hinkley, CA, where Erin Brockovich investigated Pacific Gas and Electric Company’s dumping of hexavalent chromium into the local water supply, are disproportionately low income.

Responding to Trump: Developing environmental data justice

Data is not inherently good.  It can be used to produce ignorance and doubt, as in the tactics employed by the tobacco industry and climate change deniers.  It can also be used to oppressive ends, as in the administration’s collection of information on voter fraud, a phenomenon that is widely dismissed as non-existent by experts across the political spectrum.  Further, even the data collection infrastructure in place under the Obama administration failed to address many environmental injustices, such as the lead pollution in Flint, MI.  Thus we would argue that promoting environmental data justice is not simply about better protecting existing data, but also about rethinking the questions we ask, the data we collect, and who gathers it in order to be sure environmental regulation protects all of us.


Britt Paris is an EDGI researcher focused on environmental data justice. She is also a doctoral student in the Department of Information Studies at UCLA, and has published work on internet infrastructure projects, search applications, digital labor and police officer involved homicide data evaluated through the theoretical lenses of critical informatics, critical data studies, philosophy of technology and information ethics.

Rebecca Lave is a co-founder of EDGI (the Environmental Data and Governance Initiative), an international network of academics and environmental professionals that advocates for evidence-based environmental policy and robust, democratic scientific data governance. She was the initial coordinator of EDGI’s website tracking work, and now leads their publication initiatives. Rebecca is also a professor in the Geography Department at Indiana University.

 Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.


The National Flood Insurance Program is up for Reauthorization: Here’s What Congress Should Do

UCS Blog - The Equation (text only) -

The National Flood Insurance Program is up for reauthorization by the end of September and the clock is ticking for legislation to extend the program. With so many homeowners and small businesses depending on this vital program, will Congress take the necessary steps to reform and strengthen the program—especially in light of the growing risks of coastal and inland flooding?

Here’s a quick rundown of the latest bills and what they might mean for the future of the program.

The NFIP is in urgent need of reform

With about 5 million policyholders nationwide (see map), the NFIP is vital for homeowners—especially for those who live in flood-prone areas inland or along the coasts. You or someone you know is likely one of these homeowners; perhaps you’ve spent some time this summer in a beachfront community that participates in the NFIP. Reauthorizing the program is critical to protect such homeowners.

Source: FEMA

But the NFIP is also urgently in need of reforms to help put the program on a healthy financial footing and ensure that it encourages climate-smart choices. The program has been on the GAO’s High Risk list since 2006 and is over $24 billion in debt.

For a quick refresher on the NFIP, see my earlier blogpost highlighting five ways to improve the program to promote climate resilience:

  • Update flood risk maps using the latest technology and the latest science
  • Phase in risk-based pricing and broaden the insurance base
  • Address affordability considerations for low and moderate income households
  • Provide more resources for homeowners and communities to reduce their flood risks by making investments ahead of time
  • Ensure that a well-regulated private sector flood insurance market complements the NFIP without undermining it

Action in the House

The House Financial Services Committee passed a package of 7 bills last month to reauthorize the NFIP and make changes to the program. These included: H.R. 2874, the 21st Century Flood Reform Act; H.R. 2868, the National Flood Insurance Program Policyholder Protection Act of 2017; H.R. 2875, the National Flood Insurance Program Administrative Reform Act of 2017; H.R. 1558, the Repeatedly Flooded Communities Preparation Act; H.R. 1422, the Flood Insurance Market Parity and Modernization Act; H.R. 2246, the Taxpayer Exposure Mitigation Act of 2017; and H.R. 2565, a bill to require the use of replacement cost value in determining the premium rates for flood insurance coverage under the National Flood Insurance Act, and for other purposes.

However, last week 26 House Republicans indicated to Speaker Ryan that they could not support the package in its current form. This means that Chairman Hensarling will have to work on further changes before a bill is brought to the Floor for a vote.

A major priority for Chairman Hensarling is ensuring that taxpayer interests are protected as the NFIP is extended. In his opening remarks last month he said:

“There are so many important voices in our debate today on the reauthorization of the National Flood Insurance Program… But as far as I’m concerned, perhaps the single-most important voice is the voice that remains underrepresented in the debate and that is the voice of the American taxpayer.”

Meanwhile Rep. Maxine Waters, the ranking Democrat on the committee, has expressed concern that the package of bills might actually “make matters worse by restricting coverage, increasing costs, and opening the door to cherry-picking by the private sector.”

Action in the Senate

In the Senate, there are several bipartisan bills that show that agreement is possible in several key areas, although there remain some important differences between the bills on how best to balance competing priorities. Work remains to help reconcile these bills.

Senators Cassidy (R-LA) and Gillibrand (D-NY) introduced S.1313 – Flood Insurance Affordability and Sustainability Act of 2017 last month. Senators Capito (R-WV) and Kennedy (R-LA) have also joined as co-sponsors of this bill. The bill seeks to extend the program for 10 years, increase funding for pre-disaster and flood mitigation assistance programs, take steps to address affordability concerns, preserve funding for flood mapping, and enhance the role of the private sector in the flood insurance market.

Separately, eight senators—Bob Menendez (D-NJ), John Kennedy (R-LA), Chris Van Hollen (D-MD), Marco Rubio (R-FL), Elizabeth Warren (D-MA), Thad Cochran (R-MS), Cory Booker (D-NJ), and Bill Nelson (D-FL)—have cosponsored the Sustainable, Affordable, Fair and Efficient National Flood Insurance Program Reauthorization Act (SAFE NFIP) of 2017. This bill extends the program for six years, includes means-tested affordability provisions, enhances funding for flood mitigation assistance and the pre-disaster hazard mitigation grant program, authorizes $800 million per year for six years for LiDAR mapping, eliminates interest payments on the NFIP’s debt, and caps commissions on Write-Your-Own policies.

Last month Senators Scott (R-SC) and Schatz (D-HI) introduced S.1445 – Repeatedly Flooded Communities Preparation Act, which directs communities with properties that have repeatedly flooded to develop, submit, and implement a community-specific plan for mitigating continuing flood risks. This is similar to a companion House bill mentioned above, H.R. 1558, which was introduced by Rep. Royce (R-CA).

Earlier this week, Senators Crapo and Brown introduced the National Flood Insurance Program Reauthorization Act of 2017. The bill extends the NFIP for six years, directs communities with significant numbers of repetitive loss properties to develop mitigation plans, provides funding for pre-disaster mitigation, preserves funding for updated flood mapping and has provisions to encourage flood risk disclosure.

(Interestingly, the bill also has an extensive section on funding for wildfires, long a priority for Senator Crapo.  It adds “wildfires on federal lands” in the definition of “major disasters” under the Stafford Act, which would allow funding from the Disaster Relief Fund to be made available to the Department of Interior or the USDA for fire suppression operations.)

The Nation needs a robust NFIP

Flood risks are growing along our coasts because of sea level rise and inland because of an increase in heavy rainfall. At the same time, growing development in floodplains is putting more people and property in harm’s way. Importantly, climate change will increase flood risks in many parts of the country, regardless of whether homeowners purchase flood insurance through the private market or the taxpayer-backed NFIP.

Last week UCS released a study showing that many coastal communities experience chronic inundation already, with hundreds more at risk by mid-century.

A recent study from Zillow highlights the long term risks our nation faces from sea level rise. It finds that:

Nationwide, almost 1.9 million homes (or roughly 2 percent of all US homes)—worth a combined $882 billion—are at risk of being underwater by 2100. And in some states, the fraction of properties at risk of being underwater is alarmingly high. More than 1 in 8 properties in Florida are in an area expected to be underwater if sea levels rise by six feet, representing more than $400 billion in current housing value.

Flood risk maps in many parts of the country are outdated, inadequate or non-existent, and even the latest maps do not include sea level rise projections. This means that communities and local planners often do not have a clear understanding of their true flood risks, now and into the future. Updating these maps using the latest technology is costly and Congress will need to authorize adequate funding for this. FEMA’s Risk Mapping, Assessment and Planning (Risk MAP) program is an important cornerstone of these efforts.

Investing in flood mitigation measures ahead of time is a smart way to keep risks and costs down. That’s why Congress must beef up funding for FEMA’s Flood Mitigation Assistance and Pre Disaster Mitigation Grant Programs, alongside extending the NFIP. Prioritizing investments in nature-based protective measures, such as preserving wetlands, is also a very important way to help safeguard communities. In flood-prone areas with properties that get repeatedly flooded, expanding funding for voluntary home buyouts is vital so that homeowners have real options to move to safer ground. Coordinated federal, state and local actions are the best way to reduce flood risks to communities.

Finally, NFIP reforms must include affordability provisions to help low and moderate income homeowners. This could include means-tested vouchers or rebates on premiums and low interest loans or grants for flood mitigation measures and other provisions outlined in recent NAS reports.

Know your flood risk

If you’re a homeowner, it’s smart to know your flood risk and how it might change over time. Here are some resources to get you started:

Time for Congress to Act

There’s no shortage of bills in Congress to reauthorize and reform the NFIP. Now Congress needs to work toward reaching bipartisan agreement on robust legislation by September 30. There’s no excuse for delay or inaction—homeowners around the country are counting on a strong, fair and effective flood insurance program to keep them safe.

The Wall Street Journal Gets it Wrong on EPA Scientific Integrity…Again

UCS Blog - The Equation (text only) -

The Wall Street Journal ran an opinion piece yesterday titled “A Step Toward Scientific Integrity at the EPA” written by long-time critic of the EPA and purveyor of anti-science nonsense, Steven Milloy. His piece commends Administrator Pruitt on his recent dismissals of EPA advisory committee members, and questions the independence of advisory committees, like the EPA’s Science Advisory Board (SAB) and Clean Air Scientific Advisory Committee (CASAC), claiming that they contain biased government grantees and have made recommendations on ozone and particulate matter that aren’t supported by science. His arguments are twisted and unfounded, but are not surprising based upon his history working for industry front groups that attempt to spread disinformation to promote a science agenda benefitting powerful interests.

I want to set the record straight on the independence of EPA’s scientific advisory committees. Here’s what Steven Milloy gets very wrong:

  1. The EPA’s advisory committees have not been stacked with “activists.” In fact, industry representation is on par with representation from non-profit organizations.

I agree with Milloy on just one point: federal advisory committees must be balanced and unbiased. The Federal Advisory Committee Act mandates that all federal advisory committees are “fairly balanced in terms of the points of view represented and the functions to be performed.” This is an important piece of the act to ensure that the recommendations flowing from these advisory committees reflect a diversity of viewpoints and a range of expertise. There are also required conflict of interest disclosures made by each and every advisory committee member to ensure that any conflicts will not interfere with their ability to provide independent advice. Milloy claims that “only rarely do members have backgrounds in industry,” which is simply not true. An analysis of the EPA’s Science Advisory Board membership since 1996 reveals that 64 percent of the 459 members were affiliated with an academic institution, 9 percent with industry, 9 percent with non-governmental organizations (including industry-funded organizations like the Chemical Industry Institute of Toxicology), 8 percent with government, and 7 percent with consulting firms.  I found a similar breakdown in an analysis of the EPA’s Board of Scientific Counselors and for all seven of EPA’s scientific advisory committees.

In conversations I’ve had with former members of the EPA’s Board of Scientific Counselors, it has been clear that industry scientists have always had a voice on these committees, which makes it especially suspect that the current administration has declined to renew the terms of many advisory committee members in hopes of giving industry even greater representation.

  2. Government grants are a major source of funding for academic scientists and these funds are contributing to research projects, not used for private gain.

Milloy’s claim that academic scientists who have received grant money from the EPA make biased recommendations to the agency is completely unfounded. Receiving EPA funding for unrelated research is fundamentally different from serving on a committee that makes policy recommendations. The EPA awards grants so that academic scientists can learn more about scientific topics without a policy agenda, and grantees are free to conduct the science and report whatever results they find. There is no predetermined or desired outcome, and the grants process is entirely separate from EPA policy decisions. Committee members have no incentive to reach a particular policy answer in order to win grant money on an entirely separate scientific research question from a separate office of the agency. To conflate the two misunderstands how science and policy work. Yet to Milloy, a government grant to work on science in the public interest is as biasing as, if not more biasing than, corporate funds meant to promote a product or otherwise support a private interest. For the work of advisory committee members, ensuring that federal science best supports public protections is key.

Congress’ attempt to correct this supposed problem, which is championed by Milloy in his piece, the EPA Science Advisory Board (SAB) Reform Act, includes a provision that board members may not have current contracts with the EPA or for three years after service which would only deter academic scientists from pursuing SAB positions. This Act is supported by the likes of Milloy because it would likely provide more opportunities for industry interests, not in need of government funding, to join the SAB.

  3. The advisory committee selection process is and should be based on expertise and experience related to the charge of the committee, not how many times an individual is nominated.

In his piece, Milloy calls the EPA’s advisory committee selection process “opaque” because a certain nominee wasn’t selected after having the most duplicate nominations. But the EPA’s process for selecting SAB and CASAC members is actually one of the most open and transparent across agencies and advisory committees. Members of the public have the opportunity to submit nominations, view nominees, and comment on the EPA’s roster of appointees before final selections are made. Ultimately, it’s up to the EPA administrator to decide the strongest and most balanced roster of committee members based on the needs of the agency. It’s not meant to be a process whereby any entity can win a nominee based on the number of comments received. Despite receiving 60 of the 83 nomination entries, Michael Honeycutt was likely not chosen as a CASAC member because he has questionable scientific opinions and documented conflicts of interest, which are completely reasonable justifications.

  4. Particulate matter from power plants and vehicle emissions does indeed have demonstrated health impacts, supported by the scientific literature.

Milloy’s article asserts that claims of particulate matter’s negative health impacts are not scientifically justified. This is demonstrably false. Not only is there a wealth of peer-reviewed literature backing up the claim, there is an entire field devoted to studying it. Milloy claims that there was no evidence of these impacts in 1996, but that’s because scientists weren’t collecting that data back then. While Milloy lives in the past, two decades’ worth of research (over 2,000 studies) since 1996 has linked fine particulate matter (PM2.5) to strokes, heart disease, respiratory ailments, and premature death.

Wall Street Journal’s second strike on EPA integrity

The Wall Street Journal’s readers deserve better than to read this junk-science drivel without full disclosure about the peddler of the disinformation. In 1993, Philip Morris funded Milloy to lead an industry front group called the Advancement of Sound Science Coalition that cast doubt on the scientific evidence linking secondhand smoke to disease. In 1998, Milloy found a new benefactor in ExxonMobil, serving on a task force that mapped out ExxonMobil’s strategy to deceive the public about climate science, and receiving funding for many years to sow doubt under the guise of a slightly renamed front group, the Advancement of Sound Science Center, run out of his Maryland home. Milloy’s current employer, the Energy and Environment Legal Institute, formerly the American Tradition Institute, is funded by the fossil fuel industry and has repeatedly filed inappropriate open records requests for the communications of climate scientists working at public universities. His most recent 2016 book is endorsed by none other than long-time junk science purveyor and climate change denier, Senator James M. Inhofe.

This is just a reminder that as we try to make sense of our government’s operations and the state of science for issues that affect our health and our planet’s health, we must consider the sources of our information very carefully. Facts matter, and here at UCS we’ll continue to draw attention to the silencing, sidelining, or distortion of scientific facts.



What is the Cost of One Meter of Sea Level Rise?

UCS Blog - The Equation (text only) -

The opening line of our recent Scientific Reports article reads “Global climate change drives sea level rise, increasing the frequency of coastal flooding.” Some may read this as plain fact. Others may not.

Undeniable and accelerating

A century of data from tide gauges, and more recently from satellites, has demonstrated an unequivocal rise in global sea level (~8-10 inches over that period). Although regional sea level varies on a multitude of time scales due to oceanographic processes like El Niño and vertical land motion (e.g., land subsidence or uplift), the overall trend of rising sea levels is both undeniable and accelerating. Nevertheless, variability breeds doubt. Saying that global warming is a hoax because it’s cold outside is like saying sea level rise doesn’t exist because it’s low tide.

Global sea level is currently rising at about 3.4 mm/year, making it a relatively slow process. For instance, tides typically change sea level by 0.5-1.0 m every 12 hours, a rate that is ~100,000 times faster than global mean sea level rise.
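That ~100,000× figure is easy to sanity-check with back-of-envelope arithmetic. A quick sketch using the rounded numbers quoted above (a 0.75 m tidal change as the midpoint of the 0.5-1.0 m range, and a 3.4 mm/year rise):

```python
# Back-of-envelope comparison of tidal change vs. mean sea level rise,
# using the rounded figures quoted in the text (not measured data).
slr_mm_per_year = 3.4        # global mean sea level rise
tide_range_m = 0.75          # typical tidal change (midpoint of 0.5-1.0 m)
tide_hours = 12.0            # roughly one tidal cycle

tide_mm_per_hour = tide_range_m * 1000 / tide_hours
slr_mm_per_hour = slr_mm_per_year / (365.25 * 24)

ratio = tide_mm_per_hour / slr_mm_per_hour
print(f"Tides move sea level ~{ratio:,.0f}x faster than mean sea level rise")
```

The ratio comes out on the order of 10^5, matching the ~100,000× comparison in the text.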

It’s almost as if sea-level rise were slow enough for us to do something about it…

The civil engineering challenge of the 21st century

At the end of a recent news article in New Scientist, Anders Levermann, a climate scientist at the Potsdam Institute for Climate Impact Research, said “No one has to be afraid of sea level rise, if you’re not stupid. It’s low enough that we can respond. It’s nothing to be surprised about, unless you have an administration that says it’s not happening. Then you have to be afraid, because it’s a serious danger.”

Levermann’s quote captures the challenge of sounding the alarm on the dangers of climate change. We know that sea level rise is a problem; we know what’s causing it (increased concentrations of heat-trapping gases like CO2, leading to the thermal expansion of sea water and the melting of land-based ice); we know how to solve the problem (reduce carbon emissions and cap global temperatures); yet, in spite of the warnings, the current administration recently chose to back out of a global initiative to address the problem.

Arguing that the Paris agreement is “unfair” to the American economy to the exclusive benefit of other countries is extremely shortsighted. This perspective serves to kick the climate-change can down the road for the next generation to pick up. This perspective, if it dominates US decision making moving forward, sets us up for the worst-case scenarios of sea-level rise (more than two meters by 2100). Worse yet, it may take us beyond the time horizon in which a straightforward solution can be found, leaving geoengineering as our last and only resort.

If the Paris agreement is unfair to the American economy, imagine how unfair 2.0+ m of sea-level rise would be. We should seriously question the administration’s focus on improving national infrastructure without considering arguably the greatest threat to it. Sea-level rise will be one of the greatest civil engineering challenges of the 21st century, if not THE greatest.


An astronomically high dollar figure

As a thought experiment, try to quantify the economic value of one meter of sea level rise. Low-lying coastal regions support 30% of the global population and, most likely, a comparable percentage of the global economy. Even if each meter of sea level rise only affected a small percentage of this wealth and economic productivity, it would still represent an astronomically high dollar figure.
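One crude way to put numbers on this thought experiment is below. All inputs here are rough, assumed figures chosen purely for illustration (the global GDP value and the 5% disruption fraction are not from the article; only the 30% coastal share is quoted above):

```python
# Back-of-envelope estimate: annual economic activity exposed to ~1 m of rise.
# All inputs are rough assumptions for illustration, not measured results.
global_gdp_trillion = 80.0   # approximate global GDP in USD trillions (assumed)
coastal_share = 0.30         # share tied to low-lying coastal regions (from the text)
affected_fraction = 0.05     # suppose 1 m of rise disrupts just 5% of that activity

exposure_trillion = global_gdp_trillion * coastal_share * affected_fraction
print(f"~${exposure_trillion:.1f} trillion of annual economic activity exposed")
```

Even with a disruption fraction as small as 5%, the exposure lands in the trillions of dollars per year: an astronomically high figure indeed.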

Although managed retreat from the coastline is considered a viable option for climate change adaptation, I don’t see a realistic option where we relocate major coastal cities such as New York City, Boston, New Orleans, Miami, Seattle, San Francisco, or Los Angeles.

What will convince the powers-that-be that unabated sea level rise is an unacceptable outcome of climate change? Historically, the answer to this question is disasters of epic proportions.

Hurricane Sandy precipitated large-scale adaptation planning efforts in New York City. Nuisance flooding in Miami has led to a number of ongoing infrastructure improvements. The Dutch coast is being engineered to withstand once-in-10,000-year storms. Fortunately, most nations and US states, particularly coastal states like Hawaii and California, will abide by the Paris agreement.

This administration doesn’t seem to care about the science of climate change, but it does seem to care about economic winners and losers. Would quantifying the impacts of climate change in terms of American jobs and taxpayer dollars convince the administration to change its view of the Paris agreement?

Impossible to ignore

In the executive summary of the 2014 Risky Business report, Michael Bloomberg writes, “With the oceans rising and the climate changing, the Risky Business report details the costs of inaction in ways that are easy to understand in dollars and cents—and impossible to ignore.” This report finds that the clearest and most economically significant risks of climate change include:

  • Climate-driven changes in agricultural production and energy demand
  • The impact of higher temperatures on labor productivity and public health
  • Damage to coastal property and infrastructure from rising sea levels and increased storm surge

For example, the report finds that in the US by 2050 more than $106 billion worth of existing coastal property could be below sea level. Furthermore, a study in Nature Climate Change found that future flood losses in major coastal cities around the world may exceed $1 trillion per year as a consequence of sea level rise by 2050.

The science and economics of climate change are clear.

So why do politicians keep telling us that it’s not happening and that doing something about it would be bad for the economy?

New Interactive Map Highlights Effects of Sea Level Rise, Shows Areas of Chronic Flooding by Community

UCS Blog - The Equation (text only) -

Last week, the Union of Concerned Scientists released a report showing sea level rise could bring disruptive levels of flooding to nearly 670 coastal communities in the United States by the end of the century. Along with the report, UCS published an interactive map tool that lets you explore when and where chronic flooding–defined as 26 floods per year or more–will force communities to make hard choices. It also highlights the importance of acting quickly to curtail our carbon emissions and using the coming years wisely.

Here are a few ways to use this tool:
  1. Explore the expansion of chronically inundated areas

    Sea level rise will expand the zone that floods 26 times per year or more. Within the “Chronic Inundation Area” tab, you can see how that zone expands over time for any coastal area in the lower 48 and for two different sea level rise scenarios (moderate and fast).

    Explore the spread of chronically inundated areas nationwide as sea level rises.


  2. Explore which communities join the ranks of the chronically inundated

    We define a chronically inundated community as one where 10% or more of the usable land is flooding 26 times per year or more. With a fast sea level rise scenario, about half of all oceanfront communities in the lower 48 would qualify as chronically inundated. Check out the “Communities at Risk” tab to see if your community is one of them.

    Explore communities where chronic flooding encompasses 10% or more of usable land area.


  3. Visualize the power of our emissions choices

    Drastically reducing global carbon emissions with the aim of capping future warming to less than 2 degrees Celsius above pre-industrial levels–the primary goal of the Paris Agreement–could prevent chronic inundation in hundreds of U.S. communities. Explore the “Our Climate Choices” tab to see the communities that would benefit from swift emissions reductions.

    Explore how slowing the pace of sea level rise could prevent chronic inundation in hundreds of US communities.


  4. Learn how to use this time wisely

    Our country must use the limited window of time before chronic inundation sets in for hundreds of communities, and plan and prepare with a science-based approach that prioritizes equitable outcomes. Explore our “Preparing for Impacts” tab and consider the federal and state-level policies and resources that can help communities understand their risks, assess their choices, and implement adaptation plans. This tab captures how we can use the diminishing response time wisely.

    Explore federal and state-level resources for communities coping with sea level rise.

Improving the map based on data and feedback

We hope that communities are able to use this tool to better understand the risks they face as sea level rises. We welcome your feedback and will be periodically updating the map as new data and new information comes to light.

Climate Change Just Got a Bipartisan Vote in the House of Representatives

UCS Blog - The Equation (text only) -

On rare occasions, transformative political change emerges with a dramatic flourish, sometimes through elections (Reagan in 1980, Obama in 2008), key mass mobilizations (the March on Washington in 1963), or even court cases (the Massachusetts Supreme Judicial Court decision declaring the exclusion of same-sex couples from marriage unconstitutional).

But most of the time, transformations happen slowly, step by arduous step, along a path that may be hard to follow and can only be discerned clearly in hindsight.

I believe that we are on such a path when it comes to Republican members of Congress acknowledging climate science and ultimately the need to act. I see some encouraging indications that rank-and-file Republican members of Congress are heading in the right direction.

In February 2016, Democratic Congressman Ted Deutch and Republican Congressman Carlos Curbelo launched the Climate Solutions Caucus, whose mission is “to educate members on economically-viable options to reduce climate risk and to explore bipartisan policy options that address the impacts, causes, and challenges of our changing climate.” Its ranks have now swelled to 48 members, 24 Republicans and 24 Democrats.

Last week, this group flexed its muscle. At issue was UCS-backed language in the National Defense Authorization Act (NDAA). The provision, authored by Democratic Congressman Jim Langevin, would require the Pentagon to report on the vulnerabilities of military installations and combatant commander requirements resulting from climate change over the next 20 years. The provision also states as follows:

Climate change is a direct threat to the national security of the United States and is impacting stability in areas of the world where the United States armed forces are operating today, and where strategic implications for future conflicts exist.

Republican leadership led an aggressive effort to strip the language from the NDAA on the House floor through an amendment offered by Representative Perry (R-PA). But in the end, 46 Republican members (including all but one of the Climate Solutions Caucus’s Republican members) voted against it, and fortunately the amendment was not adopted.

We are hopeful this important provision will be included in the final NDAA bill that passes the Senate and then goes to President Trump for his signature. He probably won’t like this language, but it seems doubtful that he would veto a military spending bill.


One shouldn’t read too much into this. The provision is largely symbolic, and the only thing it requires is that the Defense Department conduct a study on climate change and national security. There is a long way to go from a vote such as this one to the enactment of actual policies to cut the greenhouse gas emissions that are the primary cause of climate change.

But, it is an important stepping stone. If this bill becomes law, a bipartisan congressional finding that climate change endangers national security becomes the law of the land. Among other things, this should offer a strong rebuttal to those who sow doubt about climate science.

It is also a validation of a strategy that UCS has employed for many years—to highlight the impacts of climate change in fresh new ways that resonate with conservative values. This was the thinking behind our National Landmarks at Risk report, which shows how iconic American landmarks are threatened by climate change.

The same strategy was behind our recent report highlighting the vulnerability of coastal military bases to sea level rise, which Congressman Langevin cited and relied on in his advocacy for the provision.

UCS will work to make sure that this language is included in the final bill, and we will continue to find other ways to cultivate bipartisan support for addressing climate change. There will be many more difficult votes ahead than this one. But for now, I want to thank the Republican members of Congress for this important vote, and make sure our members and supporters know that our efforts, and those of so many others, to work with Republicans and Democrats and to bring the best science to their attention are paying off.

Build the Wall and Blame the Poor: Checking Rep. King’s Statements on Food Stamps

UCS Blog - The Equation (text only) -

If you read “Steve King” and think of novelist Stephen King, don’t worry too much about it.

Iowa Representative Steve King dabbled in fear and fiction himself in an interview with CNN last Wednesday, suggesting that a US-Mexico border wall be funded with dollars from Planned Parenthood and the food stamp program.

Photo: CC BY SA/Gage Skidmore

This particular idea was new, but the sentiments King expressed about the Supplemental Nutrition Assistance Program (SNAP) and the people who use it were less so. With 2018 farm bill talks underway, misconceptions about the program and who it serves have surfaced with increasing frequency, and setting the record straight about these misconceptions is more important than ever. Policymakers like King, who is a member of both the House Committee on Agriculture and its Nutrition Subcommittee, hold the fate of 21 million SNAP households in their hands, and it’s critical that they’re relying on complete and correct information to make decisions about this program.

Here’s a quick deconstruction of what was said—and what needs to be said—about Americans who use food stamps.

“And the rest of [the funding beyond Planned Parenthood cuts] could come out of food stamps and the entitlements that are being spread out for people that haven’t worked in three generations.”

The idea that food stamp users are “freeloaders” is perhaps one of the most common and least accurate. The truth is, most SNAP participants who can work, do work. USDA data shows that about two-thirds of SNAP participants are children, elderly, or disabled; 22 percent work full time, are caretakers, or participate in a training program; and only 14 percent are working less than 30 hours per week, are unemployed, or are registered for work. Moreover, among households with adults who are able to work, over three-quarters of adults held a job in the year before or after receiving SNAP—meaning the program is effectively helping families fill temporary gaps in employment. King’s constituents are no exception: in his own congressional district, over half of all households receiving SNAP included a person who worked within the past 12 months, and over a third included two or more people who worked within the past 12 months.

“I would just say let’s limit it to that — anybody who wants to have food stamps, it’s up to the school lunch program, that’s fine.”

The national school lunch program provides one meal per day to eligible children. Kids who receive free or reduced price lunch are also eligible to receive breakfast at school through the program, but only about half do. Even fewer kids receive free meals in the summer: less than ten percent of kids who receive free or reduced price lunch at school get free lunches when they’re out of school. This means that, for millions of families, SNAP benefits are critical to filling in the gaps so kids can eat. In fact, USDA data shows that more than 4 in 10 SNAP users are kids. Again, these patterns hold true in King’s district: over half the households that rely on SNAP benefits include children.

“We have seen this go from 19 million people on, now, the SNAP program, up to 47 million people on the SNAP program.”

True. In 1987, average participation in SNAP was around 19 million. In 2013, it peaked at 47 million, and dropped to around 44 million by 2016. The increase over this time period is attributable, at least in part, to changes in program enrollment and benefit rules between 2007 and 2011 and greater participation among eligible populations. However, participation data also demonstrates SNAP’s effective response to economic recession and growth. For example, there was an increase in 2008 as the recession caused more families to fall below the poverty line, and in 2014, for the first time since 2007, participation and total costs began to steadily decrease in the wake of economic recovery. Congressional Budget Office estimates predict that by 2027, the percentage of the population receiving SNAP will return close to the levels seen in 2007.

“We built the program because to solve the problem of malnutrition in America, and now we have a problem of obesity.”

It is undeniable that rising rates of obesity are a significant public health threat. But obesity is an incredibly complex phenomenon, the pathophysiology of which involves myriad social, cultural and biological factors. It is a different type of malnutrition, and we will not solve it simply by taking food away from those who can’t afford it. If we want to focus on increasing the nutritional quality of foods eaten by SNAP recipients, we can look to programs that have been successful in shifting dietary patterns to promote greater fruit and vegetable intake, using strategies such as behavioral economics or incentive programs. Truth be told, most of us—SNAP users or not—would benefit from consuming more nutrient-dense foods like fruits and vegetables.

“I’m sure that all of them didn’t need it.”

Without a doubt. And this could be said of nearly any federal assistance program. But the goal of the federal safety net is not to tailor programs to the specific needs of each person or family—this would be nearly impossible, and the more precise a system gets, the more regulation is required and the greater the administrative burden and financial strain becomes. The goal of federal assistance programs like SNAP is to do the most good for the greatest amount of people, within a system that most effectively allocates a limited amount of resources. And I’d venture to say that a program designed to lift millions of Americans out of poverty—with one of the lowest fraud rates of any federal program, an economic multiplier effect of $1.80 for every $1 spent in benefits, and an ability to reduce food insecurity rates by a full 30 percent—comes closer to hitting its mark than a wall.


Subscribe to Union of Concerned Scientists aggregator - Combined UCS Blogs