UCS Blog - The Equation

Science and Democracy Engages the Science of Democracy: The Kendall Voting Rights Fellowship

Photo: Peter Dutton/CC BY-NC-SA 2.0 (Flickr)

This fall, I am excited to help launch a new chapter in the Union of Concerned Scientists’ commitment to putting science to work toward building a healthier planet and a safer world. My research training is in the field of electoral systems and their impact on representation and public policy. Most recently, I co-authored the book Gerrymandering in America: The Supreme Court, the House of Representatives and the Future of Popular Sovereignty. As the new Kendall Voting Rights Fellow, I will be studying the impact of elections on many of the broader policy goals that UCS is pursuing.

In this context, a healthier and safer world means an open, resilient electoral system that can fairly and accurately convert aggregated preferences into policy choices. There are a number of opportunities where UCS can make an important contribution to improving voting rights and electoral institutions.

Fighting voter fraud and voter suppression

Perhaps no area in the field of voting rights has received more attention than President Trump’s claim that millions of people voted illegally to deny him a popular vote majority. The entire episode serves as a painful reminder of how scientists can lose control over how their research is used. While we can say with absolute certainty that the president’s claim is false, there is no scientific consensus on how best to estimate levels of voter fraud.

We know that voter fraud occurs, but accurately estimating the number of fraudulent votes cast, a figure that lies somewhere between the number of allegations and the number of convictions, is difficult (but see discussions here and here). UCS will be working to provide the public and policy makers with the most accurate science available, and we will work with partner organizations to build safeguards that ensure the integrity of voting for all who are eligible, especially those who are at risk of being disenfranchised through overly restrictive eligibility and voting requirements.

Similarly, there is a great deal of anecdotal information about the negative impact of voter suppression tactics. Previous scientific studies have shown that laws like voter identification requirements have a negative impact on minority groups, though some of the more recent work has been critiqued on methodological grounds. UCS will work to identify the participatory consequences of administrative laws ranging from early registration deadlines to online and automatic registration, early voting, and ballot access laws.

Improving policy outcomes through electoral integrity

When voting rights are compromised, it is typically because policy makers are attempting to insulate themselves from electoral pressure on critical policy issues. As we have seen in areas as diverse as climate policy, transportation, food systems and even global security, when policy makers are shielded from electoral accountability, they are more susceptible to the influence of powerful, narrow interests.

Scientists are in a unique position to bring policy expertise to questions regarding the consequences of restrictive election laws. In addition to assessing the extent to which our current electoral methods, including administrative law, electoral districting, and the Electoral College, create opportunities and threats to electoral integrity and security, we will work to identify how these methods impact legislative policymaking and social outcomes.

UCS already plays a crucial role in educating the public and combating environmental racism and public health risks. However, there is a gap in research exploring the link between electoral gamesmanship and the environmental and health injustices that afflict the most vulnerable communities. UCS can draw on expertise across numerous fields to help fill this gap.

In addition, analyses and research products on electoral integrity that UCS can provide will allow us to strengthen existing partnerships with communities dedicated to the advancement of environmental justice, political equality and human rights. From local communities to the U.S. Capitol, we will work with organizations to improve electoral integrity through the adoption of open and secure election laws.

Developing a reform agenda

An area that should be of particular interest to the UCS community and members of the Science Network is election information security and technology, a field where computer scientists have as much to say as political scientists. There is widespread agreement not only that many of the nation’s voting machines are outdated and vulnerable to hacking, but also that cyber-attacks on election software and records will play an ever-increasing role as a threat to electoral integrity.

Moreover, there is increasing evidence that the very structure of systems such as the Electoral College creates security threats by focusing the attention of hackers and disinformation campaigns on a small number of states that can swing a presidential election. UCS will partner with advocacy organizations to analyze threats and innovations in election security, and advocate for evidence-based policies to address these threats.

And as this month’s landmark gerrymandering case before the Supreme Court also made clear, scientists are playing a major role in providing recommendations about how to detect racial and partisan discrimination in districting plans, and in studying the consequences of proposed legislative remedies. In addition to identifying causes of and remedies for electoral discrimination, UCS experts can help fill the gap in our understanding of how such discrimination affects environmental, health, and related policies, which tend to negatively impact specific populations.

Rigorous analysis of the impact of electoral integrity on policy, and the ways that electoral discrimination impacts our quality of life, will provide critical support needed for reform. The challenges are clear, but so is the mission: to understand and engineer the democratic institutions that we need to build a healthier planet and a safer world.

I Am a 30-Year Veteran Scientist from US EPA; I Can’t Afford to Be Discouraged

. . . And neither can you.

Since January, we have seen a continual assault on our environmental protections. EPA has put a political operative with no scientific experience in charge of vetting EPA grants, and the agency is reconsidering an Obama-era regulation on coal ash. The well-established legal processes for promulgating environmental regulations, and—very pointedly—the science underlying environmental regulation are being jettisoned by the Trump administration. As scientists, we must stand up for science and ensure that it is not tossed aside in public policy and decision-making.

Rigorous science is the foundation of EPA

Attending a march with some friends.

While at US EPA, I served as a senior scientist in human health risk assessment.  I was among the cadre of dedicated professionals who worked long, hard, and intelligently to provide the science supporting management of risks from exposure to environmental contaminants. Often, we engaged in the demanding practice of issuing regulation.

Regulations to limit human and environmental exposure are not developed overnight. The laws that enable US EPA to issue regulations specify requirements and procedures for issuing rules; these can include notice of proposed rulemaking, multiple proposed rules, public comments on proposals, responses to comments, more proposals, more comments, review by other Federal bodies, review by States, review by Tribal governments—review, review, review. Often, the environmental laws also specify requirements for the science leading to risk management choices. For example, the Safe Drinking Water Act (SDWA), as amended in 1996, requires several judgments to be met affirmatively before any contaminant can be limited through regulation.

The US EPA Administrator must base his or her judgment, among other factors, on what SDWA calls the best available, peer-reviewed science.  This refers not only to experimental or epidemiologic studies, but also to the US EPA documents analyzing the risks and the best ways to mitigate them.

Requirements to regulate environmental contaminants in other media are no less rigorous.  To regulate emissions from coal- and oil-fired boilers used in electrical power generation, US EPA engaged in major scientific programs to understand the nature of these air pollutants (including toxic mercury), the risks they pose, and how best to deal with them. This began in 1993 and culminated in the Mercury and Air Toxics Standards (MATS) finalized in 2012. Building the scientific basis for the rule spanned several administrations and a few careers.  It was frustrating at times, and exhausting, but we kept our focus on the goal of doing the right thing to improve public health.

Regulation protects the public—and we’re watching it be undermined

The message here is that environmental regulation based on sound science is not a trivial exercise, nor should it be. Regulation can be costly, and sometimes may have societal impacts. But ask anyone who has lived in a society without sound environmental regulation, and she will tell you that legally enforceable limits on environmental contaminants are necessary. We estimated that each year the implemented MATS rule prevents 11,000 premature deaths and more than 100,000 heart and asthma attacks. And it greatly reduces release of mercury, which accumulates in fish and poses risk of neurotoxic effects to both developing children and adults.

The process that EPA follows to publish a regulation must also be used to reverse a regulatory action. Creating regulations is not a simple process—but undermining, overturning, and not enforcing regulations is easy and has major consequences for health and the environment. I fear that both the process and the science are being given short shrift as this administration acts to reverse sound regulatory decisions made by US EPA. This dismantling of environmental protection has begun in earnest, and I expect it will have severe, long-lasting effects.

Scientists must defend evidence-based regulation

There are ways to impede the regulatory roll-back. Writing, calling, and emailing elected officials is one avenue. Another is joining groups such as Save EPA, an organization of retired and former US EPA employees with expertise in environmental science, law, and policy. We are using our collective skills to educate the public about environmental science, environmental protections, and the current Administration’s assault on US EPA and our public health. You can help by reading our guide to resisting de-regulation; submitting public comments on rules being considered for rollback; and supporting our efforts to defend environmental regulations. As scientists, we must continue to insist on the validity and thoroughness of our discipline, and we must repeatedly communicate about this to decision-makers. In one of many hearings and reviews of mercury hazard, my late scientist friend and US EPA veteran Kathryn Mahaffey quoted John Adams: “Facts are stubborn things.” She was right.

Rita Schoeny retired from US EPA in 2015 after 30 years, having served in roles such as Senior Science Advisor for the Office of Science Policy, Office of Research and Development, and Senior Science Advisor, Office of Science and Technology, Office of Water. She has been responsible for major assessments and programs in support of several of EPA’s legislative mandates, including the Safe Drinking Water Act, Clean Water Act, Clean Air Act, and Food Quality Protection Act. Dr. Schoeny has published extensively in the area of human health risk assessment.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

How the NFL Sidelined Science—and Why It Matters

Photo: Nathan Rupert/CC BY-NC-ND 2.0 (Flickr)

Football was not just the most important weekend social activity in the New Jersey community where I grew up; it was woven into my family itself. My dad played football at his small-town Vermont high school along with his older brother, who went on to play college football at the University of Vermont. Hence, weekends at the Reed household were for screaming at TV sets or from real-life bleachers, and for theatrical displays of cheering that played out in falling off couches and crashing onto floors.

Football players have always carried a sort of badge of honor for playing America’s favorite sport, but only recently has that badge begun to carry even more weight, thanks to emerging knowledge about what even just a few years of executing football plays could mean for players’ quality of life down the line.

Even if you don’t closely follow football, you are likely aware that the NFL has been at the center of the news cycle in recent months, with players kneeling during the playing of the national anthem to protest racial injustice and police brutality. The players’ protest has drawn fire from a number of directions, including the White House. (Here at UCS, our staff joined with the campaign #scientiststakeaknee, supporting these players’ right to protest and the importance of their cause.)

But these protests aren’t the only way that the NFL has come into the spotlight. It’s increasingly clear that the repeated head injuries many football players experience can cause long-term damage—but the NFL has worked hard to bury these facts.

The NFL’s foray into CTE science

A powerful slide from Dr. Ann McKee’s presentation summarizing her findings on CTE at the Powering Precision Health Summit. The BU Brain Bank most recently analyzed 111 brains of former NFL players, finding that all but one had signs of CTE.

The NFL spent years, beginning in the 1990s, working to control the science behind the health consequences of repeated head injuries incurred while playing football. By doing so, the league infringed on its players’ right to know and their ability to make informed decisions about their health and career paths. And as the NFL failed to do its due diligence to conduct honest science on the game, players were consistently told to return to play after collisions, only to be left with debilitating health issues and devastated family members.

The NFL’s actions closely track those of the tobacco and fossil fuel industries, and include examples of just about every tactic in our “Disinformation Playbook,” as documented in Steve Fainaru and Mark Fainaru-Wada’s 2013 book, League of Denial. Here are just a few of the plays the league ran:

The Fake: The NFL commissioned a Mild Traumatic Brain Injury (MTBI) committee that published a series of studies in the journal Neurosurgery in the early 2000s, downplaying the risks of repeated head injuries by cherry-picking data and relying on incomplete counts of the concussions reported during games.

The Blitz: Dr. Bennet Omalu, the pathologist who first discovered CTE in an NFL player, faced opposition from the NFL, which called for the retraction of his article on the subject in 2005 and then called his second study “not appropriate science” and “purely speculative.” The second chair of the NFL’s brain injury committee, Ira Casson, later attacked and mocked Boston University neuropathologist Dr. Ann McKee for her work on CTE.

The Diversion: Ira Casson earned the nickname “Dr. No” from the authors of League of Denial for his willful refusal to accept that repeated head injury could lead to long-term brain damage in football players, even though he had spent years studying boxers and had concluded that that sport was associated with brain damage. In a 2010 Congressional hearing on football brain injuries, he held tight to his denial of the link, telling members of Congress: “My position is that there is not enough valid, reliable or objective scientific evidence at present to determine whether or not repeat head impacts in professional football result in long term brain damage.”

The Fix: The NFL was able to manipulate publication processes in order to control the science on head injuries sustained while playing football. The editor-in-chief of Neurosurgery, the journal in which all of the MTBI committee’s studies were published, was Dr. Michael Apuzzo, a consultant for an NFL football team. The journal’s peer review process, unlike that of most others, allowed papers to be published even when reviewers were harshly critical and rejected the science, so long as the objections were published in the commentary section of the paper. Despite harsh criticism from reviewers who were prominent experts in the field, Dr. Julian Bailes and Dr. Kevin Guskiewicz, the MTBI committee got away with publishing a series of papers downplaying the health risks of playing football.

In 2016, the NFL finally admitted that there was a link between playing football and the development of degenerative brain disorders like CTE after denying the risks for over a decade. The NFL has since changed some of its rules and has dedicated funding to help make the game safer for players, protections that President Trump argues are “ruining the game.” Trump’s blatant disregard of the evidence on the health impacts of playing football is beyond disappointing but not at all surprising, considering the way that this administration has treated science since day one.

From NFL player to science champion

Chris and I behind the scenes after a full day filming the PSA this August.

I have been fortunate to meet and spend time with former NFL player and science champion Chris Borland, who has turned his frustration with the league into support for independent science on the impacts of playing football on both child and adult players. Yesterday, he spoke at a scientific conference on the role of the media and others in communicating the science of CTE to the general public, so that we all have a better understanding of the risks of playing football, especially during youth. He also spoke about the emerging science on biomarkers that may help diagnose CTE in living players in the near future.

Here’s Chris’s take on why we should be standing up for science and exposing and fighting back against the disinformation playbook:

Chris and I are also featured on this week’s Got Science? podcast.

In carrying out plays from the Playbook to sideline science, organizations like the NFL break a simple social responsibility to “do no harm.” Take a look at our brand new website detailing the case study of the NFL along with 19 other examples of ways in which companies or trade organizations have manipulated or suppressed science at our expense, and find out how you can help us stop the playbook.


Electric Vehicles, Batteries, Cobalt, and Rare Earth Metals 

Battery Pack for BMW-i3 Electric Vehicle (at Munich Trade-Show Electronica). Photo: RudolfSimon CC-BY-2.0 (Wikimedia)

The case for switching to electric vehicles (EVs) is nearly settled. They are cheaper to use, cut emissions, and offer a whisper-quiet ride. One of the last arguments available to the EV-hater club, which is largely composed of thinly veiled oil-industry front groups funded by the Koch brothers, concerns the impacts of the materials used to make an EV’s battery pack.

Specifically, the use of lithium, cobalt, nickel, and other metals in an EV’s lithium-ion battery pack has raised red flags about the poor human rights and worker protection records in the countries where these materials are mined.

A lot of these warnings have been incorrectly filed under “EVs and rare earth metals.” Neither lithium nor cobalt is a rare earth metal, and rare earth metals aren’t nearly as rare as precious metals like gold, platinum, and palladium. Still, there are important issues surrounding the production of lithium-ion batteries that must be acknowledged and addressed.

It is also important to note that these impacts are not happening just because of EVs. They are also being driven by the global demand for cell phones, laptop computers, and the multitude of other electronic devices that use lithium-ion batteries.

As EVs gain market share, they will account for a growing share of the impacts from battery production. But today, EVs comprise a small fraction of global vehicle sales. So concerns about lithium-ion batteries should be directed not just at the suppliers of EV battery packs, but also at Apple, Samsung, and the other companies that source lithium-ion batteries for their electronic goods.

Let’s also not forget that the supply chain for gasoline-powered vehicles has its fair share of issues, ranging from human rights violations like the use of child labor, to disastrous oil spills like Deepwater Horizon. But unlike gasoline-powered vehicles, EVs will be able to take advantage of emerging battery chemistries that don’t rely on cobalt or other materials that are linked to exploitative practices.

Cobalt and electric vehicle batteries

Cobalt, a bluish-gray metal found in the Earth’s crust, is one of today’s preferred components in the lithium-ion batteries that power laptops, cell phones, and EVs. Cobalt is mined all over the world, but 50 to 60 percent of the global supply comes from the Democratic Republic of Congo (DRC), which has a poor human rights track record. According to UNICEF and Amnesty International, around 40,000 children are involved in cobalt mining in the DRC, where they make only $1 to $2 per day. The DRC’s cobalt trade has been the target of criticism for nearly a decade, and the U.S. Labor Department lists Congolese cobalt as a product it has reason to think is produced by child labor. More troubling, cobalt demand has tripled in the past five years and is projected to at least double again by 2020.

What can be done about EV battery sourcing issues

First, companies should be held accountable for enacting and enforcing policies to use only ethically sourced materials. Some companies are off to a good start. Apple has pledged to end its reliance on mining altogether, and one day make its products from only renewable resources or recycled materials. Other tech giants like HP, Samsung, and Sony have joined an effort called the “Responsible Cobalt Initiative.” Members of the initiative pledged to follow global guidelines for mining supply chains, which call for companies to trace how cobalt is being extracted, transported, manufactured, and sold.

On the EV side of things, Tesla has committed to sourcing materials only from North America for its new battery production facility, the Gigafactory.  In 2015, Tesla secured two contracts with mining companies to explore lithium deposits in northern Nevada and Mexico, though Tesla still relies on cobalt that may have been sourced from the DRC.

Both Ford and GM get their EV batteries from LG Chem, which has said it has stopped using DRC-sourced cobalt and that neither Ford’s nor GM’s batteries rely on it, though some of LG’s practices and statements have been called into question by the Washington Post.

Second, recycling can help reduce the need to search for new sources of battery materials, or to rely on sourcing materials from countries with poor worker protections. Cobalt, for example (unlike gasoline), is fully recyclable, and roughly 15 percent of U.S. cobalt consumption today comes from recycled scrap.

Companies like Umicore are in the cobalt recycling business and have demonstrated that there is a business model for recycling cobalt that can help reduce demand for DRC-mined cobalt.

Third, battery technology is continuing to improve. The multitude of battery applications has generated a strong financial incentive for researchers to find the next greatest battery chemistry, and some of the most promising next-gen battery types don’t rely on cobalt at all.

Lithium-titanate and lithium-iron-phosphate, for example, are gaining importance in EV powertrain applications, and neither needs cobalt. Other battery chemistries that rely on magnesium, sodium, or lithium-sulfur are also gaining traction, as they have the potential to beat lithium-ion batteries on energy density and cost. Battery research has seen a big shift in recent years: nearly half of the presentations at the Battery Symposium in Japan were once about fuel cells and lithium-ion battery materials, but since 2012 these topics have been supplanted by presentations about solid-state, lithium-air, and non-lithium batteries.

Overall, the human rights issues related to the lithium-ion battery supply chain cannot be ignored. At the same time, they shouldn’t be used by the oil industry and its allies as a rallying cry to dismantle EV policy support, or as a reason to stop the growth of the EV industry. Again, it’s not just EVs that are at issue here. All manufacturers of electronic devices need to find better sources for their batteries, and it is their responsibility to source materials from places with strong worker protections. It’s also the responsibility of our government to ensure that Americans can buy products that are ethically and sustainably sourced.

Post-Harvey, the Arkema Disaster Reveals Chemical Safety Risks Were Preventable

Halloween is right around the corner, but the Environmental Protection Agency (EPA) has been a perpetual nightmare for public safety since Administrator Scott Pruitt arrived, sending long-awaited chemical safety amendments to an early grave this year. The Risk Management Plan (RMP) is a vital EPA chemical safety rule that “requires certain facilities to develop plans that identify potential effects of a chemical accident, and take certain actions to prevent harm to the public or the environment”—but delays to the effective date of the long-awaited updates are putting communities, workers, and first responders directly in harm’s way, as we have witnessed from recent events following Hurricane Harvey.

Last Friday, the Union of Concerned Scientists released a report finding evidence of harm caused to communities by RMP-related incidents in the wake of Hurricane Harvey. The report serves as a supporting document in a lawsuit, in which UCS is involved, challenging the EPA’s decision to delay the long-overdue RMP update. Our objective was to further highlight how, if the improvements to the RMP had been allowed to go into effect as planned, damage from chemical plants during Hurricane Harvey could have been diminished. We provided an in-depth analysis of the steps the Arkema facility could have taken had the proposed changes been in effect, and estimated the toxic burden to which the surrounding community was exposed.

Additionally, we examined other incidents (i.e., spills, releases, explosions) that occurred at chemical facilities during Hurricane Harvey, and emphasized the disproportionate impacts of chemical incidents on communities of color and low-income communities. I have taken “toxic tours” on Houston’s east side with our partners at Texas Environmental Justice Advocacy Services (t.e.j.a.s.), and am all too aware that disparities in the distribution of RMP facilities and the concentration of toxic pollutants exist and go mostly unnoticed by the unaffected public.

Could the Arkema disaster have been avoided?

In late August of this year, Hurricane Harvey unleashed massive quantities of rain upon Houston and surrounding towns, flooding streets, homes—and chemical facilities. As a result, the Arkema plant in Crosby, Texas, a town 25 miles northeast of Houston, was inundated with floodwater and left without power or working generators. This meant the refrigerators needed to cool volatile organic peroxides were not operational, which ultimately led to the explosion of 500,000 pounds of the unstable chemicals. Though we cannot say Arkema would have avoided an incident altogether, we do know the damage inflicted upon first responders and nearby residents could have been mitigated by implementing the revisions to the chemical safety rule, which would have required:

  • Coordination with local emergency response agencies—The updated RMP rule includes standards that would require facilities to coordinate with and provide information to emergency responders. This would likely have prevented the Crosby first responders from being exposed to noxious fumes—at the perimeter of the ineffective 1.5-mile evacuation zone, I might add—after the Arkema explosion. A group of injured first responders is now suing the plant for failing to properly warn them of the dangers they faced.
  • Analysis of safer technologies and chemicals—The facility would have had to begin research into safer technologies and chemicals for use in their facility, including less volatile chemicals, or safer storing and cooling techniques to preempt an explosion.
  • Root cause analyses—A thorough investigation of past incidents to prevent similar future incidents would have been required of RMP facilities.

Communities are at risk

Real life isn’t an action film, where explosions abound and the dogged hero emerges from a fiery building unscathed, nary a casualty to be found. In real life, the consequences of a chemical explosion, leak, or spill are often dangerous and deadly. We have ways to hold chemical facilities accountable for taking necessary preventative measures, but to use them we must urge the EPA to implement the proposed changes to the RMP rule.

Voting Technology Needs an Upgrade: Here’s What Congress Can Do

Voting systems throughout the United States are vulnerable to corruption in a variety of ways, and the federal government has an obligation to protect the integrity of the electoral process. At a recent meeting of the National Academies of Sciences, Engineering and Medicine’s Committee on the Future of Voting, the Department of Homeland Security’s Robert Kolasky put it bluntly: “It’s not a fair fight to pit Orange County (California) against the Russians.”

While the intelligence community has not confirmed that the hackers working on behalf of the Russian government to undermine the 2016 election were successful at tampering with actual vote tallies, they certainly succeeded at shaking our confidence in the electoral process, which over time could undermine faith in democracy.

The management of statewide eligible voter lists is a particularly challenging but crucial responsibility. On the one hand, data entry errors, duplicate records and “live” records for deceased voters invite voter fraud and inaccuracies in voting data. On the other hand, overly broad purging of voter lists can result in the exclusion of eligible voters from the rolls.

Two problems with voter list maintenance

Validation of voter eligibility is typically done through “matching” of individuals on voter registration lists with other databases using unique combinations of traits of eligible individuals (birthdays and names, etc.). This process is error-prone in two ways. First, data may not be entered identically for individuals across databases (misspelled names, missing data, etc.), so that individuals fail to get matched and are excluded (false negatives). Second, the computer algorithms used to identify and match records may be imprecise, such that they match the wrong people (false positives) or exclude people from voter lists based on faulty matching techniques (false negatives).

Both assumptions, that the databases being matched contain correct data, and that the algorithm for identifying individual matches actually does so, have proven problematic. For example, research has shown that the surprisingly high probability that two people in a group of a given size share the same birthday can largely account for inflated estimates of double registrations and people double voting. That is, an algorithm that matches on birthday, and possibly last name, is a poor method for identifying voters, because lots of people share those traits.
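To see how quickly chance collisions arise, here is a minimal sketch of that generalized birthday problem in Python. The list sizes, and the assumption that birthdates are spread uniformly over roughly 80 years, are illustrative choices, not figures from any actual voter file:

# Probability that at least two people in a list of n share a full date of
# birth, with birthdates spread uniformly across d possible days:
# P(collision) = 1 - prod_{i=0}^{n-1} (1 - i/d)
def p_shared_birthdate(n, d):
    p_no_collision = 1.0
    for i in range(n):
        p_no_collision *= (d - i) / d
    return 1.0 - p_no_collision

d = 80 * 365  # ~29,200 possible birthdates in an adult electorate (assumed)
for n in (100, 500, 1000):
    print(n, round(p_shared_birthdate(n, d), 3))
# roughly: 100 -> 0.16, 500 -> 0.99, 1000 -> 1.00

Even a modest local list is nearly certain to contain two different people with the same full birthdate, which is why a birthday-and-name match is such weak evidence of double registration.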

But even that poor algorithm assumes that the underlying data are accurate, when they often are not. Even databases containing unique individual identifiers, like Social Security numbers, include enough error to be inappropriate for matching. Indeed, the Social Security Administration accidentally declares over 10,000 people dead every year, and attempts to match voter lists using the last four SSN digits have produced error rates above 20%, such that the SSA Inspector General has warned against their use for this purpose.

Sloppy matching algorithms that do not attempt to correct for such data inaccuracies are prone to excluding high numbers of eligible voters. For example, the Crosscheck system, championed by Kris Kobach, vice chair of the president’s Electoral “Integrity” Commission, has produced error rates as high as 17% in Chesterfield County, VA, prompting the county to abandon the software.

Two solutions that improve voter list management

The solution to these problems is thus twofold: improving the quality of matching algorithms in order to create precise identifiers and overcome data inaccuracies, and reducing the probability of ineligible voters or inaccurate data getting on the voter list to begin with.

Recent advances in algorithmic design have shown that using multiple matching criteria with recoded data to account for common data entry inaccuracies can yield matches that are 99% accurate. For example, Stephen Ansolabehere and Eitan D. Hersh have demonstrated that using three-match combinations of Address (A), Date of Birth (D), Gender (G) and Name (N), or ADGN, is extremely effective in successful matching (and helps explain how Facebook knows everything about you).
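As a rough illustration of this kind of multi-criteria matching, here is a short Python sketch in the spirit of the ADGN approach. The field names, normalization rules, and three-of-four threshold are simplified assumptions for illustration, not the published algorithm:

import re

def normalize(rec):
    # Recode raw fields so common data entry differences don't block a match
    return {
        "A": re.sub(r"\W+", "", rec["address"]).upper(),
        "D": rec["dob"],                     # assumed already in YYYY-MM-DD form
        "G": rec["gender"][:1].upper(),      # "Female" -> "F"
        "N": re.sub(r"\W+", "", rec["name"]).upper(),
    }

def adgn_match(rec1, rec2, min_agree=3):
    # Declare a match when at least three of the four ADGN fields agree
    a, b = normalize(rec1), normalize(rec2)
    return sum(a[k] == b[k] for k in "ADGN") >= min_agree

r1 = {"name": "Jane Q. Doe", "dob": "1970-01-02", "gender": "Female", "address": "12 Oak St."}
r2 = {"name": "JANE Q DOE",  "dob": "1970-01-02", "gender": "F",      "address": "12 Oak Street"}
print(adgn_match(r1, r2))  # True: name, birthdate, and gender agree despite messy entry

Requiring several independently recorded fields to agree drives down the false positives that plague single-trait matches like birthday plus last name, while the recoding step reduces false negatives caused by typos and formatting differences.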

For securing and maintaining precise voter list data from the start, the implementation of automatic voter registration, or AVR, is proving increasingly effective and popular. By automatically registering all eligible adults (unless they decline) when people update their own data through government agencies, and by transferring that data electronically on a regular basis, the process “boosts registration rates, cleans up the rolls, makes voting more convenient, and reduces the potential for voter fraud, all while lowering costs” according to the Brennan Center for Justice, which advocates AVR.

Ten states and the District of Columbia have already approved AVR, and 32 states have introduced AVR proposals this year. It is a politically bipartisan solution, with states as different as California and Alaska having already adopted the practice.

Rhode Island’s Democratic Secretary of State, Nellie Gorbea, has stated that “Having clean voter lists is critical to preserving the integrity of our elections, which is why I made enacting Automatic Voter Registration a priority.”

Republican Governor of Illinois Bruce Rauner, on signing his state’s AVR law, explained that “This is good bipartisan legislation and it addresses the fundamental fact that the right to vote is foundational for the rights of Americans in our democracy.”

Given the seriousness of the threat, and the fact that such effective solutions for voter list management have already been developed, Congress should ensure that states have the capacity to implement these policies, which are among the most important infrastructure investments that we can make.

Memo to EPA Chief Pruitt: End Subsidies for Fossil Fuels, Not Renewables

Economically and environmentally, it would be far better for the future of the planet to phase out fossil fuel subsidies and provide more incentives for clean energy. Photo: Union of Concerned Scientists

Environmental Protection Agency Administrator Scott Pruitt recently proposed eliminating federal tax credits for wind and solar power, arguing that they should “stand on their own and compete against coal and natural gas and other sources” as opposed to “being propped up by tax incentives and other types of credits….”

Stand on their own? Pruitt surely must be aware that fossil fuels have been feasting at the government trough for at least 100 years. Renewables, by comparison, have received support only since the mid-1990s and, until recently, have had to subsist on scraps.

Perhaps a review of the facts can set administrator Pruitt straight. There’s a strong case to be made that Congress should terminate subsidies for fossil fuels and extend them for renewables, not the other way around.

A century (or two) of subsidies

To promote domestic energy production, the federal government has been serving the oil and gas industry a smorgasbord of subsidies since the early days of the 20th century. Companies can deduct the cost of drilling wells, for example, as well as the cost of exploring for and developing oil shale deposits. They even get a domestic manufacturing deduction, which is intended to keep US industries from moving abroad, even though—by the very nature of their business—they can’t move overseas.

All told, from 1918 through 2009, the industry’s tax breaks and other subsidies amounted to an average of $4.86 billion annually (in 2010 dollars), according to a 2011 study by DBL Investors, a venture capital firm. Accounting for inflation, that would be $5.53 billion a year today.

The DBL study didn’t include coal due to the lack of data for subsidies going back to the early 1800s, but the federal government has lavished considerably more on the coal industry than on renewables. In 2008 alone, coal received between $3.2 billion and $5.4 billion in subsidies, according to a 2011 Harvard Medical School study in the Annals of the New York Academy of Sciences.

Meanwhile, wind and other renewable energy technologies, DBL found, averaged only $370 million a year in subsidies between 1994 and 2009, the equivalent of $421 million a year today. The 2009 economic stimulus package did provide $21 billion for renewables, but that support barely began to level the playing field that has tilted in favor of oil and gas for 100 years and coal for more than 200.

A 2009 study by the Environmental Law Institute looked at US energy subsidies since the turn of this century. It found that between 2002 and 2008, the federal government gave fossil fuels six times more than what it gave solar, wind, and other renewables. Coal, natural gas, and oil benefited from $72.5 billion in subsidies (in 2007 dollars) over that seven-year period, while “traditional” renewable energy sources—mainly wind and solar—received only $12.2 billion. A pie chart from the report shows that 71 percent of federal subsidies went to coal, natural gas and oil, 17 percent—$16.8 billion—went to corn ethanol, and the remaining 12 percent went to traditional renewables.

A new study by Oil Change International brings us up to date. Published earlier this month, it found that federal subsidies in 2015 and 2016 averaged $10.9 billion a year for the oil and gas industry and $3.8 billion for the coal industry. By contrast, the wind industry’s so-called production tax credit, renewed by Congress in December 2015, amounted to $3.3 billion last year, according to an estimate by the congressional Joint Committee on Taxation (JCT).

Unlike the fossil fuel industry’s permanent subsidies, Congress has allowed the wind tax credit to expire six times in the last 20 years, and it is now set to decline incrementally until ending in 2020. Similarly, Congress fixed the solar industry’s investment tax credit at 30 percent of a project’s cost through 2019, but reduced it to 10 percent for commercial projects and zeroed it out for residences by the end of 2021. The JCT estimates that the solar credit amounted to a $2.4-billion tax break last year. Totaling it up, fossil fuels—at $14.7 billion ($10.9 billion for oil and gas plus $3.8 billion for coal)—still received roughly two-and-a-half times the $5.7 billion ($3.3 billion for wind plus $2.4 billion for solar) in federal support that renewables did in 2016.

The costs of pollution

Subsidy numbers tell only part of the story. Besides a century or two of support, the federal government has allowed fossil fuel companies and electric utilities to “externalize” their costs of production and foist them on the public.

Although coal now only generates 30 percent of US electricity, down from 50 percent in 2008, it is still responsible for two-thirds of the electric utility sector’s carbon emissions and is a leading source of toxic pollutants linked to cancer; cardiovascular, respiratory, and neurological diseases; and premature death. The 2011 Harvard Medical School study cited above estimated coal’s “life cycle” cost to the country—including its impact on miners, public health, the environment and the climate—at $345 billion a year.

In July 2016, the federal government finally began regulating the more than 1,400 coal ash ponds across the country containing billions of gallons of heavy metals and other byproducts from burning coal. Coal ash, which has been leaching and spilling into local groundwater, wetlands, creeks, and rivers, can cause cancer, heart, and lung disease, birth defects and neurological damage in humans, and can devastate bird, fish, and frog populations.

But that was last year. Since taking office, the Trump administration has been working overtime to bolster coal, which can no longer compete economically with natural gas or renewables. Earlier this year, it rescinded a rule that would have protected waterways from mining waste, and a few months ago it moved to repeal another Obama-era measure that would have increased mineral royalties on federal lands. More recently, Energy Secretary Rick Perry asked the Federal Energy Regulatory Commission to ensure that coal plants can recover all of their costs, whether those plants are needed or not.

Natural gas burns more cleanly than coal, but its drilling sites, processing plants, and pipelines leak methane, and its production technique—hydraulic fracturing—can contaminate water supplies and trigger earthquakes. Currently the fuel is responsible for nearly a third of the electric utility sector’s carbon emissions. Meanwhile, the US transportation sector—whose oil-powered engine exhaust exacerbates asthma and likely causes other respiratory problems and heart disease—is now the nation’s largest carbon polluter, edging out the electric utility sector last year for the first time since the late 1970s.

Like the coal industry, the oil and gas industry has friends in high places. Thanks to friendly lawmakers and administrations, natural gas developers are exempt from key provisions of seven major environmental laws that protect air and water from toxic chemicals. Permitting them to flout these critical safeguards forces taxpayers to shoulder the cost of monitoring, remediation, and cleanup—if they happen at all.

The benefits of clean energy

Unlike fossil fuels, wind and solar energy do not emit toxic pollutants or greenhouse gases. They also are not subject to price volatility: wind gusts and solar rays are free, so more renewables would help stabilize energy prices. And they are becoming less expensive, more productive, and more reliable every year. According to a recent Department of Energy (DOE) report, power from new wind farms last year cost a third of wind’s price in 2010 and was cheaper than electricity from natural gas plants.

Perhaps the biggest bonus of transitioning to a clean energy system, however, is the fact that the benefits of improved air quality and climate change mitigation far outweigh the cost of implementation, according to a January 2016 DOE study. Conducted by researchers at the DOE’s Lawrence Berkeley National Laboratory and National Renewable Energy Laboratory, the study assessed the impact of standards in 29 states and the District of Columbia that require utilities to increase their use of renewables by a certain percentage by a specific year. Called renewable electricity (or portfolio) standards, they range from California and New York’s ambitious goals of 50 percent by 2030 to Wisconsin’s modest target of 10 percent by 2015.

It turns out that complying with the state standards cost utilities nationwide approximately $1 billion a year between 2010 and 2013—generally the equivalent of less than 2 percent of average statewide retail electricity rates. On the benefit side of the equation, however, standards-spawned renewable technologies in 2013 alone generated $7.4 billion in public health and other societal benefits by reducing carbon dioxide, sulfur dioxide, nitrogen oxide, and particulate matter emissions. They also saved consumers as much as $1.2 billion by lowering wholesale electricity prices and as much as $3.7 billion by reducing natural gas prices, because more renewable energy on the grid cuts demand for—and thus lowers the price of—natural gas and other power sources that have higher operating costs.

Take fossil fuels off the dole

If the initial rationale for subsidizing fossil fuels was to encourage their growth, that time has long since passed. The Center for American Progress (CAP), a liberal think tank, published a fact sheet in May 2016 identifying nine unnecessary oil and gas tax breaks that should be terminated. Repealing the subsidies, according to CAP, would save the US Treasury a minimum of $37.7 billion over the next 10 years.

An August 2016 report for the Council on Foreign Relations by Gilbert Metcalf, an economics professor at Tufts University, concluded that eliminating the three major federal tax incentives for oil and gas production would have a relatively small impact on production and consumption. The three provisions—deductions for “intangible” drilling costs, deductions for oil and gas deposit depletion, and deductions for domestic manufacturing—account for 90 percent of the cost of the subsidies. Ending these tax breaks, Metcalf says, would save the Treasury roughly $4 billion a year and would not appreciably raise oil and gas prices.

At the same time, the relatively new, burgeoning clean energy sector deserves federal support as it gains a foothold in the marketplace. Steve Clemmer, energy research director at the Union of Concerned Scientists, made the case in testimony before a House subcommittee last March that Congress should preserve wind and solar tax incentives beyond 2020.

“Until we can transition to national policies that provide more stable, long-term support for clean, low-carbon energy,” he said, “Congress should extend federal tax credits by at least five more years to maintain the sustained orderly growth of the industry and provide more parity and predictability for renewables in the tax code.” Clemmer also recommended new tax credits for investments in low- and zero-carbon technologies and energy storage technologies.

Despite the steady barrage of through-the-looking-glass statements by Trump administration officials, scientific and economic facts still matter. Administrator Pruitt would do well to examine them. Congress should, too, when it considers its tax overhaul bill, which is now being drafted behind closed doors. If they did, perhaps they would recognize that—economically and environmentally—it would be far better for the future of the planet to phase out fossil fuel subsidies and provide more incentives for clean energy.

Trashing Science in Government Grants Isn’t Normal: The Case of the EPA, John Konkus, and Climate Change

There is now a political appointee of the Trump administration at the Environmental Protection Agency (EPA), John Konkus, reviewing grant solicitations and proposals in the public affairs office. It has been reported that Konkus is on the lookout for any reference to “climate change” in grant solicitations in an attempt to eliminate this work from the agency’s competitive programmatic grants. So, is this normal?

Grants and government

The US Federal Government gave out nearly $663 billion in grant funding in fiscal year 2017. Such funding pays for a wide range of state and local government services, such as health care, transportation, income security, education, job training, social services, community development, and environmental protection. Additionally, approximately $40 billion in grant funding from federal agencies funds scientific research annually, although the amount of funding for research and development from the federal government has declined in recent years.

Given the large amount of grant funding the federal government gives out annually, it is critical that the government has: 1) guidelines on what type of work a grant will fund, and 2) a process for determining who receives funding. While each grant is unique in its considerations of what makes a good candidate for funding, there is a relatively standard process through which government grants are advertised and funded. The majority of this information can be found at www.grants.gov.

The grant solicitation

The first step in the process for funding of scientific grants is for a government agency to solicit proposals from interested parties (i.e., scientists working outside the government). The US Federal Government refers to these solicitations as “Funding Opportunity Announcements” or FOAs. These FOAs include information about what type of work the agency is expecting and whether or not the applicant would be eligible for funding. Thus, an FOA is extremely important for both the government and the applicant because it highlights the agency’s priorities for the funding, which also serves as a guideline for an applicant’s proposal.

The agency must first consider what type of work is currently needed in the US. In the case of science, the agency assesses what is currently unknown in our scientific knowledge on a given subject. Additionally, agencies will determine what special considerations are needed to make the grant work more impactful—these may include work that focuses on environmental justice or coal communities, for example. These considerations are typically discussed in the FOAs, and grants that include these special considerations in their proposals are typically ranked as more competitive relative to others that do not.

The FOA is reviewed by a panel of experts, which at most agencies consists of career officials from across the federal government. It isn’t uncommon for political appointees to review an FOA. Political appointees generally broaden the FOA so it’s more inclusive, asking questions such as, “Do you think we might want to consider adding a special consideration for communities recently affected by natural disasters?” What does seem to be uncommon is eliminating scientifically defensible language like the “double C word.”

Reviewing grant proposals and awards

At many federal agencies, grants are reviewed by career agency staffers who have expertise for the grant program. However, in the cases of the National Science Foundation and the National Institutes of Health, a panel of non-federal scientists who have scientific expertise in the relevant research areas and scientific disciplines review submitted proposals. All proposed federal grants typically go through a first round review where they are screened for eligibility. If the proposal does not meet eligibility criteria, it is not reviewed further.

Those proposals that are eligible for funding are then reviewed by a panel of career agency staffers who are experts for the grant program’s work. The proposals are evaluated based on criteria specific to the grant – for some programmatic grants these criteria are dictated by statutory authority (e.g., grants in the brownfields program at the EPA). Based on these criteria, the panel scores each proposal. The proposals that receive higher scores are deemed more competitive relative to those with lower scores.

Depending on the amount of funding available for a grant program, the panel will recommend a percentage of the top scoring grants to be funded. The panel also takes into consideration other factors that may have been emphasized in the FOA (e.g., a community that was just ravaged by a natural disaster that is in greater need of funding relative to other communities).
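As a simplified sketch of this score-and-rank step, consider the following Python fragment. The proposal names, panel scores, and 20 percent funding cutoff are hypothetical, and real reviews also weigh the special considerations emphasized in the FOA:

# Hypothetical review-panel bookkeeping: average each proposal's panel
# scores, rank the proposals, and recommend the top fraction for funding.
def recommend(proposals, fund_fraction=0.2):
    ranked = sorted(proposals,
                    key=lambda p: sum(p["scores"]) / len(p["scores"]),
                    reverse=True)
    n_funded = max(1, int(len(ranked) * fund_fraction))
    return ranked[:n_funded]

proposals = [
    {"id": "brownfields-pilot",   "scores": [88, 91, 85]},
    {"id": "air-quality-sensors", "scores": [72, 70, 75]},
    {"id": "water-reuse-study",   "scores": [90, 94, 89]},
]
for p in recommend(proposals):
    print(p["id"])  # prints the highest-averaging proposal(s)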

The recommended set of proposals for funding is then sent to the head of the program, who can be a political appointee of a presidential administration. The amount of information on recommendations that the appointee receives varies: sometimes the appointee might receive abstracts of proposals, or they might just receive a list of the institutions or researchers recommended for funding. The appointee typically agrees with the recommendation of the expert panel. It would be uncommon for a political appointee not to fund a proposal recommended for funding, as Konkus has reportedly done.

Ignoring science in grants will harm others

Is it uncommon for political appointees to have a say in the grant funding process? No. What is uncommon is for political appointees to politicize science in grant solicitation language or to rescind proposals that were recommended by a panel of experts. As former EPA administrator Christine Todd Whitman chimed in on this issue, “We didn’t do a political screening on every grant, because many of them were based on science, and political appointees don’t have that kind of background.”

As is common with this administration, we are seeing proposals that mention the “double C word” as a target. Konkus rescinded funding from EPA to two organizations that would have supported the deployment of clean cookstoves in the developing world—a simple solution to curb the impacts of climate change, but also to limit pollution that disproportionately affects women and children in these areas. Who knows what Konkus will rescind next, but it’s likely to have harmful effects on people. Maybe Konkus should leave decisions for funding up to the expert panels. They are categorized as experts for a reason.

The 5 Worst Plays From Industry’s Disinformation Playbook

When I was 13, this is what I identified as the hardest thing about life then. My trust issues were just beginning to manifest themselves.

I have always had a healthy dose of curiosity and skepticism, and a desire to hold people accountable for their statements, built into my DNA. Usually, these traits played out in letter-writing campaigns. As a child, I sent a series of letters to the Daily News because, after rifling through the pages and still ending up with gray smudges on my fingers, I believed its “No More Schmutz!” campaign was falling short. Inky fingers are a far cry from misinformation about the dangers of fossil fuel pollution, but overall, my general pursuit of the truth hasn’t changed.

My newest project at the Center for Science and Democracy, released today, is a website that exposes the ways in which companies seek to hide the truth about the impacts of their products or actions on health and the environment. By calling out the plays in the “Disinformation Playbook,” we hope to ensure that powerful companies and institutions are not engaging in behavior that obstructs the government’s ability to keep us safe, and at the very least are not doing us harm. Unfortunately, as our case studies show, there are far too many examples in which companies and trade organizations have made intentional decisions to delay or obstruct science-based safeguards, putting our lives at risk.

In the Disinformation Playbook, we reveal the five tactics that some companies use to manipulate, attack, or create doubt about science and the scientists conducting important research that appears unfavorable to a company’s products or actions. We also feature twenty case studies that show how companies in a range of different industries have used these tactics in an effort to sideline science.

While not all companies engage in this irresponsible behavior, the companies and trade associations we highlight in the playbook have acted in legally questionable and ethically unsound ways. Here are five of the most egregious examples from the Playbook:

The Fake: Conducting counterfeit science

In an attempt to reduce litigation costs, Georgia-Pacific secretly ran a research program intended to raise doubts about the dangers of asbestos and stave off regulatory efforts to ban the substance. The company used knowingly flawed methodologies in its science and published its research in scientific journals without adequately disclosing the authors’ conflicts of interest. By seeding the literature with counterfeit science, Georgia-Pacific created a life-threatening hazard by deceiving those who rely on science to understand the health risks of asbestos exposure. While asbestos use has been phased out in the US, it is not banned, and mesothelioma still claims the lives of thousands of people every year.

The Blitz: Harassing scientists

Rather than honestly dealing with its burgeoning concussion problem, the National Football League (NFL) went after the reputation of Dr. Bennet Omalu, the first doctor to link the sport to the degenerative brain disease he named chronic traumatic encephalopathy (CTE). What Omalu found in former player Mike Webster’s brain—CTE, a progressive degenerative disease previously associated mainly with “punch drunk” boxers and victims of brain trauma—broke the league’s concussion problem wide open. But instead of working with scientists and doctors to better understand the damaging effects of repeated concussions and how the league could improve the game to reduce head injuries, the NFL attacked the reputations of Omalu and the other scientists who subsequently worked on CTE. Just this year, Boston University scientists released a study of the brains of 111 deceased former NFL players, revealing that all but one had signs of CTE.

The Diversion: Sowing uncertainty

The top lobbyist for the fossil fuel industry in the western United States secretly ran more than a dozen front groups. As a leaked 2014 presentation by Western States Petroleum Association (WSPA) President Catherine Reheis-Boyd revealed, the strategy was to use these fabricated organizations to manufacture the appearance of grassroots opposition to forward-looking policy on climate change and clean technologies. WSPA and its member companies oppose science-based climate policies that are critically needed to mitigate the damaging impacts of global warming.

The Screen: Borrowing credibility

Coca-Cola quietly funded a research institute at the University of Colorado designed to persuade people to focus on exercise, not calorie intake, for weight loss. Emails between the company and the institute’s president suggested Coca-Cola helped pick the group’s leaders, create its mission statement, and design its website. A growing body of scientific evidence links sugar to a variety of negative health outcomes, including diabetes, cardiovascular disease, and high cholesterol. Coca-Cola’s actions overrode sensible transparency safeguards meant to ensure the independence of research—and to allow consumers to understand the risks of sugar consumption for themselves.

The Fix: Manipulating government officials

After meeting with chlorpyrifos producer Dow Chemical Company and listening to its talking points, the EPA announced it would reverse its proposed ban on the chemical, which is linked to neurological development issues in children. In 2016, the EPA concluded that chlorpyrifos can have harmful effects on child brain development. The regulation of chlorpyrifos is additionally an environmental justice issue: Latino children in California are 91 percent more likely than white children to attend schools near areas of heavy pesticide use.

The secret to a good defense is a good offense

By arming ourselves with independent science, we can fight back against these familiar tactics. Granted, it’s not an easy task, especially in the face of a government run by an administration that doesn’t appear to value evidence, believing asbestos is 100% safe and claiming that climate change is a hoax. Yet I hear powerful stories every day of communities working together to crush corporate disinformation campaigns with the hard truth.

Just a couple of weeks ago, community members from Hoosick Falls, New York attended the confirmation hearing for “toxicologist-for-hire” Dr. Michael Dourson, the nominee to head the EPA’s Office of Chemical Safety and Pollution Prevention. Senator Kirsten Gillibrand paid homage to their bravery: “The water that they drink, the water they give their children, the water they cook in, the water they bathe in, is contaminated by PFOA. These families are so frightened.” These individuals had a powerful story to tell: DuPont downplayed the dangers of C8, or PFOA, the chemical byproduct of Teflon, and Dourson’s consulting firm, hired by DuPont, recommended a far weaker standard for the chemical than most scientists believe would have protected exposed residents from harm.

We hope that reading through the playbook will encourage you to stand up for science and join us as we continue to challenge companies that attempt to sideline science, seeking business as usual at our expense. Become a science champion today and take a stand against corporate disinformation by asking your members of Congress not to do automakers’ bidding by rolling back valuable progress on vehicle efficiency standards.

Stay curious, stay skeptical, and together we can work toward making corporate culture more honest and transparent by raising the political cost of using the disinformation playbook.

Up Close with America’s New Renewable Energy: Experiencing the Now-ness of Offshore Wind

Block Island Wind Farm. Photo: E. Spanger-Siegfried

On a recent clear day, colleagues and I hopped on a boat for a look at our nation’s energy future. From right up close, offshore wind turbines make quite an impression. The biggest impression, though? That the future of energy… is actually right now.

Seeing is believing

The boat tour gave us a chance to be out on the water in the vicinity of the turbines of Rhode Island’s Block Island Wind Farm, the first offshore wind facility in the Americas. And what first stood out in that trip was… well, the wind turbines.

Block Island Wind Farm: Seeing is believing. Photo: J. Rogers.

Sight. Yes, these things are no shrinking violets. The mechanical engineer in me is drawn inexorably to the stats that define that heft, facts about the size of each of the five 6-megawatt turbines that make up the wind farm. About the lengths/heights—of the towers (360 feet up from the ocean’s surface), the foundation (90 feet down to the seabed, then 200 feet beyond), the blades (240 feet from hub to tip). About the weight—1,500 tons for the foundation, 800 more for the tower, the nacelle (the box up top), and the blades.

The poet in me, if there were one, would wax lyrical (and poetical) about the visuals of the trip. I can at least say this: I know that beauty is in the eye of the beholder, but this beholder was quite taken with the towering figures just feet away as we motored by, and, as far as I could tell, my fellow travelers/beholders shared that sentiment.

The turbines don’t just look solid and mechanical and useful. They look like art—graceful, kinetic sculptures rising from the briny depths.

Beyond seeing, and seeing beyond

This tour wasn’t just about seeing, though. With a trip this exciting, you want to bring multiple senses to bear, and so we did.

Offshore wind power – Big, bold, beautiful, and ready for its close-up. Photo: E. Spanger-Siegfried.

Sound. Surprisingly, given the size of each installation, sound was not really a piece of the turbine-gazing experience. That is, I could maybe hear the blades turning, but only barely, over the noise of the ship’s engine and, particularly, over the sound from the very wind that was exciting those blade rotations.

Scent. The scent on the water was of the sea air, which I don’t normally get and which I’d welcome any day. When you get close enough to see the bolts and welds on the foundations and towers, though, these wind turbines smell like jobs.

The workmanship that went into these marvels is clear. Looking at each, you can easily imagine the workers, local, abroad, and in between, who made this possible.

While many of the major components for this first-in-the-nation wind farm came from factories in established offshore wind farm markets, it was welders in Louisiana who gave birth to the foundation, using manufacturing skills wisely transferred from the offshore oil/gas industry. And the pieces all came together courtesy of ironworkers, electricians, and more in Rhode Island—some 300 local workers, says project developer Deepwater Wind.

Offshore wind admirers. Photo: J. Rogers.

Touch. Much as I would have enjoyed getting right on the turbines (and maybe even on top?), our passage by understandably left us a few tens of feet short of that. (Next time.)

But my fellow travelers and I were clearly touched by the experience of seeing such power right up close, and could easily feel the transformative energy of each turbine.

Taste. That leaves one more sense. This trip wasn’t just about the taste of the salty air. It communicated the sense that what we got on the water on that recent fall day was just a taste of what’s to come. Maybe, then, we can couple that with a sixth sense: a sense of optimism.

Because it’s hard to stand there on the rising-falling deck, with the sun, the wind, and the sea spray, with those powerful sculptures so close by, and not get a sense that you’re witnessing a special something. A something that goes beyond five turbines, big as they are, and beyond 30 megawatts and the 17,000 homes that they can power. A sense that there’s much more beyond.
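
For the numbers-minded, here is a back-of-the-envelope Python check of how 30 megawatts translates into roughly 17,000 homes. The capacity factor and household usage below are my own illustrative assumptions, not Block Island project data:

    # Rough check of the "30 MW powers ~17,000 homes" figure.
    NAMEPLATE_MW = 5 * 6               # five 6-megawatt turbines
    CAPACITY_FACTOR = 0.48             # assumed, typical for offshore wind
    HOURS_PER_YEAR = 8760
    KWH_PER_HOME_PER_YEAR = 7_600      # assumed annual household usage

    annual_kwh = NAMEPLATE_MW * 1_000 * CAPACITY_FACTOR * HOURS_PER_YEAR
    homes_powered = annual_kwh / KWH_PER_HOME_PER_YEAR
    print(f"{NAMEPLATE_MW} MW -> roughly {homes_powered:,.0f} homes")

Under those assumptions the math lands near 16,600 homes, in the neighborhood of the 17,000 figure.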

One of the local leaders from the electricians union (IBEW) captured this beyond idea well in talking about the project from the point of view of jobs, and the economic development potential of this technology:

Offshore wind: The future is present. Photo: J. Rogers.

“The real prize was not the five turbines… The real prize is what’s going to come.”

When it comes to offshore wind turbines, the what’s-to-come seems as big and powerful as each turbine multiplied many-fold. We seem poised for so much more, not just abroad, but right here at home.

A video of the Block Island project from proud project financier Citi can get you close to this particular project, and this cool 360 version of the turbines courtesy of the New York Times can get you even closer (just hold on tight!).

But for readers in this country, the fact that we’re poised for much more means that a chance to visit a wind farm in waters near you could be coming soon.

And if you do get there, use as many senses as you can. Offshore wind power is an experience worth getting close to, and opening up to.

The print version of Citi’s Block Island promotion includes the tagline “On a clear day you can see the future”. But getting up close to offshore wind turbines makes it clear that this particular energy technology is here and now. That it’s so ready for the big time. That yesterday’s energy future is today’s energy present.

So go ahead, on clear days, or cloudy, rain or shine: See, hear, smell, touch, and taste that energy-future-in-the-present. And celebrate.

Michael Dourson: A Toxic Choice for Our Health and Safety

When it comes to conflicts of interest, few nominations can top that of Michael Dourson to lead the EPA’s Office of Chemical Safety and Pollution Prevention. Time after time, Dourson has worked for industries and against the public interest, actively downplaying the risks of a series of chemicals and pushing for less stringent policies that threaten our safety.

In short, Dourson pushes counterfeit science, is unfit to protect us from dangerous chemicals, and is a toxic choice for our health and safety.

A litany of toxic decisions

Consider the 2014 Freedom Industries chemical spill into the Elk River in Charleston, West Virginia, which contaminated drinking water supplies for 300,000 people with MCHM and PPH—two chemicals used to clean coal.

After the spill, Dourson’s company, TERA, was hired by the state to put together a health effects panel, with Dourson as its chair. He did not disclose, however, that he had also been hired by the very companies that manufactured those chemicals, a fact that came out only later, when he was questioned by a reporter.

Dourson was also involved in helping set West Virginia’s “safe” level in drinking water for a chemical used to manufacture Teflon (Perfluorooctanoic acid (PFOA), or “C8”). It was 2,000 times higher than the standard set by the EPA.

In 2015, Dourson provided testimony for DuPont in the case of a woman who alleged that her kidney cancer was linked to drinking PFOA-contaminated water from the company’s Parkersburg, WV plant. Just this year, DuPont settled for $670 million with plaintiffs from the Ohio Valley who had been exposed to PFOA from the same plant, after the independent C8 Science Panel of epidemiologists found “probable links” between PFOA and kidney and testicular cancer, as well as high blood pressure, thyroid disease, and pregnancy-induced hypertension.

From 2007 to 2014, Dourson and TERA also worked closely with Michael Honeycutt and the Texas Commission on Environmental Quality to loosen two-thirds of the already weak protections for 45 different chemicals.

The list of toxic decisions made by Dourson and his team goes on and clearly makes him an unacceptable choice for a leadership role at the agency charged with protecting public health and the environment.

A worst-case choice

During Dourson’s hearing before the Senate Committee on Environment and Public Works (EPW), his answers to questions about recusing himself from decisions on chemicals that TERA previously assessed in close collaboration with industry were cagey at best. It appears that because much of his work was done through the University of Cincinnati, he will not be expected to recuse himself from decisions about chemicals his research team was paid by industry to assess in the past. His ethics agreement confirms this. So much for the Trump administration’s draining of the swamp.

If Dourson is confirmed, I have no doubt that his appointment will be repeatedly cited as a worst-case-scenario of the revolving door between industry representatives and the government.

His work over the past few decades has been destructive enough, even from a position with little power to help the chemical industry directly skirt tough regulations. Putting him in charge of the office that is supposed to protect the public from toxins would be a grave mistake with national ramifications.

In the coming years, Dourson’s office will be making decisions about safety thresholds and key regulatory actions on chlorpyrifos, neonicotinoids, flame retardants, asbestos, and the other priority chemicals under the Toxic Substances Control Act. There is no room for error, and unfortunately, error is likely with someone like Dourson in charge.

We join with many other members of the scientific community to oppose Michael Dourson for Assistant Administrator of OCSPP and to ask senators to vote no in upcoming committee and confirmation votes.

Pruitt Steps Up His Attack on Biofuel Policies

Molecular biologist Z. Lewis Liu (foreground) and technician Scott Weber add a new yeast strain to a corncob mix to test the yeast’s effectiveness in fermenting ethanol from plant sugars. Photo: U.S. Department of Agriculture (USDA) Agricultural Research Service (ARS) CC-BY-2.0 (Flickr)

It was just six weeks ago that I last posted on how Pruitt’s EPA Undermines Cellulosic Biofuels and Transparency in Government, and I had hoped to shift my attention to other topics. But in late September, EPA Administrator Pruitt stunned the biofuels world by releasing a rulemaking document (called a Notice of Data Availability, or NODA) suggesting he planned to cut more deeply into the Renewable Fuel Standard (RFS) 2018 targets for advanced biofuels and biodiesel than had been previously indicated.

The NODA linked the changes to tariffs recently imposed on imports of soy-based biodiesel from Argentina and palm oil biodiesel from Indonesia, but citations in the NODA make it plain that this request comes directly from the oil refiners.

There are also rumors that EPA may count ethanol that is already being exported toward compliance with the standard, which would also reduce the obligations for refineries to blend ethanol or other biofuels into the fuel they sell.  Overall, these changes upend the basic understanding of the goals and requirements of the RFS and seem intended primarily to reduce costs for refineries.

UCS does not support the approach the NODA suggests. This might seem odd, since we have been arguing against the increased use of both corn ethanol and vegetable oil-based biodiesel for many years. But while there are plenty of problems with food-based biofuels, ignoring the law and considering only how to reduce costs for oil refiners is not the way to fix them.

UCS has opposed discretionary enlargement of biodiesel mandates beyond statutory levels

Some parts of the RFS offer more benefits than others.  Cellulosic biofuels can expand biofuel production with greater climate benefits and lower environmental costs than food-based biofuels like corn ethanol and vegetable oil biodiesel.  But cellulosic biofuels have not scaled up nearly as fast as the RFS envisioned, which left the EPA to decide whether to backfill the shortfall of cellulosic biofuels with other biofuels, especially biodiesel.

Since 2012 we have argued that the EPA should not make discretionary enlargements to the advanced biofuel mandate to replace the shortfall of cellulosic fuels without careful consideration of potential unintended consequences.

Even without a discretionary enlargement, the minimum statutory levels of advanced biofuels that Congress specified in the RFS are ambitious, and are drawing heavily on available sources of vegetable oil and waste oils (called feedstocks) to make biodiesel and renewable diesel, which, as Scott Irwin and Darrel Good at FarmDocDaily have explained, have for several years provided the marginal gallon of biofuel to meet the mandates for conventional and advanced biofuel under the RFS.

Analysis we commissioned in 2015 and more recent analysis from the International Council on Clean Transportation suggest there are not sufficient feedstocks to support higher levels of production. As I have explained in previous posts and a technical paper, the indirect effect of a large expansion of biodiesel is to expand demand for palm oil, which has environmental harms that outweigh the benefits of offsetting diesel use in the U.S.

But we don’t support Pruitt’s effort to cut mandates below statutory levels

It might seem logical that if expanding mandates is a bad idea, then cutting them must be a good idea.  One can certainly make a logical argument that cutting the RFS advanced biofuel mandate will reduce demand for vegetable oil which could result in lower overall demand for palm oil and hence reduce deforestation in Southeast Asia. But there are two big problems with this approach.

First, what Pruitt is proposing is clearly inconsistent with the law.  Despite repeated claims that he will follow the law, the administrator’s actions are subverting the basic goal of the Renewable Fuel Standard, which is to expand the market for biofuels.

Until Congress updates it, the Renewable Fuel Standard is the law, and UCS’s input to the EPA has always focused on how the EPA can maximize climate benefits consistent with the law. We explained why exceeding the minimum statutory levels for food-based biofuels would have unintended consequences, but we have not argued that the EPA should go below those levels, because doing so would be clearly inconsistent with the law.

When corn prices spiked back in 2012, we supported a temporary RFS waiver, which was both consistent with the waiver provisions of the law and supported by the circumstances. But today we are not facing a crisis in grain, vegetable oil, or fuel markets. Jonathan Coppess and Scott Irwin at FarmDocDaily have evaluated the legal and economic grounds to waive the standard, and found no compelling case. Rather, we have a crisis in leadership – in the White House and at the EPA, where Administrator Pruitt is hostile to the basic goals of the agency he leads. In that context, Pruitt’s proposed actions seem less like an opportunity to reduce the harms of food-based biofuels than like a clear subversion of the basic goals of the law in the service of oil industry profits.

Second, political games are risky, and in the present context, climate advocates have a lot more to lose than to gain.  President Trump made repeated promises to protect ethanol, which stands in stark contrast to his position on protecting the United States from climate change.

Pruitt has been not-so-subtly hinting at a deal whereby the Trump administration promotes ethanol exports and treats ethanol favorably in upcoming fuel economy standards in exchange for the biofuels industry’s acquiescence to weakening the RFS. Trading the RFS for loopholes in fuel economy standards would be a bad deal for the future of the biofuels industry and a terrible deal for the environment.

A previous loophole added to fuel economy regulations to promote ethanol sales was a failure: by making cars less efficient, it ultimately did much more to increase gasoline use than to expand ethanol use. A long-term future for the biofuels industry depends on avoiding counterproductive outcomes and helping to cut oil use, and Pruitt is clearly not headed in this direction. While there is some similarity between UCS’s specific guidance on biodiesel targets and Pruitt’s latest pivot on the RFS, we strongly object to his approach to cellulosic biofuels, his narrow vision for the RFS that focuses solely on current fuel prices, and the direction Pruitt is taking the EPA.

Blowing up current biofuels policy is not much of a plan

Some who support climate policy espouse the idea that the RFS is a failed policy, and that it is mostly just a giveaway to agricultural interests, so letting it collapse is not much of a loss. I disagree. The RFS is certainly shaped by the political power struggle between the oil industry and the biofuels industry/agriculture, but it also includes important environmental protections. For example, the RFS requires that future biofuel expansion come from advanced fuels that cut emissions at least 50% compared to gasoline. But with the environmental goals of the policy sidelined by the hostile takeover of the EPA by Administrator Pruitt, the current battle comes down to a stark choice between working with the oil industry to undermine the basic structure of the RFS, or keeping that framework intact until we have an opportunity to meaningfully improve it.

New laws generally build upon existing legal frameworks, and, if it survives, the RFS is likely to be the foundation on which future fuels policies are built.  If the RFS dies under the knife of the Pruitt EPA, the concessions the Trump administration offers the ethanol industry will not include the environmental protections in the RFS, however imperfect.  Moreover, the RFS and state fuel policies support one another, and if the RFS is weakened it will make the California and Oregon clean fuel policies more challenging and expensive.

UCS is not lending our support to Pruitt’s lawless approach to rewriting our vehicle and fuel policies.  Instead we will defend existing laws and build upon them once we have an administrator who understands that the core mission of the Environmental Protection Agency is to protect the environment rather than doing the bidding of the oil industry and other polluters.

New UCS Report Finds High Health Risks in Delaware Communities from Toxic Pollution

Refineries, such as the Delaware City Refinery shown here, can emit toxic chemicals that can increase risks for cancer and respiratory disease.

For decades residents of communities in Wilmington, Delaware’s industrial corridor have dealt with high levels of pollution. People in these communities, which have higher percentages of people of color and/or higher poverty levels than the Delaware average, are also grappling with health challenges that are linked to, or worsened by, exposure to pollution, such as strokes, heart disease, sudden infant death syndrome, and chronic childhood illnesses such as asthma, learning disabilities, and neurological diseases. These are some of Delaware’s environmental justice communities.

To assess the potential link between environmental pollution and health impacts in these communities, the Center for Science and Democracy at UCS collaborated with the Environmental Justice Health Alliance, Delaware Concerned Residents for Environmental Justice, Community Housing and Empowerment Connections, Inc., and Coming Clean, Inc. Using Environmental Protection Agency (EPA) data, we analyzed the following health and safety issues: the risk of cancer and potential for respiratory illnesses stemming from toxic outdoor air pollution; the proximity of communities to industrial facilities that use large quantities of toxic, flammable, or explosive chemicals and pose a high risk of a major chemical release or catastrophic incident; the proximity of communities to industrial facilities with major pollution emissions; and the proximity of communities to contaminated waste sites listed in the EPA’s Brownfield and Superfund programs.

The seven communities analyzed—Belvedere, Cedar Heights, Dunleith, Marshallton, Newport, Oakmont, and Southbridge—were compared to Greenville, a predominantly White and affluent community located outside the industrial corridor, and to the population of Delaware overall. The findings from this analysis have been published in a new report titled Environmental Justice for Delaware: Mitigating Toxic Pollution in New Castle County Communities.

Proximity to major pollution sources and dangerous chemical facilities

TABLE 5. Sources of Chemical Hazards and Pollution in Environmental Justice Communities Compared with Greenville and Delaware Overall. Note: All facilities are located within 1 mile of communities. SOURCE: Environmental Protection Agency (EPA). No date. EPA state combined CSV download files. Online at www.epa.gov/enviro/epastate-combined-csv-download-files, accessed May 18, 2017.

Dunleith and Oakmont have several Brownfield sites and are in close proximity to facilities releasing significant quantities of toxic chemicals into the air. Southbridge has, within its boundaries or within a one-mile radius around it, two high-risk chemical facilities, 13 large pollution-emitting industrial facilities, four Superfund sites, and 48 Brownfield sites. Southbridge is home to more than half of all Brownfields in Delaware. Cedar Heights and Newport also have several large pollution-emitting facilities within one mile and are close to two EPA Superfund contaminated waste sites.

Effects of toxic air pollution on cancer risks and the potential for respiratory illnesses

TABLE 2. Cancer Risks for Environmental Justice Communities Compared with Greenville and Delaware Overall. Note: Cancer risk is expressed as incidences of cancer per million people. For the respiratory hazard index, a value of 1 or less indicates a level of studied pollutants the EPA has determined not to be a health concern, while a value greater than 1 indicates the potential for adverse respiratory health impacts, with concern increasing as the value increases. SOURCE: Environmental Protection Agency (EPA). 2015. 2015 National Air Toxics Assessment. Washington, DC. Online at www.epa.gov/national-air-toxics-assessment, accessed May 18, 2017.

Of the seven environmental justice communities studied, people in Marshallton face the highest cancer and respiratory health risks. Cancer and respiratory health risks there are 33 and 71 percent higher, respectively, than for the comparison community Greenville, and are 28 and 55 percent higher than for Delaware overall.

The communities of Dunleith, Oakmont, and Southbridge, whose residents are predominantly people of color and have a poverty rate approximately twice that of Delaware overall, have cancer risks 19 to 23 percent higher than for Greenville and 14 to 18 percent higher than for Delaware overall. Respiratory hazard in these three communities is 32 to 43 percent higher than for Greenville and 20 to 30 percent higher than for Delaware overall.

For Newport, Belvedere, and Cedar Heights, which have a substantial proportion of people of color and poverty rates above the Delaware average, cancer risks are 21, 15, and 12 percent higher than for Greenville, respectively, and are 16, 10, and 7 percent higher than for Delaware overall. Respiratory hazard in Newport, Belvedere, and Cedar Heights is 44, 30, and 24 percent higher than for Greenville, respectively, and 31, 18, and 13 percent higher than for Delaware overall.
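
For readers who want to see how these “percent higher” comparisons are computed from the tables’ underlying values (incidences of cancer per million people), here is a small Python sketch. The input numbers are placeholders for illustration, not the report’s actual data:

    def percent_higher(community_risk, reference_risk):
        """How much higher, in percent, a community's risk is than a reference."""
        return (community_risk / reference_risk - 1) * 100

    # Hypothetical cancer risks, in incidences per million people:
    community, greenville = 40.0, 30.0
    print(f"{percent_higher(community, greenville):.0f}% higher")  # prints: 33% higher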

Children at risk

Kenneth Dryden of the Delaware Concerned Residents for Environmental Justice and a former Southbridge resident leads a tour of toxic facilities to teach scientists and community members about the dangers of local air pollution.

Children are especially vulnerable to the effects of toxic air pollution. Particularly concerning is that seven schools within one mile of Southbridge, with a total of more than 2,200 students, are in locations with substantially higher cancer risks and potential respiratory hazards than schools in all other communities in this study.

In addition to having daily exposure to toxic pollution in the air, children in these communities are at risk of being exposed to toxic chemicals accidentally released from hazardous chemical facilities in or near their communities. For example, the John G. Leach School and Harry O. Eisenberg Elementary School near Dunleith, with a total of 661 students, are located within one mile of a high-risk chemical facility.

Achieving environmental justice for vulnerable communities

Drawing on multiple EPA databases, this study finds that people in the seven communities along the Wilmington industrial corridor face a substantial potential cumulative health risk from (1) exposure to toxic air pollution, (2) proximity to polluting industrial facilities and hazardous chemical facilities, and (3) proximity to contaminated waste sites. These health risks are substantially greater than those for residents of a wealthier and predominantly White Delaware community and for Delaware as a whole.

This research provides scientific support for what neighbors in these communities already know—that they’re unfairly facing higher health risks. We need to listen to communities and the facts and enact and enforce the rules to protect their health and safety. Environmental justice has to be a priority for these and other communities that face disproportionately high health risks from toxic pollution.

Ron White is an independent consultant providing services in the field of environmental health sciences. Mr. White currently is a Senior Fellow with the Center for Science and Democracy at the Union of Concerned Scientists, and also holds a part-time faculty appointment in the Department of Environmental Health and Engineering at the Johns Hopkins Bloomberg School of Public Health. He earned his Master of Science in Teaching degree in environmental studies from Antioch University, and a Bachelor of Arts degree in environmental science from Clark University.  

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Why Cap and Invest Is the Right Solution for Massachusetts Transportation

Boston Traffic. Photo: Sarah Nichols CC-BY-2.0 (Flickr)

Over the past decade, Massachusetts has helped lead the nation towards clean and renewable sources of electricity.

Under the Global Warming Solutions Act, Massachusetts established the strongest legally binding limits on global warming pollution in the country. Massachusetts leadership helped establish the first regional limits on pollution from power plants through the Regional Greenhouse Gas Initiative. We have the most efficient economy in the country, saving consumers millions on our energy bills. We have abolished the use of coal, we have created over 100,000 clean energy jobs, and last year Massachusetts made an investment in offshore wind that will make us a national leader in that technology.

But most of the progress that we have made is in the electricity sector. When it comes to transportation, we have tried but struggled to make overall progress.

Our cars and trucks, rather than our power plants, are now the largest source of pollution in Massachusetts. Every year, pollution from transportation causes over 3,000 asthma attacks, 500 preventable deaths, and $1.3 billion in combined health costs in the state. While emissions from electricity are down by 58% overall, transportation emissions are about the same as they were in 1990.

To its credit, the Baker administration has recognized that meeting our long-term climate mandates requires more ambitious action to control transportation emissions. The administration has announced four listening sessions to solicit ideas for how to address pollution from transportation. According to Energy and Environmental Affairs Secretary Matt Beaton, “Our next target for new policies that will lead to further reductions is the transportation sector and we’re looking forward to rolling up our sleeves and finding solutions.”

Massachusetts needs a better, cleaner transportation system

Of course, pollution is one of many challenges facing Massachusetts’ transportation system these days.

You can’t pick up a newspaper today without reading about some of the challenges affecting transportation in the state: our public transportation services are underfunded and overcrowded; our roads are among the most congested in the country; our transportation agencies are broke; low-income communities are poorly served by transit; and communities near public transportation are increasingly unaffordable.

Here are just a few articles that have been written about some of the challenges affecting transportation and housing in Massachusetts over the past month:

What these stories have in common is that they all show the critical role that the inter-related issues of transportation, housing and climate change will play in the future of Massachusetts.

Massachusetts needs a public transportation system that businesses and workers can rely on to connect people to jobs and opportunities. We need to be able to provide enough affordable housing near transit to retain talented young professionals and protect low-income residents from displacement and gentrification. As recent storms have demonstrated, we need to protect our transportation system from the impacts of a changing climate by keeping our infrastructure in good repair. And to achieve our climate goals, we need to transition virtually our entire vehicle fleet to cars and trucks that do not pollute.

We need, in short, dramatic and transformative change in our transportation system.

We can do better

The good news is that today we have more tools at our disposal to address transportation challenges than ever before. Exciting technologies such as electric vehicles offer the promise of cars and trucks and buses that can operate without tailpipe emissions and that can be powered by clean energy. Thanks to our relatively clean grid, in Massachusetts EVs can get the emissions equivalent of a 100 mpg vehicle.

New transportation modes such as ride-sharing and automated vehicles open up new possibilities for greater system efficiency – as well as potential new challenges that will need to be addressed through smart policy. Transit ridership is growing faster in Boston than in any other major transit system. And a younger generation is coming of age that shows ever greater interest in transit, cycling, and urban living.

Together, these present-day technologies and trends point towards a possible future still on the horizon, if we make the right investments today in clean transportation.  A transportation system that does more but costs less and pollutes less. Where a network of shared, electric vehicles, working in concert with a first-class public transportation system, gets everybody where they need to go without burning a gallon of gasoline or getting stuck for an hour in traffic.

So how do we get there from here?

Obviously no single policy has the ability to address all the challenges facing our transportation sector. Creating a better, cleaner transportation system will require multiple policies and coordination between state and local government and key stakeholders.

But one great place to start would be to join with the other states in the Northeast in launching a cap and invest program modeled after the Regional Greenhouse Gas Initiative (RGGI) covering transportation emissions.

RGGI is a program with a track record of success in reducing emissions while growing the economy and saving consumers money. Under RGGI, the Northeast region has established limits on emissions from power plants, limits that must decline every year. These limits are enforced through a requirement that power generators purchase allowances from within a limited pool. The funds generated by these allowance sales are then invested in clean energy and efficiency programs.
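
To make the mechanics concrete, here is a minimal Python sketch of that cap-and-invest accounting: a cap that declines every year, one allowance required per ton emitted, and auction proceeds earmarked for investment. All numbers are made up for illustration and are not RGGI’s actual caps, prices, or auction rules:

    cap_tons = 10_000_000    # hypothetical first-year emissions cap, in tons
    decline_rate = 0.03      # assumed 3% annual cap reduction
    price_per_ton = 5.00     # hypothetical auction clearing price per allowance

    for year in range(2018, 2023):
        revenue = cap_tons * price_per_ton   # one allowance per ton under the cap
        print(f"{year}: cap {cap_tons:,.0f} tons -> ${revenue:,.0f} to invest")
        cap_tons *= 1 - decline_rate         # the limit declines every year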

RGGI is a funding source for a variety of programs that have saved money and improved lives in Massachusetts.

Funding from RGGI is used to support the MassSave program, which has provided home energy audits and rebates for home retrofits and energy efficient appliances for thousands of households across the Commonwealth. Through the Green Communities Act, RGGI helps engage local government and local grassroots activists around concrete local energy projects in 155 communities across Massachusetts, such as upgrading the boiler at the local school or putting in LED streetlights.

By investing in efficiency, RGGI has saved consumers over $600 million on their energy bills – with billions in additional savings expected in years to come. Overall, RGGI has helped cut emissions in the Northeast region by over 37 percent, while expanding the Northeast economy by $2.9 billion. In Massachusetts, RGGI has produced over $1 billion in health benefits and created over 2,000 jobs.

What would cap and invest mean for Massachusetts?

The biggest limitation of the RGGI program is that it only applies to power plants. But other jurisdictions, including California, Ontario and Quebec, have successfully expanded the cap and invest program model to include transportation fuels, and the result has been billions of dollars in new investments in clean transportation.

California, for example, is projected to spend over $2 billion on clean transportation and affordable housing investments over the next year. These investments will go to a variety of programs designed to increase access to clean mobility solutions for California residents, including:

  • Expansion of light rail service in every major metropolitan area.
  • Improved bus service, including zero-emission bus service, in dozens of cities, towns and rural counties.
  • Aggressive incentive programs to make it easier for low-income residents to trade in inefficient vehicles for hybrids or electric vehicles.
  • Investments in affordable housing near public transportation.

A cap and invest program covering transportation emissions could potentially raise up to $4.7 billion in funding for similar programs in the Northeast. For Massachusetts, that could mean over $120 million per year in dedicated funding for clean vehicle incentives, $120 million in affordable housing initiatives, and $225 million to improve public transportation.
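
Tallying just the Massachusetts estimates quoted above gives a sense of scale; a trivial sketch (these line items are the “could mean” projections from this post, not enacted budgets):

    ma_estimates_millions = {
        "clean vehicle incentives": 120,
        "affordable housing initiatives": 120,
        "public transportation improvements": 225,
    }
    total = sum(ma_estimates_millions.values())
    print(f"Massachusetts total: ~${total} million per year")  # ~$465 million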

Let’s make it happen

We can have a cleaner and more efficient transportation system in Massachusetts and other Northeast states – and with policy leaders looking closely at bringing cap and invest into transportation, now is the time to engage in this effort.

Massachusetts will conduct four listening sessions over the next few weeks to generate feedback from the public on clean transportation. These sessions will be held:

  • Tuesday, October 31, 9:00am, State Transportation Building, 10 Park Plaza, Boston, MA
  • Thursday, November 2, 6:00pm, MassDEP Central Region Office, 8 New Bond Street, Worcester, MA
  • Monday, November 6, 11:00am, UMass-Amherst, Student Union – Cape Cod Lounge, 280 Hicks Way, Amherst, MA
  • Thursday, November 9, 6:00pm, West Middle School, 271 West Street, Brockton, MA

Advocates will also be hosting a webinar to talk in more detail about the proposed policy.

We encourage everyone with a stake in a better transportation system in Massachusetts – which is to say, everyone in Massachusetts – to come to these events and make their voices heard.


In New Mexico, Facing the Question of What Comes After Coal

Photo: WildEarth Guardians/Creative Commons (Flickr)

Change is coming to New Mexico.

As recently as 2011, coal accounted for more than 70 percent of in-state electricity generation; now it’s under 60 percent, and falling fast. Coal simply cannot compete in the face of cleaner, cheaper resources coming online.

But with this change comes opportunity. New Mexico has a chance now, before its coal plants and coal mining operations have closed, and before jobs have been lost, to chart an intentional path toward a clean energy future that is considerate of both the benefits and challenges that such a transition will bring. By committing to an energy plan dominated by renewables, policymakers in the state can secure good jobs, significant capital investment, and a brighter, cleaner, and healthier world for all New Mexicans.

And as highlighted in our new analysis, Committing to Renewables in New Mexico: Boosting the State’s Economy, Generating Dividends for All, this can all be achieved while keeping costs for consumers affordable, and electricity service reliable.

Recognizing the imminent transition ahead

In New Mexico, it is no longer a question of whether the state’s coal plants will retire, but when. This past summer, the state’s largest utility, Public Service Company of New Mexico (PNM), concluded that its most cost-effective portfolio of resources was the one that was entirely coal free. From a company that just a year prior had been staunchly defending its need to keep coal plants running, this announcement marked a stunning turn.

The question that follows, though, is what gets built to fill the gaps?

New Mexico has a nearly unparalleled array of renewable resource potential available to it, from strong and steady winds, to countless days of uninterrupted sun, to ready access to geothermal. These incredible resources mean that for the state, developing clean energy is particularly cost-competitive. And project developers have been flocking to New Mexico in response—right now, more than 1,800 MW of wind are under construction or in advanced stages of development.

The trouble is, a number of these clean energy projects, and the ones that have preceded them, have been built to serve out-of-state customers, not New Mexicans. Slowly the state’s utilities have been awakening to the cost-saving potential of investing in these resources themselves. But that interest risks being overshadowed by some utilities’ calls for a much larger buildout of natural gas.

Critically, our analysis shows that a growing dependence on natural gas would be short-sighted, and not in the best interest of consumers.

Studying the horizon, and finding all signs point to renewables

We set out to understand the different electricity pathways the state could take as coal plants retire and new resources are brought online to replace them. We found that no matter how you slice it, the least-cost future is one characterized by a high level of renewables. Indeed, with or without a strengthened renewable policy in place, our research found that renewables—and not natural gas—provided the best deal for consumers and the New Mexican economy.

So why the need for a policy, when the market suggests either way leads to green?

Because these market-based findings run counter to some utility plans in the state, which propose to keep building out natural gas over time. A policy commitment to a high-renewables future, on the other hand, makes sure that these clean energy opportunities are diligently considered and pursued.

And what incredible opportunities they are.

When we modeled steadily strengthening the state’s existing renewable portfolio standard (RPS) from its current target of 20 percent by 2020 to 50 percent by 2030 and 80 percent by 2040 (a trajectory sketched in code after the list below), we found that the policy could deliver widespread benefits for New Mexicans, including:

  • Significant capital investment, on the order of $6 billion between 2016 and 2030 and $7.2 billion between 2017 and 2040, funding the development of 2,200 megawatts (MW) of wind and 870 MW of solar by 2030, and total on-the-ground capacity reaching 3,650 MW of wind and 3,900 MW of solar in 2040.

  • Investments in wind and solar driving the creation of nearly 2,400 new direct, indirect, and induced jobs in construction, operations, maintenance and other related fields by 2030, as well as the annual potential for $9.5 million in land-lease payments by that time.
  • Affordable electricity for consumers, with typical monthly electric bills for households in most years lower than they were in 2016.
  • Cleaner air leading to improved health—savings from the reduction in SO2 and NOx health effects alone could total approximately $305 million by 2030—and reduced water consumption on the order of 90 percent from coal plant retirements.
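
As referenced above the list, the modeled RPS trajectory steps from 20 percent in 2020 to 50 percent in 2030 and 80 percent in 2040. A minimal Python sketch of that schedule, assuming a straight-line ramp between milestones (the actual legislation’s interim targets may differ):

    MILESTONES = [(2020, 20), (2030, 50), (2040, 80)]

    def rps_target(year):
        """Linearly interpolated renewable share (%) for a given year."""
        for (y0, t0), (y1, t1) in zip(MILESTONES, MILESTONES[1:]):
            if y0 <= year <= y1:
                return t0 + (t1 - t0) * (year - y0) / (y1 - y0)
        raise ValueError("year outside the modeled 2020-2040 window")

    print(rps_target(2025))  # 35.0, halfway between the 2020 and 2030 targets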

It’s clear that when the state commits to a clean energy future, the benefits and opportunities are significant, and long-lasting.

Good policy is needed to point the way

In New Mexico, when it comes to strengthening the state’s existing RPS, the goal is not to pick winners—it’s to ensure that winners will be picked. It’s also about defending against the alternative, where a growing dependence on natural gas risks saddling ratepayers long into the future with the costs of stranded assets, or infrastructure that would be abandoned before it had been paid off due to the country’s inevitable shift away from fossil fuels.

Last legislative session, SB 312 was introduced to strengthen the RPS, as modeled in this analysis. The effort ultimately stalled, but it’s expected to be revisited in future sessions. Policymakers would do well to take the time between to strongly consider how such a policy can leverage the investment benefits of regulatory certainty, and how that can help keep utilities pushing forward with clean energy progress.

Photo: Mr.TinDC/Creative Commons (Flickr).

At the same time, achieving a clean energy future in New Mexico requires more than any single policy can deliver. For example, the simultaneous strengthening of the state’s energy efficiency resource standard would bring down costs across the board.

An increased focus on demand-side solutions, such as broader implementation of time-varying electricity rates and targeted load-shifting measures like the electrification of water heaters, can similarly ease the integration of high levels of renewables.

So too can energy storage, as well as a proactive planning process to ensure that necessary transmission expansions are supported. Participation in broader energy markets can help balance loads, and save customers money. Finally, focused attention must be devoted to worker retraining, and developing viable and vibrant economic futures for communities currently dependent on coal.

Opportunity awaits. Policymakers have the chance to be proactive and actualize that positive potential now, and they should—not just for the benefit of New Mexicans today, but also for decades to come.

USDA Reorganization Sidelines Dietary Guidelines

Photo: Cristie Guevara/public domain (BY CC0)

Last month, Secretary of Agriculture Sonny Perdue announced a number of proposed changes to the organization of the vast federal department he oversees. With its 29 agencies and offices and nearly 100,000 employees, the US Department of Agriculture (USDA) is charged with a wide-ranging mission, from helping farmers to be profitable and environmentally sustainable to ensuring the nutritional well-being of all Americans. And while some of the organizational changes Secretary Perdue is pursuing (which all stem from a March executive order from President Trump) may seem arcane, they will have real impacts on all of us. The proposed merger of two key nutrition programs is a case in point.

Photo: US Department of Agriculture/Public domain (Flickr)

The plan involves relocating the USDA’s Center for Nutrition Policy and Promotion (CNPP) into the department’s Food and Nutrition Services (FNS). While FNS is well-known in anti-hunger and agricultural communities for its role in administering nutrition assistance programs, including the Supplemental Nutrition Assistance Program (SNAP), CNPP is less so—though not for lack of impact or importance.

Established in 1994, CNPP is the agency responsible for reviewing and compiling the best available scientific literature on human nutrition, developing measures of dietary quality such as the Healthy Eating Index, and (jointly with the Department of Health and Human Services) issuing the Dietary Guidelines for Americans, the cornerstone of federal nutrition policy and dietary guidance. At a time when more than 117 million Americans—half of all adults—are living with one or more preventable, diet-related chronic diseases, the role that CNPP plays in protecting public health has never been more critical.

Reorganization compromises health without achieving efficiency

In the words of Perdue himself, the proposed reorganizations are aimed at making the USDA “the most effective, most efficient, and best managed department in the federal government.”

To be clear, reorganization (or “realignment”) is not an inherently bad thing. Proposals that could successfully increase the effectiveness and accountability of government agencies without compromising mission or purpose would be laudable. But merging CNPP into FNS accomplishes neither—and follows a dangerous pattern of this administration pushing back on science with its policy agenda. Furthermore, the merger poses serious threats to the scientific integrity of the agency charged with developing evidence-based dietary guidelines for the entire country, for several key reasons:

  1. FNS and CNPP serve distinctly different purposes. FNS administers 15 food and nutrition programs targeting distinct populations, serving only a fraction of Americans. CNPP develops science-based recommendations designed to identify nutritional deficiencies and address dietary needs at a population level, which are then applied to dozens of programs across the federal government. To conflate the distinct purposes of each agency would be to detract from the efficiency of each.
  2. The CNPP administrator will lack appropriate credentials to oversee the development of evidence-based national nutrition guidelines. Following the reorganization, CNPP would no longer be headed by a politically-appointed administrator, but instead by a career associate administrator. This individual is highly unlikely to possess the education and level of expertise required by this position.
  3. Merging CNPP into FNS introduces a conflict of interest. Nutrition programs administered by FNS must adhere to dietary recommendations established by CNPP, introducing a potential conflict of interest. Without clear separation between CNPP and FNS, undue influence on the former by the latter—or even the perception thereof—would present a threat to the integrity of evidence-based recommendations.

The USDA received public comments on this issue between September 12 and October 10. The full comment authored by the UCS Food and Environment Program, outlining the risks to scientific integrity and population health posed by the proposed reorganization, follows.

UCS Comments on USDA Notice, “Improving Customer Service”

October 10, 2017

Dear Secretary Perdue and Acting Deputy Assistant Secretary Bice,

On behalf of the Union of Concerned Scientists (UCS), we are compelled to respond to the United States Department of Agriculture (USDA) notice, “Improving Customer Service,” with concerns regarding the proposed merging of the Center for Nutrition Policy and Promotion (CNPP) into the Food and Nutrition Services (FNS). This proposed action would threaten the scientific integrity of CNPP and compromise public health, while providing zero demonstrable financial or public benefit.

UCS, a science-based nonprofit working for a healthy environment and a safer world, combines independent scientific research and citizen action to develop innovative, practical solutions and secure responsible changes in government policy, corporate practices, and consumer choices. The Food and Environment Program at UCS makes evidence-based policy recommendations to shift our nation’s food and agriculture system to produce healthier, more sustainable and just outcomes for all Americans.

CNPP’s evidence-based recommendations play a critical role in protecting population health.
The current state of US population health poses enormous costs both to quality of life and health care systems. More than 117 million Americans—half of all adults—are now living with one or more preventable, diet-related chronic diseases, including cardiovascular disease, hypertension, diabetes, overweight/obesity, and certain types of cancer. Recent research shows that dietary factors may now play a role in nearly half of all deaths resulting from heart disease, stroke, and type 2 diabetes. In 2012, the direct medical expenses and lost productivity due to cardiovascular disease alone averaged $316 billion, while those due to diagnosed diabetes totaled $245 billion. In total, chronic diseases account for approximately 86 percent of all US health care expenditures.

However, just as diet is a key factor driving these trends, it also offers great potential to reverse them. The federal government has a critical role to play in promoting health and reducing the burden of chronic disease by supporting evidence-based policies and programs that improve the dietary patterns of Americans. For more than twenty years, CNPP has filled this role. The Nutrition Evidence Library (NEL) at CNPP applies rigorous scientific standards to conduct systematic reviews of current nutrition research, and informs a range of federal nutrition programs, including the National School Breakfast Program, National School Lunch Program, Special Supplemental Nutrition Program for Women, Infants and Children (WIC), and the Supplemental Nutrition Assistance Program (SNAP). Working jointly with the Department of Health and Human Services (DHHS), CNPP is also responsible for overseeing the development of the Dietary Guidelines for Americans, the nutrition recommendations that are a cornerstone of federal nutrition policy and dietary guidance. As an autonomous agency, CNPP is well positioned to deliver unbiased and scientifically sound recommendations to other federal agencies and to the general public.

The proposed merger is unlikely to result in increased efficiency.
As stated in USDA-2017-05399, Executive Order 13781, “Comprehensive Plan for Reorganizing the Executive Branch,” was intended to improve efficiency, effectiveness, and accountability through agency reorganization. However, there is no duplication of function between CNPP and FNS. FNS administers 15 food and nutrition programs targeting distinct populations, serving only a fraction of Americans. CNPP develops science-based recommendations designed to identify nutritional deficiencies and address dietary needs at a population level, which are then applied to dozens of programs across the federal government. To conflate the distinct purposes of each agency would be to detract from the efficiency of each. Changes in allocation of resources from restructuring would also threaten the ability of CNPP to conduct its mission.

The proposed merger threatens the scientific integrity of CNPP, compromising its core function.
Merging CNPP into FNS will weaken the ability of the USDA to provide the most current evidence-based nutrition guidance to federal food and nutrition programs. The change would also jeopardize the ability of CNPP to comply with Congressional mandates, chiefly the National Nutrition Monitoring and Related Research Act of 1990, which requires the establishment of dietary guidelines at least once every five years and the promotion of these guidelines by any federal agency carrying out a federal food, nutrition, or health program.

The proposed reorganization would degrade the scientific integrity and core function of CNPP, particularly if:

  1. The CNPP administrator lacks appropriate credentials to guide the development of science-based recommendations, including the Dietary Guidelines for Americans (DGA).
    The CNPP administrator has previously been appointed by the Food, Nutrition, and Consumer Services program. With the proposed reorganization, this position would be filled by a career official lacking necessary technical expertise. In its recent review of the DGA process, the National Academy of Sciences, Engineering, and Medicine (NAS) stated that it is of critical importance that “the DGA be viewed as valid, evidence-based, and free of bias or conflict of interest.” As the individual responsible for overseeing management of the NEL and development of DGAs and other science-based recommendations, the CNPP administrator must have strong credentials, including a background in dietetics, nutrition, medicine, and/or public health, with demonstrated experience relevant to nutrition science/research, population health, chronic disease prevention, epidemiology, economics, surveillance systems, and nutrition communications and marketing. This individual must also possess experience in advanced management and budget oversight; continuous quality improvement; program planning; implementation and evaluation; data analytics; information technology; and public policy.
  2. There is inadequate separation of agency function, diminishing the autonomy of CNPP.
    The application of dietary recommendations in programs administered by FNS introduces a potential conflict of interest. Without clear separation between CNPP and FNS, undue influence on the former by the latter—or even the perception thereof—would present a threat to the integrity of evidence-based recommendations. The development of the DGAs and the USDA Food Plans (e.g., the Thrifty Food Plan) is of particular concern, as both inform programs administered by FNS.

The Union of Concerned Scientists appreciates the USDA’s efforts to increase the effectiveness and accountability of government agencies. However, merging CNPP into FNS accomplishes neither goal. The ability of CNPP to effectively and independently fulfill its mission of developing evidence-based dietary guidelines without undue influence may be compromised by: 1) the replacement of an appointed administrator with a career associate administrator who may not possess the qualifications needed to oversee the development of science-based federal nutrition recommendations; and 2) the inherent conflict of interest that arises from FNS oversight of CNPP, as the latter develops guidelines that the former must adhere to in implementing various nutrition programs.

Given the alarming trajectory of diet and disease in the US, it is in the best interests of the public and the US healthcare system that CNPP continue to operate independently from FNS to produce evidence-based recommendations for population health. As the Director of the Office of Management and Budget considers proposed agency reorganizations to meet the directive of Executive Order 13781, “Comprehensive Plan for Reorganizing the Executive Branch,” UCS is hopeful that the Director recognizes the magnitude of the potential risks associated with merging these agencies and rejects the proposed action.

EPA Administrator Scott Pruitt Accelerates Politicization of Agency’s Science Advisory Board

Earlier today, EPA Administrator Scott Pruitt strongly suggested that the agency will not consider any candidate for EPA’s science advisory committees who has received a grant from the agency. Such a gobsmackingly boneheaded move would further hamstring the ability of the EPA to accomplish its public health mission. The administrator is directly challenging the intent of Congress, which established the Science Advisory Board to provide independent scientific advice so that EPA can effectively protect our health and environment.

So why now? Administrator Pruitt’s schedule offers some clues. House Science Committee Chairman and serial scientist harasser Lamar Smith is a champion of the EPA Science Advisory Board Reform Act, flawed legislation that would increase industry control over the Science Advisory Board and, yes, prevent EPA grant recipients from serving. UCS’s Yogin Kothari summed up the legislation for the New Republic:

“They’re basically saying that people who are experts in environmental science, who have spent their careers working on this and may have received EPA grants to do their work, are inherently conflicted, whereas people who are working in the industry, who would be impacted by the board’s advice, are not conflicted,” Kothari said. “I mean, that’s bananas, right?”

Congress has for years failed to pass this legislation, which was vehemently and repeatedly opposed by UCS and many other mainstream science organizations. So in April 2017, a presumably frustrated Chairman Smith got together with Administrator Pruitt to talk about the bill. Pretty persuasion from the congressman seems to have worked.

EPA Administrator Scott Pruitt is taking more steps to politicize the EPA Science Advisory Board, defying Congress in the process. Photo: Wikimedia

Now keep in mind, Administrator Pruitt already has a parade of lobbyists and advisors providing him with perspectives from oil, gas, and chemical companies. He already thinks he has all the right friends, but he would be best served by hearing from independent experts, too.

The Science Advisory Board, for now, can be a check on political influence and can help the agency determine whether the special interests are telling it straight. I can see why a man of his outlook would want to neutralize it.

There are plenty of extremely well-qualified, universally respected candidates who can provide scientific advice to an administrator who really needs it. Getting science advice from the EPA Science Advisory Board is like getting basketball tips from 40 Steph Currys. Its members are the best in the business, volunteering their time in service of the public good.

Industry participation on the Science Advisory Board is not a problem. But candidates should be evaluated on their scientific expertise and ability to remain objective. So let’s recap: according to some, scientists who receive money from oil and chemical companies are perfectly qualified to provide the EPA with independent science advice, while those who receive federal grants are not. It’s a fundamental misrepresentation of how conflicts of interest work.

Soon, we’ll hear who the EPA will appoint after controversially dismissing qualified scientists earlier this year. Will the new appointees be Yes Men or responsible researchers? All signs point to the former.

If Administrator Pruitt continues to politicize the Science Advisory Board, he’ll be willfully setting himself up to fail at the job of protecting public health and the environment. It’s not something we should stand for.

How Science Can Help Us Better Prepare for Wildfires: Insights from a NASA Scientist

Satellite view of California's wildfires and smoke. NASA's Aqua satellite collected this natural-color image on October 09, 2017. Photo: NASA

In the midst of the catastrophic wildfires of Northern California that have claimed 41 lives and either destroyed or damaged more than 5,700 buildings, I wanted to know where the cutting edge of science on this issue is today. What made the California wildfires so strong and unusually destructive? Regardless of what started the fires, what conditions allowed the fires to spread so quickly? Did climate change have anything to do with it? What are scientists currently working on that can help communities better prepare for wildfires?

I had the good luck to work through my questions with NASA Earth scientist and fire expert, Dr. Douglas Morton. I reached Dr. Morton by phone while he was attending SilviLaser 2017 – a conference that brings together scientists who use a powerful laser technology called LIDAR (Light Detection and Ranging) to develop extremely high-resolution 3D maps of forests.

How has the amount of land burned by wildfires changed over time?

When it comes to wildfires, Dr. Morton let me know that NASA has been using satellites to measure changes in wildfire activity over time, globally and regionally. What they found is that globally, the amount of land burned by wildfires has declined over the past 18 years. This is because more and more land is being converted from natural landscapes, where fires naturally occur, to agriculture.

The Western U.S. is an exception to this – wildfires have increased in this region over time. In California alone, more than one million acres of land have been affected by wildfires this year, putting this year on track to be one of the most destructive on record in the state. More on what is causing those increases in wildfires later.

What made the California wildfires so strong?

When considering what fueled the California wildfires, Dr. Morton pointed to heat as a key factor, with these fires coming on the heels of an unprecedented heat wave in the region. In addition, high winds and drought conditions allowed the wildfires to strengthen and spread quickly. The hot, dry winds (known as Diablo winds) reached up to 79 mph.

Do we know whether climate change contributed to the California wildfires?

Attributing individual disasters to climate change is a complex and growing field. For an event like the wildfires of Northern California, Dr. Morton noted that what we can say at the moment is that the high temperatures in the area – one ingredient of wildfires – have been anomalous.

Beyond the Northern California wildfires, several studies show a link between the previously mentioned increase in wildfire activity in the Western United States and climate change, as a result of increasingly warm and dry conditions in the region. Looking forward, climate change is likely to increase the frequency of large, intense forest fires in the West.

What ability do scientists have to predict wildfires?

NASA scientists are using their understanding of the Earth system to figure out which places are likely to have wildfires as far as 3-6 months out from an event. Dr. Morton noted that some places are more predictable than others. For example, as a result of all of the hurricanes in the North Atlantic, Dr. Morton shared that there will likely be more fire activity in the Amazon next year.

How does that work? Hurricanes are a way that the Earth transfers heat away from the Equator. Warmer ocean waters are tugging a belt of rainfall known as the “Intertropical Convergence Zone” northward.  Warmer water fuels hurricanes, and these storms pull moisture with them.  As a result, there is less water available to the Amazon, and these drier conditions lead to more fire activity following years with more hurricanes and tropical storms in the Atlantic.

However, not all places are conducive to such seasonal forecasts. Dr. Morton described how seasonal wildfire forecasts are currently possible in places (like the Amazon) where fire activity is affected by temperatures at the top of the ocean (also known as sea surface temperatures).
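
To give a feel for the kind of statistical relationship such forecasts exploit, here is a toy sketch in Python. Every number in it is invented for illustration; operational forecasts rely on physical models and far richer data. The sketch only illustrates the basic idea of relating a sea surface temperature index to subsequent fire-season activity.

    # Toy sketch: relating a sea surface temperature (SST) anomaly index to
    # fire-season burned area. All values are invented for illustration.
    import numpy as np

    # Hypothetical history: SST anomaly (deg C) observed months before the
    # fire season, and burned area (thousand hectares) in that season.
    sst_anomaly = np.array([-0.4, -0.1, 0.2, 0.5, 0.8, 1.1])
    burned_area = np.array([120.0, 150.0, 210.0, 260.0, 340.0, 400.0])

    # Fit a simple linear relationship: burned_area ~ slope * sst + intercept.
    slope, intercept = np.polyfit(sst_anomaly, burned_area, 1)

    # "Forecast" the coming season from this year's observed anomaly.
    this_year_anomaly = 0.9
    forecast = slope * this_year_anomaly + intercept
    print(f"Forecast burned area: {forecast:.0f} thousand hectares")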

Dr. Morton explained that Northern California’s wildfire activity is not linked to such sea surface temperatures. Instead, it is a place that responds to smaller-scale and shorter-term factors. For example, a dry pocket of air might be enough to tip Northern California into a high wildfire risk situation.

What are the new frontiers for wildfire research?

When wrapping up our conversation, Dr. Morton pointed out that scientists like himself are trying to improve wildfire forecasts, and get forecasts down to shorter timescales that decision makers can use – that is where the cutting edge is right now. He noted that there is a conference at Columbia University coming down the pike where scientists will gather to share knowledge and exchange ideas on this very topic.

We may often think of NASA as the part of the government that sends rockets into space. However, NASA’s scientists and Earth observations are vital to helping make Americans safer here on Earth. Scientists like Doug Morton are pushing the envelope so that decision-makers at the federal, state, and community-level can ultimately have more accurate wildfire forecasts, more time to prepare, and a better chance of protecting lives, property, and ecosystems.


Climate Change Goes to Court

Alexis de Tocqueville wrote in his opus Democracy in America, “Scarcely any political question arises in the United States that is not resolved, sooner or later, into a judicial question.” De Tocqueville made the observation in 1835, but it remains equally true in modern times, as federal and state courts have taken the lead in resolving many vexing political problems, such as dismantling segregation-era laws and ensuring a rough equality of funding for public schools. Courts are particularly prone to step in when the legislative and executive branches fail to act and that void creates a crisis.

In the United States today, the federal government has abdicated leadership on the central challenge of our time—global climate change. Congress has failed to enact a national climate change policy, and seems more divided on the issue than ever. The Trump administration refuses to even acknowledge the overwhelming scientific consensus that the planet is warming due to fossil fuel combustion. It has pledged to withdraw the United States from the Paris climate agreement, is now actively promoting the expansion of domestic fossil fuel production, and is working to roll back the modest steps the Obama Administration took to address the problem. Meanwhile, the costs of inaction mount and are increasingly obvious—witness Hurricanes Harvey and Irma.

Given this dereliction of duty, will courts now step in to fill this void?  Five recently filed suits—and some new work by climate scientists—suggest the answer could be yes.

Seeking redress for climate damages

The cities of San Francisco and Oakland recently filed suits against five major fossil fuel producers: ExxonMobil, Chevron, BP, Shell, and ConocoPhillips. These two suits add to suits filed in July by two California counties—San Mateo and Marin—and the City of Imperial Beach, which targeted 37 oil, gas, and coal companies. These public plaintiffs, relying on cutting-edge work by the Union of Concerned Scientists, Inside Climate News, and the Columbia School of Journalism/Los Angeles Times, tell a compelling story: fossil fuel companies have known for roughly 50 years that their products are endangering the climate, yet many engaged in a concerted effort to conceal the dangers, sow doubt about the validity of emerging climate science, and fight sensible policies to address the problem, all to increase the sales of their products. These plaintiffs also ruefully note that fossil fuel combustion has doubled since 1988, when James Hansen testified to Congress about the danger of global warming and the UN established the Intergovernmental Panel on Climate Change (IPCC), making it all but impossible to now stave off serious climate change impacts.

The plaintiffs in these cases allege that climate change impacts, such as sea level rise, will harm infrastructure that these public authorities own and operate, as well as public properties such as beaches and parks, and will threaten to displace their residents and damage their properties. Plaintiffs seek damages for the costs they have already incurred, and those they will incur in the future, to address these threats. Several of the suits also seek disgorgement of the profits from fossil fuel production, and punitive damages to punish the alleged wrongdoing.

“Tort” lawsuits like these have been tried before without success. In one case, a group of state attorneys general sued power plants to enjoin their emissions of carbon dioxide, and in another a group of native communities in Alaska sued fossil fuel companies for damages arising from sea level rise. Two federal courts (the US Supreme Court and the Ninth Circuit Court of Appeals) dismissed these suits on the grounds that Congress placed authority for addressing climate change with the EPA when it enacted the Clean Air Act, leaving no room for the federal courts to act on the issue.

There are two key differences between the new suits and those prior cases. First, the plaintiffs in these cases are suing in state court, alleging that the fossil fuel companies have violated state law. Thus, the question of whether federal authority to regulate climate change rests with the EPA or the judiciary is not strictly relevant.

Second, the federal courts dismissed these two prior cases in 2011 and 2012, at a time when the president and federal agencies were exercising their legal authority to address climate change.  Now, the president and the EPA are abdicating their authority and rolling back the policies that were put in place, in order to promote fossil fuels. Thus, it seems unlikely that the California courts would rely upon these two federal decisions to dismiss the cases.

However, the fossil fuel companies will surely raise this and many other defenses. They will also claim that suits of this kind are beyond the competence of the court to manage, that the plaintiffs are singling them out for a problem caused by many other entities, that the extraction and sale of fossil fuels was expressly allowed under the laws and brought prosperity to millions, and that they possessed no unique knowledge about climate change and were therefore under no obligation to disclose the potential harms of burning fossil fuels.

A decision in these cases is many years away, and it is far too early to predict how they will unfold. But it is not too early to note the extent to which these cases—and the ongoing investigations into ExxonMobil by the New York and Massachusetts attorneys general—are already sparking an important and timely debate about who is responsible for climate change.

Until now, the focus of responsibility has been on nation states: the Paris climate agreement, for example, is based on nations committing to cut emissions and/or compensate countries that are particularly vulnerable to its impacts. This focus on nation states has elided a perhaps more fundamental question—what is the responsibility of the companies that placed fossil fuels into commerce and profited from them? One senses, even in the early stages of this litigation, that this question will eventually be answered in the courts.

A new role for scientific evidence

Lawyers and judges are not the only actors bringing climate change into the courtroom. Scientists are also playing a key behind-the-scenes role.

One of the most difficult questions a court will face, if climate litigation unfolds, is how to apportion liability for a harm that has millions of sources. The rapidly emerging field of climate attribution science helps answer that question.

In 2014, scientist Rick Heede published a paper that painstakingly traced the volume of oil, gas, and coal extracted and placed into commerce between 1854 and 2010 by 83 major investor- and state-owned producers of oil, natural gas, and coal, along with 7 cement manufacturers. This research concluded that nearly two-thirds of all industrial carbon dioxide (CO2) and methane (CH4) emissions can be traced to the products of just 90 companies. This relatively small number augurs well for the ability of a court to address the question of responsibility.

While this work is a starting point for apportioning liability, it needs further development to be useful in court. A key fact to account for is that, unlike pollutants such as mercury or particulates, greenhouse gases do not cause harm directly. Rather, the emissions cause global warming, which then causes specific harms, such as heat waves and sea level rise. Thus, an additional step is needed to connect the global warming emissions associated with these companies’ products to the harms suffered by actual plaintiffs.

A newly published paper by UCS scientists Brenda Ekwurzel and Peter Frumhoff, among others, attempts to close this gap. The authors created a model to translate the emissions traceable to these 90 companies into two particular impacts: global increases in temperature, and sea level rise.

The paper appropriately recognizes that, while fossil fuel combustion is the primary cause of these impacts, it is not the only cause; hence the model excludes sources of global warming other than fossil fuel combustion. Next, the model isolates the contribution of these 90 companies, by differentiating between the temperature increase and sea level rise that occurred with the emissions of the 90 companies’ products, and what would have occurred without them.

The authors find that approximately 50 percent of the increase in global temperature, and approximately a third of sea level rise, can be traced to these 90 companies. Using the same methodology, the authors find that the products of the top 20 investor-owned companies caused approximately 25 percent of the temperature increase and 13-16 percent of sea level rise.
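
To make the counterfactual logic of the model concrete, here is a minimal sketch using placeholder numbers, not actual outputs of the Ekwurzel et al. model: the attributable share of an impact is the difference between the impact simulated with and without the companies' product emissions, divided by the total.

    # Minimal sketch of counterfactual attribution. The inputs below are
    # placeholders, not results from the published model.

    def attributable_share(with_emissions: float, without_emissions: float) -> float:
        # Share of an impact traceable to the removed emissions.
        return (with_emissions - without_emissions) / with_emissions

    # Hypothetical model outputs: an impact in the full simulation vs. a run
    # that removes the 90 companies' product emissions.
    temperature_share = attributable_share(1.0, 0.5)  # deg C inputs; ~50%
    sea_level_share = attributable_share(9.0, 6.0)    # cm inputs; ~33%

    print(f"Temperature increase attributable: {temperature_share:.0%}")
    print(f"Sea level rise attributable: {sea_level_share:.0%}")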

This model and methodology have a clear potential application to lawsuits, such as the ones filed by the California counties. For example, the County of San Mateo alleges that its waterfront property and infrastructure will be damaged by sea level rise. The County claims that it has already spent millions of dollars to prepare for this sea level rise, expects to incur over $900 million to maintain and repair infrastructure on its ocean coastline, and faces the prospect of serious or permanent inundation of property valued at $23 billion.

Assuming, for the sake of argument, that $24 billion worth of damages can be proven, a court could use this model to find that the top twenty fossil fuel companies are responsible for 13-16 percent of this damage, or roughly $3.1-3.8 billion. Such a model could be adapted for many other applications, including attributing a portion of responsibility for major weather disasters, such as Tropical Storm Harvey’s severe flooding of Texas.
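
The apportionment arithmetic is simple enough to sketch directly. A minimal example, assuming, purely for illustration, that a court accepted the $24 billion damages figure and the 13-16 percent attribution range quoted above:

    # Apportioning assumed damages by an attributed responsibility range.
    # Both inputs are assumptions from the example above, not findings.
    total_damages_usd = 24e9  # assumed provable damages

    # Sea level rise share attributed to the top 20 investor-owned
    # companies' products (range discussed above).
    share_low, share_high = 0.13, 0.16

    low = total_damages_usd * share_low
    high = total_damages_usd * share_high
    print(f"Apportioned damages: ${low / 1e9:.1f} billion to ${high / 1e9:.1f} billion")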

Lawsuits are successful when the litigants remove the underbrush to point courts to a clear and manageable way to answer the questions a case poses.  The work of these climate attribution scientists goes a long way towards answering the question of how to apportion liability, in the event it is determined that the fossil fuel companies have a legal responsibility for climate change.

A lawsuit on behalf of the next generation

In a separate federal case, a group of young Americans has sued the US government, arguing that its actions promoting fossil fuels violate their constitutional rights. These plaintiffs also allege that the government has violated the so-called “public trust doctrine”—an ancient principle originating in Roman law that a sovereign government must hold common natural resources in trust and must preserve them for future generations. Plaintiffs seek a court order requiring the government to implement an enforceable national plan to phase out fossil fuel emissions and stabilize the climate system.

Not surprisingly, the US government has vigorously fought back—and so, for a short while, did the fossil fuel industry through its trade associations, which joined the lawsuit as defendants but later exited when the case opened them up to the prospect of handing over internal documents. The defendants claimed that the suit should be dismissed because it raises a “political question” only the executive and legislative branches can resolve; that plaintiffs lack legal standing because climate change does not affect them in any way different from the general public; and that the government has no legally enforceable duty under either the constitution or the public trust doctrine to ensure a stable climate.

Many legal observers, including me, thought this case was a long shot. But, in 2016, a federal district court rejected all of these defenses and ordered a trial to begin in February 2018. In a breathtaking opinion, the court ruled that plaintiffs had alleged specific harm that gave them legal standing to bring the case and that there may well be a constitutional right to a stable climate system. Furthermore, the court ruled that the public trust doctrine does potentially require the government to at least act as a responsible steward of the oceans, if not the Earth’s atmosphere, and that the judicial branch does not have to sit on the sidelines in deference to the legislative and executive branches.

If one were looking for a stunning modern example of de Tocqueville’s observation of how political disputes eventually enter the courtroom, one needn’t look beyond this opinion. The suit exemplifies one of the hallmarks of our legal system–its ability to evolve in incremental ways to meet the perceived necessities of the time. While neither the US constitution nor the public trust doctrine was developed to deal with climate change, here the court found that both could be adapted to address the problem.

Ironically, the plaintiffs’ chances of prevailing are probably improved by the 2016 election. Had the defendants still been the Obama administration, or a successor Clinton administration, they would have argued that the government had acknowledged the harm of climate change and was doing what it could to address it within the confines of its legal authority. This defense would put the court in the unenviable position of second-guessing the technical and policy expertise of agencies like the EPA and the DOE, a role that courts typically try to avoid. The Trump administration, however, cannot make this argument, given its decision to withdraw from the Paris climate agreement, erase (literally) the Obama climate action plan, and begin the rollback of key regulations such as the Clean Power Plan and fuel economy standards for cars.

Nevertheless, there is a long way to go on this lawsuit. The Trump administration has attempted to short-circuit the trial scheduled for February by filing a premature appeal, and that request is pending. Even if that gambit fails, at trial the plaintiffs will be required to prove allegations that the court assumed to be true at this early stage in the litigation. And if the court does find for the plaintiffs, it will have to grapple with the difficult issue of a remedy.

The plaintiffs seek a court order requiring the US government to ensure a stable climate system, but climate change, of course, is a global problem caused by many foreign states and companies, and the court has no authority over any of them. Thus, the court would have to assign some proportional percentage of responsibility to the United States government. And, even if such a standard were established, the court would then need to be prepared to oversee the process by which the United States government met it.

The future of climate litigation

It is too early to gauge the impact of the climate lawsuits now underway. But these cases are already putting the fossil fuel industry and the Trump administration on the defensive, raising an overdue debate about the legal responsibility of the fossil fuel industry, and likely heightening investor unease with fossil fuel investments.

It is well to remember that “impact” litigation is often successful even if there is never a final court order or jury verdict, and that it often takes years or even decades for success to emerge. If these cases survive early motions to dismiss, and if similar cases are filed in other jurisdictions, these actions could conceivably create leverage for an overall settlement in which fossil fuel companies cease all climate science denial, make explicit and enforceable commitments to phase out fossil fuels over time and/or equip power plants with carbon capture technology, and actively support a carbon tax, the proceeds of which could be used in part to fund preparedness and resilience for the most vulnerable communities.

So, at this moment, when the federal government has turned its back and abdicated its responsibility, we should take some encouragement from the fact that climate change lawyers and scientists are going to court. As de Tocqueville pointed out long ago, that is the American way.

US Withdrawal from UNESCO Will Undermine Collaboration on Science and Culture

The Trump Administration’s war on science has intensified with the announcement that the US is withdrawing from UNESCO, the international organization that works to promote peace and security through international cooperation on education, science, and cultural programs.

Founded in 1945, when nations were seeking ways to rebuild educational systems and cultural connections in the immediate aftermath of World War II, UNESCO today is a leading multi-lateral organization working on a range of issues crucial for achieving peace, equity and sustainability world-wide.

“Every day, countless Americans and American communities pour their time and their hearts into UNESCO-led international collaborations on science, on education and on culture,” says Andrew Potts, who practices cultural heritage law at Nixon Peabody LLP. They work on preventing violent extremism via youth education, on literacy and educating women and girls, on science for development, and on free speech and journalist safety. And of course, they fight for cultural diversity and heritage through UNESCO projects like the World Heritage program, biosphere reserves, and the Creative Cities initiative.

UNESCO recognition benefits US communities

Mission San Antonio de Valero “The Alamo”, in San Antonio, Texas. Photo: NPS

UNESCO recognition and connections can bring economic benefits to US communities. For example, according to a State Department news bulletin from August 2017, Tucson, Arizona, which was listed as a UNESCO Creative City of Gastronomy in 2015, has experienced an increase in tourism and restaurant revenues as a direct result, as well as millions of dollars of earned media coverage.

The US withdrawal announcement on October 12th came smack in the middle of Iowa City’s eight-day UNESCO City of Literature Book Festival. It also came right on the heels of San Antonio, Texas’ second World Heritage Festival, a new annual event that already attracts thousands of visitors to celebrate and learn about the San Antonio missions – including the Alamo – that were added to UNESCO’s World Heritage list in 2015.

Although US World Heritage sites won’t lose their status when the US leaves UNESCO, there will likely be little or no federal support for collaboration and engagement with the international agency or its staff.

Relationship status: It’s complicated

The US has a complicated history with UNESCO. It helped to found the organization and has always been actively engaged, but it has also withdrawn once before.

At the height of the Cold War in 1984, Ronald Reagan pulled the US out. At that time, a report on the implications for US science, published by the National Research Council, identified disruptions to international scientific collaborations, reduced confidence in US scientific leadership, and forfeiture of the right to participate in the governance of UNESCO-led scientific initiatives.

The US ultimately continued to provide an equivalent level of international financial support for science, culture and education, but the impacts of withdrawal were significant in the scientific community.

George W. Bush took the US back into UNESCO nearly 20 years later in 2002, and then in 2011, the Obama administration drastically cut back on financial support to UNESCO in response to Palestine being granted full membership.

The process for withdrawal takes some time, and the US will not formally cease to be a member of UNESCO until December 31st, 2018. The State Department has said the US remains committed to UNESCO’s important work and will seek observer status.

Secretary Tillerson could put action behind that talk by committing to put the equivalent of the US’s former UNESCO dues payments into other international collaborations in science, education and culture.

“Wars begin in the minds of men”

The opening words of UNESCO’s constitution, carved in 10 languages in Tolerance Square, Paris. Photo: UNESCO.

Meanwhile, it is the words of the American poet Archibald MacLeish that are enshrined in UNESCO’s constitution and etched in 10 languages on the Tolerance Square wall at the organization’s headquarters in Paris: “Since wars begin in the minds of men, it is in the minds of men that the defences of peace must be constructed.”

According to outgoing UNESCO Director-General Irina Bokova, “[that] vision has never been more relevant” than it is today. In a moving and very personal statement in response to the news of the US withdrawal, Bokova said,

At the time when the fight against violent extremism calls for renewed investment in education, in dialogue among cultures to prevent hatred, it is deeply regrettable that the United States should withdraw from the United Nations agency leading these issues.

Under Bokova’s leadership, with major involvement from the US, UNESCO has been at the forefront of efforts to protect heritage sites and museum collections in Iraq and Syria as ISIS forces have tried to destroy monuments and stamp out culture.

She has also spearheaded implementation of the United Nations Plan of Action on the Safety of Journalists and the Issue of Impunity, in a world where journalists’ freedom to work and their safety are constantly under threat.

Through its Science for Sustainable Development program, in which many US universities participate, UNESCO has launched important initiatives to increase the number of women in science, to ensure science is at the heart of policy-making for sustainable development, to fully value Traditional Ecological Knowledge, and to champion open access to scientific information.

Protecting World Heritage

UCS led the team that produced UNESCO’s 2016 report on climate change and World Heritage. Photo: UNESCO.

It is probably for its work on World Heritage that UNESCO is best known to most Americans. The World Heritage Convention was set up to help protect, for future generations, natural and cultural heritage deemed to be of universal value to humankind.

There are 23 World Heritage sites in the US, among them the Statue of Liberty, Independence Hall in Philadelphia, and Yellowstone, Yosemite, and Mesa Verde national parks. Many of America’s World Heritage sites and other cultural sites are at risk from climate change impacts, including worsening wildfires, more intense storms, sea level rise, and coastal flooding.

The National Park Service, which is a global leader in researching and responding to the effects of climate change on protected areas, has historically been a major player in the World Heritage Convention under UNESCO’s leadership.

Indeed, just as the US is planning to withdraw from UNESCO, the international World Heritage Committee is preparing a major effort to step up its engagement with the implementation of the Paris Agreement (a global commitment to reduce global warming emissions) and with the IPCC (Intergovernmental Panel on Climate Change), and to update its own climate change policy for the first time in a decade.

UCS will be fully engaged in that process, building on the policy recommendations in our report on climate and world heritage, published with UNESCO and UNEP in 2016.

The Trump administration, however, is relegating federal scientists, experts, and agencies to bystander status with one more pointlessly anti-science jab at the international community. In response, Potts says,

Now more than ever, as with the Paris Agreement, it will be incumbent on US cities, universities and NGOs to pick up the reins of global education, science and cultural collaboration; to continue to make American contributions to all these critical endeavors and to make sure American communities benefit from their progress.
