UCS Blog - The Equation (text only)

What is the Responsible Science Policy Coalition? Here Are Some Clues

An interesting new group has popped up called the Responsible Science Policy Coalition (RSPC) that seems to have a significant interest in chemical safety policy. Are they legitimate? As Congress prepares for another hearing into the dangers of per- and polyfluoroalkyl substances, known as PFAS, it’s worth digging into who these folks might be.

PFAS have been in the news quite a bit recently because the White House was caught censoring a report from the Agency for Toxic Substances and Disease Registry (ATSDR) on the health effects of PFAS exposure. These chemicals are widely used in products ranging from nonstick cookware to water-repellant fabrics, and they are especially prevalent in the water at military bases due to their use in firefighting foam. Bipartisan outrage about the censorship was swift and sustained, and the report was released in June. A new UCS analysis shows that many military bases have potentially unsafe levels of PFAS in drinking water.

PFAS are used in firefighting foam. Some military bases have unsafe levels of PFAS in drinking water. Photo: DVIDSHUB/Flickr

In July, the Responsible Science Policy Coalition surfaced at a meeting of the Council of Western Attorneys General, where they declared themselves “eager to help your state with your issues.” In their presentation to the attorneys general, the RSPC argued that there are “lots of problems with existing PFAS studies” and that these studies “don’t show the strength of association needed to support causation.”

The RSPC also submitted a comment on the ATSDR draft toxicology assessment that extensively detailed why, in their view, ATSDR’s scientific approach was sub-par.

Who is supporting the RSPC?

Where, then, did the Responsible Science Policy Coalition come from, and why do they care so much about PFAS? Here’s what we know. According to the PowerPoint presentation, RSPC is a new coalition made up of 3M, Johnson Controls, and unnamed other companies.

The inclusion of 3M is particularly notable because the company spent decades hiding the science about the dangers of PFAS. 3M used these chemicals in many widely used products, including Scotchgard and firefighting foam. According to the Intercept:

A lawsuit filed by Minnesota against 3M, the company that first developed and sold PFOS and PFOA, the two best-known PFAS compounds, has revealed that the company knew that these chemicals were accumulating in people’s blood for more than 40 years. …The company even had evidence back then of the compounds’ effects on the immune system…

The suit, which the Minnesota attorney general filed in 2010, charges that 3M polluted groundwater with PFAS compounds and “knew or should have known” that these chemicals harm human health and the environment, and “result in injury, destruction, and loss of natural resources of the State.” The complaint argues that 3M “acted with a deliberate disregard for the high risk of injury to the citizens and wildlife of Minnesota.” 3M settled the suit for $850 million in February, and the Minnesota Attorney General’s Office released a large set of documents…detailing what 3M knew about the chemicals’ harms.

The government can protect families from excessive exposure to PFAS, but only with access to independent scientific information. Photo: nicdalic/Flickr

And you thought all they made was Post-its and tape!

RSPC seems to be led by Jonathan Gledhill and James Votaw. Gledhill is the president of the Policy Navigation Group, whose list of past and present clients is dominated by industry groups. Votaw is a lawyer for Keller and Heckman; the address given for RSPC is the address of Keller and Heckman’s DC offices. Votaw has signed the three comments from RSPC on the ATSDR study (the first two were extension requests). Votaw’s practice concentrates on environmental and health and safety regulation, including chemicals and pesticides.

Keller and Heckman’s chemicals practice is more circumspect, but their pesticides practice is described in part as “[helping] clients defend existing markets worldwide against governmental pressure and environmentalist activism.” They can help companies “defend against an EPA enforcement action” and “secure successful tolerance reassessments.”

How many groups like this are there?

A name like the Responsible Science Policy Coalition makes insinuations of course—that most people are pulling numbers out of thin air and pursuing haphazard or irresponsible science policy, and we really need some adults in the room. The same words are reused again and again in the names of these types of organizations, and “responsible” is no different. There’s the Citizens Alliance for Responsible Energy, the Coalition for Responsible Healthcare Reform, the Coalition for Responsible Regulation, and more.

Less charitably, groups like RSPC are known as front groups. Disguised by innocuous-sounding names and with a veneer of independence, they principally exist to create doubt and confusion about the state of the science to avoid regulation of the products their members create. Plenty of industries have them. The American Council on Science and Health has long conducted purportedly independent science that was in fact funded by corporate interests. The Groundwater Protection Council fights federal regulation of fracking. The Western States Petroleum Association, the top lobbyist for the oil industry in the western United States, was found in 2014 to be running at least sixteen different front groups in order to undermine forward-looking policies like California’s proposal to place transportation fuels under the state’s carbon cap.

Could the Responsible Science Policy Coalition meet its stated goal to “accelerate research and promote best practices and best available science in policy decisions?” Perhaps. But those looking to RSPC for advice should be wary of the fact that so far, it seems to exist to encourage more relaxed regulation of PFAS chemicals – a decision that is worth a lot of money to the organization’s key members.

When Will Autonomous Vehicles be Safe Enough? An interview with Professor Missy Cummings

Photo: Jaguar MENA

Autonomous vehicle (AV) supporters often tout safety as one of the most significant benefits of an AV-dominated transportation future. As explained in our policy brief Maximizing the Benefits of Self-Driving Vehicles:

While self-driving vehicles have the potential to reduce vehicle-related fatalities, this is not a guaranteed outcome. Vehicle computer systems must be made secure from hacking, and rigorous testing and regulatory oversight of vehicle programming are essential to ensure that self-driving vehicles protect both their occupants and those outside the vehicle.

Professor Mary “Missy” Cummings, former fighter pilot and current director of the Humans and Autonomy Lab at Duke University, is an expert on automated systems. Dr. Cummings has researched and written extensively on the interactions between humans and unmanned vehicles, regulation of AVs, and potential risks of driverless cars. I had the opportunity to speak with Dr. Cummings and ask her a few questions about current technological limitations to AV safety and how to use regulation to ensure safety for all Americans, whether they are driving, walking, or biking.

Below are some key points from the interview, as well as links to some of Dr. Cummings’ work on the topics mentioned.

Jeremy Martin (JM): Safety is one of the biggest arguments we hear for moving forward with autonomous vehicle development. The U.S. National Highway Traffic Safety Administration has tied 94% of crashes to a human choice or error, so safety seems like a good motivating factor. In reality, how far off are we really from having autonomous systems that are safer and better than human drivers? And are there specific software limitations that we need to improve before we remove humans from behind the wheel?

Dr. Mary “Missy” Cummings (MC): I think one of the fallacies in thinking about driverless cars is that, even with all of the decisions that have to be made by humans in designing software code, somehow they are going to be free from human error just because there’s not a human driving. Yes, we would all like to get the human driver out from behind the wheel, but that doesn’t completely remove humans from the equation. I have an eleven-year-old, I would like to see driverless cars in place in five years so she’s not driving. But, as an educator and as a person who works inside these systems, we’re just not there.

We are still very error prone in the development of the software. So, what I’d like to see in terms of safety is for us to develop a series of tests and certifications that make us comfortable that the cars are at least going to be somewhat safer than human drivers. If we could get a reliable 10% improvement over humans, I would be good with that. I think the real issue right now, given the nature of autonomous systems, is that we really do not know how to define safety for these vehicles yet.

JM: So you’re not optimistic about meeting your five-year target?

MC: No, but it’s not a discrete yes or no answer. The reality is that we’re going to see more and more improvement. For example, automatic emergency braking (AEB) is great, but it’s still actually a very new technology and there are still lots of issues that need to be addressed with it. AEB will get better over time. Lane detection and the car’s ability to see what’s happening and avoid accidents, as well as features like Toyota’s Guardian mode, will all get better over time.

When do I think that you will be able to use your cell phone to call a car, have it pick you up, jump in the backseat and have it take you to Vegas? We’re still a good 15-20 years from that.

JM: You mentioned that if AVs performed 10% better than human drivers, that’s a good place to start. Is that setting the bar too low? How do we set that threshold and then how do we raise the bar over time?

MC: I think we need to define that as a group of stakeholders and I actually don’t think we need a static set of standards like we’re used to.

With autonomous vehicles, it’s all software and not hardware, but we don’t certify drivers’ brains cell by cell; what we do is certify you by how you perform in an agreed-upon set of tests. We need to take that metaphor and apply it to driverless cars. We need to figure out how to do outcome-based testing that is flexible enough to adapt to new coding approaches.

So, a vision test, for example, in the early days of driverless cars should be a lot more stringent, because we have seen some deaths and we know that the sensors like lidar and radar have serious limitations. But, as those get addressed, I would be open to having less stringent testing. It’s almost like graduated licensing. I think teenagers should have to go through a lot more testing than me at 50. Over time, you gain trust in a system because you see how it operates. Another issue is that now cars can do over-the-air software updates. So, do cars need to be tested when a new model comes out or when they have a new software upgrade that comes out? I don’t claim to have all the answers, and I’ll tell you that nobody does right now.

JM: One safety concern that emerges in discussions around AVs is cybersecurity. What are the cybersecurity threats we should be worried about?

MC: There are two threats to cybersecurity that I’m concerned about: one is active hacking, and that would be how somebody hacks into your system and takes it over or degrades it in some way. The other concern is that in the last year, there’s been a lot of research that’s shown how the convolutional neural nets that power the vision systems for these cars can be passively hacked. By that I mean, you don’t mess with the car’s system itself, you mess with the environment. You can read more about this but, for example, you can modify a stop sign in a very small way and it can trick an algorithm to see a 45 mile per hour speed limit sign instead of a stop sign. That is a whole new threat to cybersecurity that is emerging in research settings and that, to my knowledge, no one is addressing in the companies. This is why, even though I’m not usually a huge fan of regulations, in this particular case I do think we need stronger regulatory action to make sure that we, both as a society and as an industry, are addressing what we know are going to be problems.

JM: We hear a lot about level 3 and 4 automation, where a human backup driver needs to be alert and ready to take over for the car in certain situations, and after that fatal accident in Arizona we know what the consequences can be if a backup driver gets bored or distracted. What kinds of solutions are there for keeping drivers off their phones in AVs? Or are we just going to be in a lot of trouble until we get to level 5 automation and we no longer need backup drivers?

MC: I wrote a paper on boredom and autonomous systems, and I’ve come to the conclusion that it’s pretty hopeless. I say that because humans are just wired for activity in the brain. So, if we’re bored or we don’t perceive that there’s enough going on in our world, we will make ourselves busy. That’s why cellphones are so bad in cars, because they provide the stimulation that your brain desires. But even if I were to take the phones away from people, what you’ll see is that humans are terrible at vigilance. It’s almost painful for us to sit and wait for something bad to happen in the absence of any other stimuli. Almost every driver has had a case where they’ve been so wrapped up in their thoughts that they’ve missed an exit, for example. Perception is really linked to what you’re doing inside your head, so just because your eyes are on the road doesn’t mean you’re going to see everything that’s in front of you.

JM: What’s our best solution moving forward when it comes to safety regulations for autonomous vehicles? Is it just a matter of updating the standards that we currently have for human-driven vehicles or do we need a whole new regulatory framework?

MC: What we need is an entirely new regulatory framework where an agency like NHTSA would oversee the proceedings. They would bring together stakeholders like all the manufacturers of the cars, the tier one suppliers, people who are doing the coding, as well as a smattering of academics who are in touch with the latest and greatest in related technologies such as machine learning and computer vision. But we don’t just need something new for driverless cars, we also need it for drones, and even medical technology. I wrote a paper about moving forward in society with autonomous systems that have on-board reasoning. How are we going to think about certifying them in general?

The real issue here, not just with driverless cars, is that we have an administration that doesn’t like regulation, so we’re forced to work within the framework that we’ve got. Right now, NHTSA does have the authority to mandate testing and other interventions, but they’re not doing it. They don’t have any people on the staff that would understand how to set this up. There’s just a real lack of qualified artificial intelligence professionals working in and around the government. This is actually why I’m a big fan of public-private partnerships to bring these organizations together – let NHTSA kind of quarterback the situation but let the companies get in there with other experts and start solving some of these problems themselves.

 

Dr. Mary “Missy” Cummings  is a professor in the Department of Mechanical Engineering and Materials Science at Duke University, and is the director of the Humans and Autonomy Laboratory and Duke Robotics. Her research interests include human-unmanned vehicle interaction, human-autonomous system collaboration, human-systems engineering, public policy implications of unmanned vehicles, and the ethical and social impact of technology.

Professor Cummings received her B.S. in Mathematics from the US Naval Academy in 1988, her M.S. in Space Systems Engineering from the Naval Postgraduate School in 1994, and her Ph.D. in Systems Engineering from the University of Virginia in 2004. Professor Cummings served as a naval officer and military pilot from 1988 to 1999, and she was one of the Navy’s first female fighter pilots.


PFAS Contamination at Military Sites Reveals a Need for Urgent Science-based Protections

A new UCS factsheet released today looks at PFAS contamination at military bases, revealing that many of the sites have levels of these chemicals in their drinking or groundwater at potentially unsafe levels. PFAS, or per- and polyfluoroalkyl substances, have been used in everything from Teflon pans, to nonstick food packaging, to water-repellent raingear for decades. Only recently has it been revealed to the general public that these compounds are seeping into our waterways and causing health issues in people who are exposed to them at elevated levels over time.

The contamination of drinking water and groundwater at military bases continues to be a problem because the firefighting foam used in training exercises and in operations contains PFAS. Living with this additional risk is an unacceptable burden that the men and women serving at these bases, and their families, should not have to bear. This is not just a story about a class of chemicals that is largely unregulated; it is a story about the people who are dealing with the ramifications of its widespread contamination every single day.

What we found

A draft toxicology report from ATSDR, released after emails obtained by UCS revealed that the White House had been suppressing the study, suggested that safe exposure levels for PFAS are 7 to 10 times lower than the EPA’s current standards.

The report’s findings, suggesting that PFAS are potentially more hazardous than previously known, are particularly concerning because of these compounds’ persistence in the environment and widespread prevalence.

UCS mapped PFAS contamination of groundwater and drinking water at 131 active and formerly active US military sites across 37 states. We translated the ATSDR’s risk levels for PFOA and PFOS into comparable drinking water standards in parts per trillion using EPA’s own methods and found that all but one of these sites exceeded the more conservative of those levels.

  • At 87 of the sites—roughly two-thirds—PFAS concentrations were at least 100 times higher than the ATSDR risk level.
  • At 118 of the sites—more than 90 percent—PFAS concentrations were at least 10 times higher than the ATSDR risk level.
  • Over half of the 32 sites with direct drinking water contamination had PFAS concentrations that were at least 10 times higher than the ATSDR risk level.
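
For readers who want to see how that kind of translation and comparison might work, below is a minimal sketch in Python using EPA's lifetime health advisory formula to convert an oral risk level into a drinking water concentration and then flag sites against multiples of that level. The oral toxicity value, exposure parameters, and site readings here are illustrative placeholders, not the exact figures used in the UCS analysis.

```python
# Illustrative sketch only: the oral toxicity value, exposure parameters, and
# site readings below are placeholders, not the numbers used by UCS.

# EPA's lifetime health advisory formula:
#   HA (mg/L) = oral dose (mg/kg/day) / drinking water intake (L/kg/day) * RSC
DRINKING_WATER_INTAKE = 0.054  # L/kg/day, 90th-percentile intake for lactating women
RSC = 0.2                      # relative source contribution

def oral_dose_to_ppt(dose_mg_per_kg_day):
    """Translate an oral risk level into a drinking water level in parts per trillion (ng/L)."""
    mg_per_liter = dose_mg_per_kg_day / DRINKING_WATER_INTAKE * RSC
    return mg_per_liter * 1e6  # mg/L -> ng/L

# Hypothetical ATSDR-style minimal risk level, chosen only for illustration
risk_level_ppt = oral_dose_to_ppt(3e-6)

# Hypothetical site measurements, in parts per trillion
sites = {"Site A": 120_000, "Site B": 950, "Site C": 8}

for name, concentration in sites.items():
    ratio = concentration / risk_level_ppt
    if ratio >= 100:
        flag = "at least 100x the risk level"
    elif ratio >= 10:
        flag = "at least 10x the risk level"
    elif ratio > 1:
        flag = "above the risk level"
    else:
        flag = "at or below the risk level"
    print(f"{name}: {concentration:,} ppt ({ratio:.0f}x) -> {flag}")
```
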
Urgent action needed

In the ATSDR’s scientific review of 14 PFAS, the association between exposure and negative health effects is clear. While there is certainly a need for more research into some of these associations, and far more data to fill the gaps on the thousands of PFAS compounds that have not yet been studied, there is a compelling case for EPA to act urgently on this class of chemicals. There is no shortage of ways to do so, both by enacting enforceable standards and by supporting the states that have taken the lead on this issue.

Responding to high rates of contamination, communities have been on the front lines of getting action in their states. More and more states are setting standards for PFAS in drinking water and groundwater more stringent than the EPA’s health advisory, and places like Washington have banned the use of PFAS in firefighting foam and food packaging. And communities are organized and poised to get the changes they want.

Congress has also been hearing from constituents and taking action. There has been an encouraging flurry of bills introduced in both the House and Senate over the past year. An amendment to the 2018 National Defense Authorization Act secured by New Hampshire’s Sen. Jeanne Shaheen has enabled $20 million in funding for ATSDR to conduct a nationwide health study on PFAS. Other measures are still pending. The House recently passed an amendment to the FAA Reauthorization Act of 2018 that would allow commercial aircraft manufacturers and commercial US airports to use non-fluorinated chemicals in firefighting foam starting in two years. The PFAS Registry Act (S. 2719, H.R. 5921) would direct the U.S. Department of Veterans Affairs to establish a registry to ensure that veterans possibly exposed to PFAS via firefighting foam on military installations get information about exposure and treatment. The PFAS Accountability Act (S. 3381) in the Senate and the PFAS Federal Facility Accountability Act in the House would encourage federal agencies to establish cooperative agreements with states on removal and remedial actions to address PFAS contamination from federal facilities, including military installations. The PFAS Detection Act (S. 3382) would require USGS to develop a standard to detect and test for PFAS in water and soil near releases, to determine human exposure, and to report data to federal and state agencies and relevant elected officials.

Tomorrow, a Senate Homeland Security & Government Affairs subcommittee is holding a hearing on the “federal role in the toxic PFAS chemical crisis” at which EPA and DOD representatives will be testifying. It is critical that these agencies provide members of the public with clear answers on how they will be doing their jobs to protect all of us from further PFAS contamination. Take action with us to hold these agencies accountable today.

Action Needed to Address the US Military’s PFAS Contamination

There was dead silence at a community meeting last week in Portsmouth, New Hampshire after Nancy Eaton spoke before a panel of top federal health officials planning a study of per- and polyfluoroalkyl substance (PFAS) contamination at the former Pease Air Force Base. She described how her husband David, who was healthy all his life, died quickly in 2012 at 63 from pancreatic cancer.

David Eaton served four decades in the Air National Guard based out of Pease and saw duty in Vietnam, the Persian Gulf, and Iraq. Nancy said David drank the base’s water and the coffee brewed with the same water every day, on top of being exposed to toxic chemicals as an airplane mechanic.

“He loved every second of the 40.7 years he proudly served our country,” Eaton told the panel of experts from the Agency for Toxic Substances and Disease Registry (ATSDR), a division of the Centers for Disease Control and Prevention. Unfortunately, Eaton said, “my husband and I never had the chance to retire together, take a couple of trips nor build our retirement home.”

She said she was not alone as a premature widow, noting that several of her husband’s comrades died of cancer and others have  suffered tumors of the brain, lung, mouth and breast. She concluded by saying that her husband and others served their country without asking questions, “never thinking their lives would be cut short due to carcinogens on the job. Our families deserve answers as well as preventing this from happening again.”

Eaton’s sentiments were echoed by Andrea Amico, a co-founder of Testing for Pease, a group of parents whose children drank contaminated water at the site’s daycare center, and whose demands for a thorough investigation of PFAS harms have now resulted in establishing Pease as a key location in the first nationwide federal study of those harms.

A nationwide problem

Eaton’s and Amico’s pleas for answers are part of a growing chorus across the country. The latest scientific research suggests that this group of chemicals is more harmful to human health than previously recognized. PFAS, common in many household products such as non-stick cookware and water-repellent carpeting, are linked to several cancers, liver damage, thyroid disease, asthma and reproductive and fetal health disorders. A report from the Environmental Working Group says as many as 110 million Americans may be drinking PFAS-laced water. Nowhere, however, is the problem as acute as it appears to be on US military sites where PFAS compounds have been heavily used for training in fire suppression and the chemicals have been routinely allowed to drain into groundwater.

According to a new report and interactive map from the Union of Concerned Scientists (UCS), the levels at Pease, a Superfund site, are 43,545 times above what the ATSDR considers safe, and some 30,000 people live within three miles of the base. Some 9,500 employees work in the 250 businesses in the current trade port that was once part of the base itself. The military has shut down the worst-polluted drinking water well, but residents remain concerned about the groundwater contamination. An Air Force official told the Portsmouth Herald in July that it could take up to a decade to resolve the issue even with state-of-the-art water filtration.

Worse yet, Pease is just one of more than 100 military sites with similar problems. The UCS study looked at 131 military sites and found that all but one had levels in excess of what the government now considers safe. The vast majority (some 87 bases) reported PFAS concentrations more than 100 times safe levels, and 10 of those military sites had concentrations 100,000 to 1 million times higher than the government’s recommended “safe” levels.

Trump Administration tried to suppress new findings

Concern has mounted over the past year about the danger posed by the PFAS group of chemicals. In May, UCS published emails obtained under the Freedom of Information Act that indicated that the Trump administration was suppressing a study reviewing the link between PFAS and disease in humans.

In a now-infamous January 30 email discussing the decision to delay publication of the PFAS report, Office of Management and Budget Associate Director James Herz relayed the concern of an aide in the White House Office of Intergovernmental Affairs that: “The public, media, and Congressional reaction to these numbers is going to be huge. The impact to EPA and [the Defense Department] is going to be extremely painful.” Herz fretted about “the potential public relations nightmare” it would be for the report to be released.

Thanks to pressure brought after the emails were exposed, the 850-page ATSDR study on the toxicity and prevalence of 14 PFAS compounds was released in June. The study sets new recommendations for safe exposure to PFAS compounds 7-to-10 times lower than currently recommended by the EPA. Notably, though, this group of chemicals has so far managed to escape any enforceable limits because EPA has never officially listed them on its registry of toxic chemicals.

It is nothing short of outrageous that worries about a public relations headache almost won out over the actual public health nightmare: millions of Americans face greater risks from exposure to PFAS than previously recognized. The entire class of PFAS should be immediately registered on EPA’s list of toxic pollutants, and the Defense Department should ask Congress for enough resources to clean up the contamination.

Pressure mounts for action

Andrea Amico from Pease, age 36, said she worries every day if and when health effects might show up in her husband, who drank Pease water at work for nine years and whose first two children drank the water in day care. She said close to 100 people have contacted her, worried that their serious illnesses are related to PFAS exposure. “We’ve heard from women having problems with fertility to the point where one woman told me she worked in an office where all the women had fertility problems,” she said.

Similar concerns are being echoed all around the country. This week, Amico is scheduled to testify before a Senate subcommittee alongside Arnie Leriche, a former EPA environmental engineer who lives near the former Wurtsmith Air Force Base in northern Michigan. PFAS compounds there were recorded at 73,636 times the level considered safe by ATSDR’s new recommendations.

Asked how it feels to be advocating for answers on pollution after investigating it at EPA for nearly four decades, Leriche said he felt “a lot of disappointment that the agency hasn’t been able to conduct the mission the way it should,” both amid the Trump administration’s current attempt to gut clean air and water rules of the 1970s, and the inconsistent focus and inadequate EPA funding over the years by both Democratic and Republican administrations. “But there’s a lot of career employees at EPA, engineers, Forest Service who are sticking their neck out, not going to let this happen quietly.”

Michigan, with its heavy industrial history, has a long, troubled relationship with PFAS. Clean water advocates were staggered this summer when the MLive Media Group, which represents many newspapers in mid-sized cities in the state, discovered through a Freedom of Information Act request that the state Department of Environmental Quality sat for nearly six years on a report warning of potential widespread PFAS contamination. It is the same DEQ that sat on the Flint Water Crisis.

The report, prepared by state environmental specialist Robert Delaney, found perfluoroalkyl levels in fish “on an order of magnitude higher than anything documented in the literature to date.” With contamination evident throughout the food web, from algae and zebra mussels to mink and bald eagles, the report indicated Michigan was suffering from “widespread contamination,” with little monitoring and “an endless list of things that could and possibly should be done. However, first, those in authority have to be convinced that there is a crisis.”

One of those trying to convince the state to care about PFAS, despite its horrific neglect of Flint, is Cody Angell. He has been a lead advocate for testing and remediation of industrial PFAS in communities north of Grand Rapids. Like the residents at Pease, Angell says Michigan has many anecdotal cases of cancer and residents want answers and action to address the problem.

“Every day, more and more people are realizing that government has failed us,” Angell said. “When you hear that government is more concerned about public relations than people, that is really like saying they are knowingly poisoning people. But as long as we can keep PFAS in the news, we’ll get results at some point.”

Back at Pease, keeping PFAS in the news is precisely Amico’s goal as well. At the meeting, she read a letter from Doris Brock, widow of Kendall Brock, who served 35 years at Pease and died last year of bladder and prostate cancer at the age of 67. Doris Brock said she keeps a list of 70 members of military families she knows were hit with organ cancers; 40 are now dead.

“We don’t want just studies,” Amico said. “We want medical monitoring and physicians have to know what these chemicals are so people can be treated properly.”

Ken Lauter, 69, who worked at Pease for 24 years in military and commercial aircraft maintenance and security, told the community meeting that he too has been battling cancer. His lymphoma was discovered when he went to the doctor for what he thought was a pinched nerve that limited use of his left arm.

He said that in the 1990s, people’s suspicions abounded about the water as taped signs with skulls and crossbones appeared above water fountains. Once, when an air tanker exploded, he said, he and other responders waded knee deep in firefighting foam, a key source of PFAS.

“We were in (chemicals) up to our neck doing our jobs every day . . . I did my job. So did these other guys and this is the price we pay. Investigate. Please check it out for these people,” Lauter said.

It’s long past time for the government to act.

Puerto Rico: Maria’s Laboratory for Scientific Collaboration

CienciaPR’s education specialist, Elvin Estrada, trains educators at the Boys and Girls Club of Puerto Rico on how to use the Foldscope, a low-cost paper microscope, as part of CienciaPR’s Science in Service of Puerto Rico initiative. Each of the 500 students participating in the project will receive the instrument free of charge to observe the biological diversity in a terrestrial ecosystem that was impacted by Hurricane Maria. Photo courtesy of Mónica Feliú-Mójer.

Reposted with permission from STEM + Culture Chronicle, a publication of SACNAS – Advancing Chicanos/Hispanics & Native Americans in Science

When Hurricane Maria hit Puerto Rico on September 20, 2017, Ubaldo Córdova-Figueroa’s primary concern was for the safety of his students and research assistants. With communications shut down, it took over a month for the professor of chemical engineering at the University of Puerto Rico–Mayagüez to contact them all. “Having no access to my students or my research-lab members was very painful because I didn’t know what was going on with them. I just wanted to know that they were fine,” he says. Everyone was okay but became anxious when research was interrupted for months. Córdova-Figueroa had to reassure them that it was okay, to relax, and wait for things to return to normal. It was, after all, a catastrophe.

Córdova-Figueroa says many scientists are concerned about their future in research at the university, which was facing a fiscal cliff before the hurricane. “They are afraid that they may not get the support they need to recover,” he says. But consensus is building that devastation from last year’s hurricanes could change the way science is approached in Puerto Rico. The post-hurricane conditions provide a unique environment to study. There is also an opportunity to develop local, non-scientific and scientific collaborations as well as attract outside collaborators to work together across disciplines. The results could impact resiliency and innovation both locally and globally.

Local collaborations

“When you lose energy as we did after Maria, not only does your grid go down but with it goes your health system, your communication, your transportation system, your food distribution system, your education system,” says Cecilio Ortíz-García, associate professor of social sciences at the Mayagüez campus. “But none of those realms, in non-emergency times, talk to each other or understand each other. It’s time to establish a platform for cross-communication.”

The University only has a few pictures of the classrooms because most places were difficult to get through and some were forbidden because of fungus contamination.

Ortíz-García is on the steering committee of the National Institute of Island Energy and Sustainability (INESI) at the University of Puerto Rico. INESI promotes interdisciplinary collaboration on energy and sustainability problems and has a network of 70 resources across the university’s 11 campuses. In the wake of Hurricane Maria, it has been able to help establish collaboration at local, community, and municipal levels as well as with some of the stakeholders, says Marla Pérez-Lugo, professor of social sciences at the Mayagüez campus, who is also on the steering committee.

The absence of strong federal and central government involvement following Hurricane Maria has prompted organized innovation and resilience at the local level that was never expected, Ortíz-García says. The mayor of San Sebastian pulled together volunteers who were certified electricians, former power-utility employees, retired employees, and others, like private construction contractors with heavy equipment. “They put those guys together and started electrifying neighborhoods on their own,” said Professor Ortíz-García.

Solving real-life problems

Ciencia Puerto Rico (CienciaPR) is a non-profit organization that promotes science communication, education, and research in Puerto Rico. They received a grant from the National Science Foundation to implement project-based science lessons on disaster-related topics. The middle school education program features lesson activities that are related to what’s happening in Puerto Rico as well as culturally relevant.

The first lessons implemented included how to properly wash hands when clean water is scarce and understanding the effect of the storm on the terrestrial environment.

Educators at the Boys and Girls Club of Puerto Rico learn how to use the Foldscope.

Each child is given a paper microscope and asked to conduct a research project to answer a question they have about how the storms have affected the environment. At the end of June, the students will share their findings with the community.

The project is funded by a RAPID grant, which is awarded for one to two years to respond to emergency or one-off events. The Foundation has awarded about 40 grants associated with Hurricane Maria, according to their website. Most of them are RAPID grants and about 25 percent of them have been awarded to scientists in Puerto Rico.

RAPID grants associated with Hurricane Maria have required INESI to adapt its vision, says Professor Pérez-Lugo. INESI’s basic mission is to look at Puerto Rico from a local perspective to insert local knowledge into the policy process. But the flood of effort coming from outside universities has required them to attempt to identify and coordinate those doing research and relief work in Puerto Rico. INESI initially counted 20 universities conducting research, but other initiatives and projects involving energy and the electric system have been identified since. In some cases, there were three or four teams from the same university working in Puerto Rico that were unaware of the presence of the other teams. “So, these universities found out about their colleagues through us,” said Pérez-Lugo.

The workers and researchers tended to be concentrated in only a couple of municipalities, leaving many areas neglected. INESI coordinated their efforts to avoid fatigue, to avoid saturation in some areas, and to distribute aid in a more just and equal way, Pérez-Lugo says.

Updating approaches to disaster

Most classrooms at the University of Puerto Rico were filled with water, some with vegetation, and many with broken equipment.

According to Ortíz-García, INESI was founded prior to the arrival of Hurricane Maria in recognition of the flaws associated with the fragmented organization at the university. Like most universities, it is organized to accomplish the goals of teaching, research, and service, which is an organization best suited to the scientific processes of discovery, knowledge creation, and scientific inquiry. “But these are different times,” says Ortíz-García, “with problems that are not aligned with a fragmented, unidisciplinary approach.”

“But that’s an outdated approach because now we know that energy transitions are embedded in everything that society values, from water to health, to safety and security, and to food. So, multiple organizations will need to be involved to solve the problem and they need a common language to fix something.”

INESI has been working toward taking the University of Puerto Rico to the next level of university organization, with networks of interest and practice within and throughout interconnecting disciplines. “Instead of concentrating on a scientific development in one discipline, scientists need to concentrate on the effective design of solutions to issues that don’t belong to any discipline, like climate change,” says Ortíz-García.

Collaborative convergence platforms such as INESI can foster interdisciplinary dialogue and the generation of solutions for these issues. Now, inspired by the influx of representatives from other universities to Puerto Rico in the wake of Hurricane Maria, INESI wants to build a platform of platforms.

RISE Puerto Rico

A group representing an inter-university collaborative convergence platform will meet for a foundational catalyst workshop at the end of June. Twenty-seven people from ten universities have already accepted the invitation and will meet face to face for the first time.

“The platform that we’re looking to build here, we’ve already preliminarily named it RISE Puerto Rico, which stands for Resiliency through Innovations in Sustainable Energy,” Pérez-Lugo says.

Starting these dialogs now will go a long way, Ortíz-García says, in reorganizing academic environments toward finding the solutions necessary to fix these problems. “In addition, it can foster innovation in ways our own organizational structure could never, ever think of because you would have spin-off after spin-off of academic conversations not only with the scientists but also community and other stakeholders’ knowledge that is out there from leading these events themselves,” he says.

Córdova-Figueroa is optimistic about the research opportunities in Puerto Rico.

He would like to see many scientists from around the world take advantage of the myriad research opportunities available. “Come to Puerto Rico,” he says. “You will learn something great here.”

Dr. Kimber Price is a science communications graduate student at the University of California, Santa Cruz. Follow her on Twitter: @LowcountryPearl

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Understanding 1.5°C: The IPCC’s Forthcoming Special Report

Photo: IISD

The Intergovernmental Panel on Climate Change (IPCC) – an international body that develops non-policy prescriptive climate science assessments for decisionmakers – is currently compiling a Special Report that will provide information on what it would take to limit global warming to 1.5 degrees Celsius above pre-industrial levels. The report will also assess the climate impacts that could be avoided by keeping warming to this level, and the ways we can limit the worst impacts of climate change and adapt to the ones that are unavoidable. Report authors and government representatives will meet in Incheon, Republic of Korea from October 1-5 to review the report, with the report’s Summary for Policymakers due to be released on October 7 at 9 p.m. Eastern US time (October 8 at 10:00 local time (KST)). The report is slated to come out just as nations look towards revising the commitments they made to achieve the goals of the Paris Agreement.

The Paris Agreement is a worldwide commitment adopted in 2015 under the United Nations Framework Convention on Climate Change (UNFCCC) to reduce global warming emissions and limit the increase in global temperature to well below 2°C. More specifically, the Paris Agreement includes a goal of “Holding the increase in the global average temperature to well below 2°C above pre-industrial levels and pursuing efforts to limit the temperature increase to 1.5°C above pre-industrial levels, recognizing that this would significantly reduce the risks and impacts of climate change.” Small Island Developing States that are disproportionately vulnerable to global warming were pivotal in the inclusion of the 1.5°C goal.

It has been recognized that efforts beyond those spelled out in the commitments made to meet the goals of the Paris Agreement (the Nationally Determined Contributions) will be necessary to limit global warming to 1.5°C above pre-industrial levels. As a result, policymakers are interested in what it would take to achieve this goal, as well as the benefits and tradeoffs to consider as countries look to ramp-up their commitments. (This report will fulfill an invitation made by UNFCCC member countries including the U.S. during the adoption of the Paris Agreement for, “the Intergovernmental Panel on Climate Change to provide a special report in 2018 on the impacts of global warming of 1.5 °C above pre-industrial levels and related global greenhouse gas emission pathways.”) The report is thus intended to inform such deliberations and respective domestic and international climate policy.

Just preceding the invitation to the IPCC to produce the Special Report, UNFCCC member countries also decided, “to convene a facilitative dialogue among Parties in 2018 to take stock of the collective efforts of Parties in relation to progress towards the long-term goal referred to in Article 4, paragraph 1, of the Agreement and to inform the preparation of nationally determined contributions pursuant to Article 4, paragraph 8, of the Agreement.” The Special Report will be an important input into this global stock-take, which will be a prominent feature of the forthcoming UNFCCC Conference of Parties (COP24) in Poland in December of this year.

In addition to its Summary for Policymakers, the report will have five underlying chapters, as well as an introductory section, several break-out boxes, and frequently asked questions. The chapter titles will be as follows:

  • Chapter 1: Framing and context
  • Chapter 2: Mitigation pathways compatible with 1.5°C in the context of sustainable development
  • Chapter 3: Impacts of 1.5°C global warming on natural and human systems
  • Chapter 4: Strengthening and implementing the global response to the threat of climate change
  • Chapter 5: Sustainable development, poverty eradication and reducing inequalities

Among other topics, the report will provide information on the global warming emissions reductions required to keep global warming to 1.5°C relative to the emissions reductions necessary to limit warming to 2°C. And it will compare the different paths nations can take to achieve these emissions reductions, including the opportunities and challenges that meeting the 1.5°C goal will present from socio-economic, technological, institutional, and environmental perspectives. The report will also assess the impacts that could be avoided if warming is kept to 1.5°C instead of 2°C, as well as the emissions reduction options relevant to keeping warming to 1.5°C and the options available to prepare for projected impacts. Furthermore, authors have been tasked with considering how to limit warming to 1.5°C together with sustainable development and poverty eradication efforts, and the implications of pursuing this goal for ethics and equity.

The report is being prepared by experts from a diverse array of countries and institutions, including from the United States. The IPCC does not conduct new science. Instead, authors reviewed the best available, peer-reviewed literature relevant for each chapter, and employed set criteria to characterize the evidence relevant to key topics, including the level of confidence, agreement, and uncertainty in the evidence base (as an example, here is a link to the criteria employed in the IPCC’s Fifth Assessment Report).

The IPCC’s Special Report on 1.5°C Warming has undergone a lengthy review process, including an internal review, and multiple expert and government reviews. For example, 2000 experts registered to review the First Order Draft, along with 489 government reviewers from 61 countries. The review process provides a mechanism for engaging and incorporating input from a diverse and inclusive set of experts. Authors are required to consider and respond to each comment received – responses that are then made publicly available.

The Union of Concerned Scientists will be reviewing the released documents on October 8th and posting a series of blogs that will cover key aspects of the report. Although the report is not public yet, the latest scientific literature (e.g. that assessed in the recent U.S. Climate Science Special Report) has only underscored the urgent need for action to limit the heat-trapping emissions that are fueling climate change. This new IPCC Special Report will almost certainly make this point even clearer, as it will show the world what can be avoided if we ramp down emissions now. There is too much at stake for humanity to indulge in further delay or inaction.

The Coal Bailout Nobody is Talking About

Photo: Mike Poresky/CC BY (Flickr)

If you are reading this blog, chances are you are either an energy economist, a grid geek, or maybe my mother. Regardless, this administration seems intent on trying various coal bailout attempts. Hopefully, you’ve already read up on the high costs and low benefits of such bailouts, how the first attempt failed, and how they’re at it again. My latest research has uncovered that every month, millions of consumers are unwittingly bailing out coal-fired power plants, to the tune of over a billion dollars a year.

Merchant vs monopoly

Before we dig into the numbers, let’s talk about the two types of utilities that own coal power plants.

Merchant utilities primarily rely on revenues from the competitive power markets to make money. Monopoly utilities, on the other hand, own power plants and directly serve retail customers. They are household names in the areas they operate, probably because they send those households a monthly bill; those bills are how those utilities make money.

The first part of this new research looked at how these two types of utilities operated in four large competitive power markets. These markets were designed such that power plants that are cheap to operate should run more often than plants that are expensive to operate.

My analysis looked at market prices for energy where every coal plant is located and calculated how often a plant would operate based on market prices alone. I then compared that “expected” value with the power plant’s actual operational data.

For the most part, the merchant power plants operated at or below the expected value. Power plants that were owned by monopolies, however, typically operated more than would be expected.
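
To make that comparison concrete, here is a rough sketch of the calculation with made-up prices, costs, and generation figures. The “expected” capacity factor assumes the plant runs only in hours when the local market price covers its marginal cost; the actual capacity factor comes from reported generation over the same hours.

```python
# Rough sketch of the expected-vs-actual comparison; all numbers are hypothetical.

hourly_prices = [18.0, 22.5, 31.0, 27.0, 35.5, 19.0, 24.0, 29.5]  # $/MWh at the plant's node
marginal_cost = 25.0             # $/MWh to run the coal unit
capacity_mw = 600.0
actual_generation_mwh = 3400.0   # reported output over the same hours

# "Expected" operation: run only in hours when the market price covers marginal cost
profitable_hours = [p for p in hourly_prices if p >= marginal_cost]
expected_capacity_factor = len(profitable_hours) / len(hourly_prices)

# Actual operation, derived from reported generation
actual_capacity_factor = actual_generation_mwh / (capacity_mw * len(hourly_prices))

print(f"expected capacity factor: {expected_capacity_factor:.0%}")
print(f"actual capacity factor:   {actual_capacity_factor:.0%}")
if actual_capacity_factor > expected_capacity_factor:
    print("This plant ran more than market prices alone would justify.")
```
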

The expected vs actual value of several large utilities. You would expect the dots to fall along the diagonal line, but monopoly owned plants tend to overgenerate.

The stark difference begged for additional investigation, so I used hourly data to conduct a detailed analysis of each power plant to discover if power plants were operating at times when cheaper energy was available.

Billion-dollar bailout

Some utilities appear to be finding a way to undermine the competitive market structure that would have lower-cost resources operate more and higher-cost resources operate less. Expensive coal plants—which are objectively not competitive—are being operated in a way that costs consumers money, reduces flexibility, and exacerbates existing pollution problems.

Based on my analysis, monopoly utilities appear to be running more expensive plants while depriving their customers of access to cheaper (and likely cleaner) sources of energy.

This new analysis builds on earlier work of mine that investigated this issue in the Southwest Power Pool, SPP, the market that covers several great plains states. My original analysis calculated that ratepayers were incurring a burden of $150 million a year from just a few power plants. The new analysis (which includes all coal units in SPP) indicates that the number is closer to $300 million a year, just for SPP.

I looked at four large electricity markets: SPP, ERCOT, MISO, and PJM. Together, these four markets span from New Jersey to North Dakota and from Texas to Virginia.

The latest results suggest that, across the four coal-heavy energy markets, coal-fired power plants incurred $4.6 billion in market losses over the past three years, or roughly $1.5 billion in market losses each year. Most of these “losses” were incurred by power plants owned by monopoly utilities and are not absorbed by the investors or owners. Rather, those costs were likely covered by customers. Consequently, I estimate this practice places at least a $1 billion burden on utility ratepayers each year.

New spin on old news

The fact that so much coal-fired power is uneconomic is not new news. The financial woes of coal have been well documented by UCS, Rhodium Group and Columbia University, Bloomberg New Energy Finance, Moody’s, Bank of America Merrill Lynch, MJ Bradley, Rocky Mountain Institute, UBS, Synapse Energy Economics, IEEFA, and others.

My new research differs in two ways. First, it quantifies the financial impact on consumers when utilities opt for dirty and expensive fuel over cleaner and cheaper alternatives. Second, this work focuses on a very specific aspect of how these coal plants operate and draws a somewhat unintuitive conclusion:

Some coal-fired power plants might make more (or lose less) money by operating less. 

My analysis further suggests that, for at least some of the owners of these power plants, the current economic woes are self-inflicted.

Throwing good money after bad

If it costs $25 to produce a unit of energy and the market price is $30 per unit then it makes sense to operate that power plant and take that $5 margin and use it to pay down debt or other fixed costs.

If market prices stay at $30, the power plant keeps operating as much as it can, pays down its fixed costs, and eventually builds up a profit. If the market price drops to $20 and the unit keeps operating, the owner’s profits begin to erode. The longer the owner does this, the more the profits erode.
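
A toy calculation makes the erosion easy to see. Each hour the unit runs, it earns the difference between the market price and its production cost, so hours below cost eat into the margin earned during profitable hours. The prices below are hypothetical.

```python
# Toy example of margin erosion; all prices are hypothetical.
production_cost = 25.0  # $/MWh

def running_margin(prices):
    """Cumulative margin ($ per MWh of output) if the plant runs through every hour."""
    return sum(price - production_cost for price in prices)

strong_market = [30.0] * 10              # ten hours earning a $5 margin
weak_market = [30.0] * 10 + [20.0] * 10  # then ten hours losing $5 each

print(running_margin(strong_market))  # 50.0 -> money available for fixed costs and debt
print(running_margin(weak_market))    # 0.0  -> the low-price hours erased the gains
```
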

“Stop throwing away money,” is not something you’d expect to have to tell a corporation.

Some plants generate at a loss so often that they make it impossible for the power plant to make money.

If utilities allowed the market price to determine when to run, this wouldn’t happen. However, in the competitive markets I analyzed, power plant owners can choose to ignore price signals and “self-commit” or “self-schedule” their plants, effectively bypassing the dispatch decisions of the market’s independent system operator.

If a merchant-owned power plant does this, it does so at its own risk. But when monopoly utilities do this, customers bear the burden. Below is the list of the 15 power plants that each imposed a $100 million burden on ratepayers over the three-year study period.

Power plants that imposed at least $100 million on ratepayers. Note that many of the merchant-owned plants (highlighted in purple) burn waste coal and/or are “cogen” facilities.

The common excuses I hear

I’ve talked about this issue with advocates, economists, lawyers, engineers, market monitors, utility operators, and reporters. I’d like to take a moment to share some of the things I’ve heard in response to my analysis, and my responses to them.

The most common: Aren’t these plants needed for reliability? 

Markets are designed to provide low-cost, reliable power. The idea that a power plant needs to bypass the market’s decision-making process and self-select (as opposed to market-select) presumes that the market is incapable of doing its job. Arguably, if the clearing price in a market is constantly below a power plant’s production costs, then there were other resources available to reliably provide lower-cost power. In some cases, the plants might be needed in some months but not others, like the municipal coal plant owner in Texas that realized it only made sense to operate in the summer months and decided to sit idle 8 months of the year. The municipality still provides electricity to its customers year-round; it just decided it didn’t make sense to burn $25/MWh coal in a $20/MWh market.

At the end of the day, this research was not designed to indicate or evaluate reliability and makes no judgment about the “need” for any of these plants for reliability purposes.

The most insulting: You just don’t understand how this works.

After the SPP report came out, SPP and utility officials challenged my conclusions (oddly, they did not contest my results). According to E&E news, that pushback included the specious argument that “[w]holesale electric rates do not directly correlate with retail electricity rates…” And, “[w]holesale electric rates also do not reflect other costs, like the price of ensuring the grid’s reliability, or utilities’ long-term fuel supply contracts”

Really? Retail and wholesale are different? Please, tell me more.

Of course retail rates are different, they include additional costs (for example, distribution system costs). Yes, regulated monopolies are allowed to recover prudent capital costs from the past, but we are only talking about operating costs.

None of these arguments changes the fact that when wholesale prices are low, there is an opportunity for utilities to buy lower-cost energy off the market and pass along savings to customers.

The silliest: I have the right to “self-supply.”  

When monopoly utilities joined competitive markets, they did so voluntarily. In fact, some had to jump through hoops to do it. These utilities often have a “least cost” obligation, meaning they are supposed to provide electricity to retail customers at the lowest reasonable cost. Now that lower-cost resources are available, utilities are desperate to retain their monopoly rights to supply their customers with resources they own. That’s just silly from any point of view other than the plant owner’s.

The most technical: Fuel contracts turn fuel costs into fixed costs.

Because this is the most technical excuse, it is also the most complex and not easily handled in a few-hundred-word response. Maybe I’ll come back to this one in a future blog, but in the meantime…

Many coal-fired power plants enter into fuel contracts with “take-or-pay” provisions. Utilities claim this means there is effectively no cost to burning the fuel. First, most contracts can be renegotiated. The fuel contracts I have reviewed are akin to a rental agreement: yes, technically, you are locked in for a number of years, but typically there are ways to negotiate your way out. Second, the accounting logic used to justify discounting the coal costs can produce sub-optimal results when companies fail to properly account for opportunity costs, as the sketch below illustrates.
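A minimal, hypothetical sketch of that opportunity-cost point: even when the take-or-pay payment is truly sunk, the coal itself usually has an alternative use, such as being banked and burned in a higher-priced month, and ignoring that value makes running in a low-priced market look cheaper than it really is. The numbers below are invented for illustration.

# Illustrative only: the "free fuel" view vs. an opportunity-cost view of
# running a coal unit under a take-or-pay contract. Hypothetical numbers.
market_price_now = 20.0     # $/MWh clearing price today
variable_om = 4.0           # $/MWh non-fuel operating cost
coal_value_later = 28.0     # $/MWh the same coal could earn if banked and
                            # burned in a higher-priced month

naive_cost = variable_om                        # "the coal is already paid for"
opportunity_cost = variable_om + coal_value_later

print(naive_cost < market_price_now)          # True  -> naive view says "run"
print(opportunity_cost < market_price_now)    # False -> better to buy from the
                                              #          market and bank the coal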

What’s next?

This research raises interesting questions, including:

  • Does this impact how we value energy efficiency and renewable energy?
  • Are power plants owned by monopoly utilities receiving de-facto ‘out of market’ subsidies?
  • When do fuel contracts and fuel cost accounting become imprudent?
  • Are inflexible coal units crowding out renewables on the transmission system?
  • Is coal partially to blame for negative energy market prices?

UCS wants allies, utilities, and decision makers to look at this question of operating uncompetitive coal plants without regard for the availability of lower-priced energy. Utilities have the ability to stop engaging in this practice; if they don’t, regulators have the authority to create (dis)incentives to help end it. If neither of those groups acts, consumer and environmental groups would seem well aligned to work together to stop it.


In a Warming World, Carolina CAFOs Are a Disaster for Farmers, Animals, and Public Health

North Carolina hog CAFO in Hurricane Florence floodwaters, September 18, 2018. Photo: Larry Baldwin, Crystal Coast Waterkeeper/Waterkeeper Alliance

In the aftermath of Hurricane Florence, I’ve joined millions who’ve watched with horror as the Carolinas have been inundated with floodwaters and worried about the various hazards those waters can contain. We’ve seen heavy metal-laden coal ash spills, a nuclear plant go on alert (thankfully without incident), and sewage treatment plants get swamped. But the biggest and most widely reported hazard associated with Florence appears to be the hog waste spilling from many of the state’s thousands of CAFOs (confined animal feeding operations), which threatens to wreak lasting havoc on public health and the local economy.

And while the state’s pork industry was already under fire for its day-to-day impacts on the health and quality of life of nearby residents, Florence has laid bare the lie that millions of animals and their copious waste can be safely concentrated in flood-prone coastal areas like southeastern North Carolina.

CAFO “lagoons” are releasing a toxic soup

The state is home to 9.7 million pigs that produce 10 billion gallons of manure annually. As rivers crested on Wednesday, state officials believed that at least 110 hog manure lagoons—open, earthen pools where pig waste is liquified and broken down by anaerobic bacteria (causing their bubblegum-pink color) before being sprayed on fields—had been breached or inundated by flood waters across the state:

The tally by the North Carolina Department of Environmental Quality is rising rapidly (it was just 34 on Monday). Perhaps not surprisingly, the state’s pork industry lobby group is reporting much smaller numbers: by Wednesday afternoon, the North Carolina Pork Council’s website listed only 43 lagoons affected by the storm and flood.

In any case, the true extent of the spills may not be known for many days, as extensive road closures in the state continue to make travel and assessment difficult or impossible.

The scale of North Carolina’s CAFO industry is shocking

In 2016, the Waterkeeper Alliance and the Environmental Working Group used federal and state geographical data and analyzed high-resolution aerial photography to create a series of interactive maps showing the locations and concentration of CAFOs in the state. The map below shows the location of hog CAFOs (pink dots), poultry CAFOs (yellow dots), and cattle feedlots (purple dots) throughout the state.

Waterkeeper Alliance and the Environmental Working Group used public data to create maps of CAFO locations in North Carolina in 2016. For more information and interactive maps, visit https://www.ewg.org/interactive-maps/2016_north_carolina_animal_feeding_operations.php#.W6KBLPZReUk.

Note the two counties in the southeastern part of the state, Duplin and Sampson, where the most hog CAFOs are concentrated—nearly as pink as a hog lagoon, these counties are Ground Zero for the state’s pork industry. In Duplin County alone, where hogs outnumber humans 40-to-1, the Waterkeeper/EWG data show there were, as of 2016, more than 2.3 million head of swine producing 2 billion gallons of liquid waste per year, stored in 865 waste lagoons. (Duplin County was also home to 1,049 poultry houses containing some 16 million birds that year.)

The state’s CAFOs harm communities of color most

“Lagoon” is a curious euphemism for a cesspool. Even without hurricanes, these gruesome ponds pose a hazard to nearby communities. In addition to the obvious problem of odor, they emit a variety of gases—ammonia and methane, both of which can irritate eyes and respiratory systems, and hydrogen sulfide, which is an irritant at very low exposure levels but can be extremely toxic at higher exposures.

These everyday health hazards hurt North Carolinians of color most of all. To pick on Duplin County again, US Census figures show that one-quarter of its residents are black and 22 percent are Hispanic or Latino. And a 2014 study from the University of North Carolina at Chapel Hill found that, compared to white people, black people are 54 percent more likely to reside near these hog operations, Hispanics are 39 percent more likely, and Native Americans are more than twice as likely.

What does all that mean for health and environmental justice? Residents near the state’s hog CAFOs have complained for years of sickening odors, headaches, respiratory distress, and other illnesses, and have filed (and begun winning) a series of class-action lawsuits against the companies responsible for them.

Just this month, researchers at Duke University published new findings on health outcomes in communities close to hog CAFOs in the state. They found that, compared with a control group, such residents have higher rates of infant death, death from anemia, and death from all causes, along with higher rates of kidney disease, tuberculosis, septicemia, emergency room visits and hospital admissions for low-birthweight infants. (Read the full study or this review.)

CAFO damage from Florence was predictable…and will get worse

Releases of bacteria-laden manure sludge from CAFO lagoons in flooding like we’re seeing this week compound the day-to-day problem, and they’re inevitable in a hurricane- and flood-prone state like North Carolina. Between 1851 and 2017, 372 hurricanes affected the state, with 83 making direct landfall in North Carolina. Hurricane Floyd in 1999 and Hurricane Matthew in 2016 wreaked havoc similar to what we’re seeing this week.

As you can see on the map below, Florence dumped between 18 and more than 30 inches of rain on every part of Duplin County.

http://www.nc-climate.ncsu.edu/climateblog?id=266

It’s not surprising that flooding from such an event would be severe. And while the North Carolina Pork Council called Florence “a once-in-a-lifetime storm,” anyone who’s paying attention knows it’s just a matter of time before the next one.

Millions of animals are likely drowned, starved, or asphyxiated

In addition to the effects on communities near North Carolina’s CAFOs, it’s clear that Hurricane Florence has caused tremendous suffering and death to animals housed in those facilities. Earlier this week, poultry company Sanderson Farms reported at least 1.7 million chickens dead, drowned by floodwaters that swamped their warehouse-like “houses.” Some 6 million more of the company’s chickens cannot yet be accounted for. Overall, the state Department of Agriculture and Consumer Services on Tuesday put the death toll at 3.4 million chickens and turkeys and 5,500 hogs, but those numbers may very well rise.

A major reason we don’t yet know the full extent of animal deaths in North Carolina’s CAFOs is that road closures due to flooding have cut off many of the facilities, preventing feed deliveries and inspections. Many animals likely also died in areas that experienced power failures due to the storm. According to this poultry industry document, a power outage that interrupts the ventilation system in a totally enclosed poultry CAFO can kill large numbers of birds by asphyxiation “within minutes.”

North Carolina farmers face staggering financial losses and likely bankruptcies

And what about the farmers? Many of the nation’s hog and poultry producers are already in a predicament. Corporate concentration has squeezed out many independent farmers, meaning more operate as contractors to food industry giants like Smithfield and Tyson. In the US pork industry, contract growers accounted for 44 percent of all hogs and pigs sold in 2012. The farmers have little power in those contracts, and an early action of the Trump administration’s USDA removed newly gained protections against exploitation by those companies. The administration’s trade war isn’t helping either.

As one expert in North Carolina put it as Hurricane Florence approached:

A farmer (who operates a CAFO) has very little flexibility. They take out very large loans, north of a million dollars, on a facility that is specifically designed by the industry, as well as how the facility will be managed. Remember that 97% of chickens and more than 50% of hogs are owned by the industry. These farmers never even own the animals. But if the animal dies, and how to handle the waste, that’s on the farmer. That’s their responsibility.

I know many individual farmers who do the best they can, who work as hard as they can, who treat their animals with respect. But there’s only so much they control. They can’t control the weather. They can’t control the hurricane. These farmers are part of an industry that says, for the sake of efficiency, you have to put as many animals as possible into these facilities.

Post-Florence, these contract farmers are likely to receive inadequate compensation for the losses of animals in their care. A series of tweets this week by journalist Maryn McKenna, who has studied the poultry industry, illuminates the issues:

So, as the waters recede, many hog and poultry farmers are about to find themselves responsible for a ghastly cleanup job. Imagine returning home to find thousands of bloated animal corpses rotting in the September sun. They were your livelihood, and now they’re not only lost, but an actual liability you must pay to have hauled away.

Public policies should encourage sustainable livestock production, not CAFOs

And so it goes for farmers in today’s vertically-integrated, corporate-dominated, CAFO model. But it doesn’t have to be this way. Public policies can give more power to livestock farmers in the marketplace, protect animals and nearby communities from hazards associated with CAFOs, and facilitate a shift to more environmentally and economically sustainable livestock production practices.

If Hurricane Florence teaches us anything, it’s that flood-prone coastal states like North Carolina are no place for CAFOs. At a minimum, the state must tighten regulations on these facilities to protect public health and safety. A 2016 Waterkeeper Alliance analysis found that just a dozen of North Carolina’s 2,246 hog CAFOs had been required to obtain permits under the Clean Water Act, with the rest operating under lax state regulation. The state and federal governments should also more aggressively seek to close down hog lagoons and help farmers transition to more sustainable livestock practices or even switch from hogs to crops. A buyout program already exists but needs much more funding.

In the meantime, the federal farm bill now being negotiated by Congress also has a role to play. At least one farm bill program, the Environmental Quality Incentives Program, or EQIP, has been used in ways that underwrite CAFOs. In a 2017 analysis of FY16 EQIP spending, the National Sustainable Agriculture Coalition noted that 11 percent ($113 million) of EQIP funds were allocated toward CAFO operations, funding improvements to waste storage facilities and subsidizing manure transfer costs. And the House version of the 2018 farm bill could potentially increase support for CAFOs by eliminating the Conservation Stewardship Program—which incentivizes more sustainable livestock practices and offers a 4-to-1 return on taxpayer investment overall—and shifting much of its funding to EQIP.

The post-Florence mess in North Carolina illustrates precisely why that’s a bad idea. Particularly in a warmer and wetter world, public policies and taxpayer investments should seek to reduce reliance on CAFOs, not prop them up.

Utilities Look Toward a Clean Energy Future, Yet the Administration Keeps Looking Back

Coal—and the president’s ill-conceived plan to bail out the industry—is taking up a lot of bandwidth in discussions around national energy policy. As the president and the coal industry continue to rely on dubious arguments to justify the idea of keeping economically struggling coal plants afloat, we began to wonder: what are electric utilities doing on the ground? How are they approaching the question of future investments in technology?

Today I’m handing my blog over to our 2018 UCS Schneider Fellow Eli Kahan, who has studied how utilities are planning for future electricity needs, what they are doing in terms of new investments in low-carbon generation sources, and how that compares to their stated goals.

Market pressures

In the past decade, due to advances in horizontal drilling technology and the fracking boom, domestic natural gas prices have plummeted, incentivizing many power plant owners to shift from coal to natural gas. Moreover, in the past few years, renewables such as wind and solar have continued to grow at record rates as they have become highly economically competitive with traditional forms of electricity generation. A look at Lazard’s 2017 Unsubsidized Levelized Cost of Energy report shows wind as the cheapest source of energy, with utility-scale solar not far behind.

In short, the rapid decline in coal generation has been driven primarily by market forces. While the administration’s proposed bailout might buoy a few coal plants destined for retirement in the next few years, this crutch is unlikely to keep coal afloat in a time of ever-falling costs of renewable energy, to say nothing of the urgent need to act on climate change.

Another key reason this bailout is unlikely to stick is a recent surge in announcements of utility goals to reduce CO2 emissions. Who is setting these targets? What is driving them? And how are utilities planning to meet these goals?

Utilities around the country have announced significant emissions reduction goals.

The targets

To begin answering these questions, I gathered data on more than 100 of the nation’s largest electric utilities and parent energy companies—more than 30 of which have set quantitative targets for reducing CO2 emissions, achieving higher percentages of renewable energy, and/or completely moving off of coal. Together, these companies, including 7 of the 10 largest electric utilities by market value in the country, account for nearly 40% of 2016 US electricity sales. These utilities are showing awareness of the science, costs, and their consumers’ concerns in setting sights on a lower carbon future.

While these goals come in many shapes and sizes, they all share one thing in common: additionality—all of them voluntarily exceed state goals and mandates. Moreover, the majority of these targets have been set in the past couple of years—a time when the Clean Power Plan has been on the chopping block and during a fossil-fuel-friendly presidency.

Furthermore, as the map below shows, this is a national trend—these targets are not restricted to just a few leading states but rather can be found all around the country, even in places that rely heavily on coal.

Electric utility companies with targets deemed to provide additionality. Not shown: Avangrid, Engie, Great River Energy, MidAmerican Energy, Minnesota Power, NextEra Energy, NRG, Tennessee Valley Authority. Since the analysis for this project was completed, California’s SB100 has nullified the additionality of PG&E’s and LADWP’s goals.

What else is driving this trend?

There’s little doubt that falling prices for natural gas and renewables have aligned cutting emissions with cutting costs, and the success of state renewable portfolio standards (RPSs) and greenhouse gas (GHG) emissions reductions targets has played a role in breaking up the inertia on decarbonization. But according to the utilities themselves, there exists a whole host of additional motivating factors.

For AEP’s Appalachian Power President, Chris Beam, the motivation is customer preferences:

“At the end of the day, West Virginia may not require us to be clean, but our customers are”  –Chris Beam

For Berkshire Hathaway Energy’s vice president for government relations, Jonathan Weisgall, it’s customer input focusing on the role of corporate commitments:

“We don’t have a single customer saying, ‘Will you build us a 100 percent coal plant?’…Google, Microsoft, Kaiser Permanente — all want 100 percent renewable energy. We’re really transitioning from a push mandate on renewable energy, to more of a customer pull.” –Jonathan Weisgall

For DTE, it is a science-based response to the issue of climate change:

“Through our carbon reduction plan, DTE Energy is committed to being a part of the solution to the global climate crisis. There is broad scientific consensus that achieving 80 percent carbon reduction by 2050 will be necessary to begin to limit the global temperature increase below two degrees Celsius over preindustrial levels” –DTE 2018 Environmental, Social, Governance, and Sustainability Report, p.3

Even coal-heavy American Electric Power (AEP) is moving toward a clean energy future, recognizing the value of investing in renewable energy. AEP serves customers in 11 states—including Ohio, Indiana, Kentucky, Virginia, and West Virginia. Nick Akins, AEP’s CEO, is skeptical of the administration’s plans to bail out certain coal and nuclear plants. Although his position is a bit more nuanced—claiming that coal remains important to the reliability and resiliency of the grid—Akins is clear-eyed about the future, charting AEP’s path toward a 60 percent reduction in carbon emissions by 2030, and 80 percent by 2050:

“Our customers want us to partner with them to provide cleaner energy and new technologies, while continuing to provide reliable, affordable energy.” –Nick Akins

How are utilities meeting these goals?

For the past decade, utilities have been cutting GHG emissions and costs by shedding coal and transitioning to natural gas. Natural gas is sometimes seen as a “transitional fuel” on the path to a clean energy future, because it produces 50 to 60 percent less CO2 than coal when burned. Southern Company, shown below, provides a particularly illustrative example of this coal-to-natural-gas switching.
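As a rough sanity check on that 50-60 percent figure, here is a back-of-the-envelope comparison using approximate, typical combustion emission rates per MWh generated; the exact numbers vary from plant to plant, so treat these as illustrative rather than definitive.

# Rough, illustrative comparison of combustion CO2 per MWh generated.
# Ballpark rates; actual plants vary with efficiency and fuel quality.
coal_lb_co2_per_mwh = 2200     # existing coal steam unit (approximate)
gas_cc_lb_co2_per_mwh = 900    # natural gas combined-cycle unit (approximate)

reduction = 1 - gas_cc_lb_co2_per_mwh / coal_lb_co2_per_mwh
print(f"{reduction:.0%} lower CO2 per MWh")   # roughly 59%, in the 50-60% range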


However, natural gas is not particularly clean, especially when considering available renewable energy alternatives. And with the recent dramatic cost declines in wind and solar, companies like MidAmerican Energy, shown above, have realized they can skip the “transition fuel” and get a head start on investing in a longer-term energy solution.

Over and over, it’s the same story:

“Retiring older, coal-fueled units. Building advanced technology natural gas units. Investing in cost-effective, zero-carbon, renewable generation” – WEC Energy Group

“The company expects to achieve the reductions through a variety of actions. These include replacing Kentucky coal-fired generation over time with a mix of renewables and natural gas” – PPL

“Traditionally, our generation portfolio has depended on coal, but we are transitioning our energy supply away from coal to rely more on renewable energy and natural gas generation as backup. From 2005 through 2026, we will retire more than 40 percent of the coal-fueled capacity we own under approved plans, and if regulators approve our proposed Colorado Energy Plan in 2018, we will retire even more.” – Xcel

In short, utilities around the country are retiring coal, and investing in natural gas and renewables. But it’s important to emphasize that there is a serious risk of an overreliance on natural gas. Natural gas is subject to price volatility (putting consumers at risk of higher prices) and is inconsistent with long-term deep decarbonization targets. Utilities prioritizing clean, cost-effective renewable energy instead are protecting their customers from gas-related risks.

Connecting the dots

We’ve seen a recent flurry of utility announcements of decarbonization goals and renewables targets. In the last decade, natural gas has steadily eaten away at coal’s market share of the nation’s electricity production. And the costs of renewables continue to fall dramatically, making them both clean and cost-effective. This trend shows no sign of changing. Some utilities such as Consumers Energy have even explicitly pledged to drop coal altogether.

In that context, how does the administration’s proposed bailout, which would cost billions to keep a few of the nation’s oldest, dirtiest, most expensive coal plants online for another couple of years, make sense? Grid operators and federal regulators have consistently said that near-term retirements pose no threat to grid reliability. Reserve margins—that is, how much extra electricity generating capacity is available beyond expected peak demand levels—are sufficient in most regions of the country. And besides, studies show renewables are diversifying the electricity mix, making the electricity grid more reliable and resilient.

We see that some utilities are making decisions prudent for their long-term planning: smart investments, achievable decarbonization targets, and a mix of energy sources in their portfolios. But a bailout of this nature will send utilities—along with their investors—scrambling to square established goals with the administration’s backward steps.

Even with the temporary crutch, coal has no long-term future, as it continues to be displaced by cheaper and cleaner forms of energy. Instead of propping up a dying industry, our collective interests would be better served by ensuring that the miners and coal plant workers—whose livelihoods will be threatened by this transition—have new opportunities to find good jobs with family-supporting wages. We can and should look to our nation’s utilities and their clean energy targets for a clearer vision forward.

Hurricane Florence: One Week Later Here’s What We Know and Here’s What’s Next

Homes and businesses are surrounded by water flowing out of the Cape Fear River in the eastern part of North Carolina Sept. 17, 2018, in the aftermath of Hurricane Florence. (U.S. Army Photo by Staff Sgt. Mary Junell)

On the morning of September 14, Hurricane Florence made landfall near Wrightsville Beach, North Carolina, bringing with it record storm surge and torrential, historic amounts of rain. A week later, communities across the Carolinas are struggling with the aftermath. At least 42 people have lost their lives thus far. Heavy, lingering rainfall has caused rivers to rise for days after the storm, leading to catastrophic flooding, including in inland areas. Here’s what we know so far and what we can expect in the weeks and months to come.

Six things we know about Hurricane Florence’s impact so far:

1) Hurricane Florence is still a dangerous, unfolding disaster. It’s important that people in the Carolinas continue to pay close attention to warnings from state emergency management agencies and other local authorities and not return to flooded areas until they are deemed safe. Rivers are still rising and new areas can still flood. The National Weather Service Newport/Morehead City warned that the Neuse River will likely not crest until this evening at the earliest. Water contamination is a real health risk, so please follow the CDC’s advice and heed local water advisories if you live in one of the affected areas.

Source: NWS WPC

2) The advance projections for Hurricane Florence proved remarkably accurate and were very helpful in informing emergency preparedness and evacuation efforts. The National Hurricane Center predicted for days that landfall would likely be along the North Carolina coast, and likely the southern part of the coastline. The forecast also made clear that there would be record storm surge, and this was borne out, with tide gauges in Beaufort and Wilmington recording their highest-ever levels.

Most importantly, it was clear that one of the biggest dangers from Florence was that it would bring heavy rainfall and stall over the area for days, significantly raising flooding risks, especially inland—a sad reality that many communities are now living through. Record rainfall was experienced in a number of places, including Elizabethtown, NC, which saw 35.93 inches of rain, and Marion, SC, which saw 34 inches. Rivers are just cresting in many places, including Kinston, Lumberton, and Fayetteville. Several rivers, including the Cape Fear, Pee Dee, and Trent, broke high-water records. (Watch this dramatic and sobering visualization from the USGS of the flooding as Florence moved through North Carolina.)

Soldiers of 252nd Armored Regiment and 230th Brigade Support Battalion shuttle people, children, and their pets across high water on Highway 17 between Wilmington and Bolivia N.C. on September 18, 2018. (Photo by Sgt. Odaliska Almonte, North Carolina National Guard Public Affairs)

3) Despite clear warnings about the risks of inland flooding, many people were still caught unawares or did not have the resources to evacuate, and in some cases that contributed to loss of life. Too many people thought the worst of the storm was over by the day after it hit—when that turned out to be just the beginning of the flooding in many inland areas. Drivers ventured onto roads that looked dry but were quickly overwhelmed by flash flooding. Evacuation is costly and not everyone can afford it. Unfortunately, the data also show that far too few homeowners in the path of Florence—especially in inland areas—were carrying flood insurance, which could leave them struggling financially as they try to rebuild their lives.

4) Hurricane Florence is a very costly storm, likely to rank among the top ten most costly in the United States. Early estimates of the property damage costs of Florence from Moody’s are in the $17 to 22 billion range, although that could rise. AIR Worldwide’s initial estimate of the insured losses just from the wind and storm surge, without accounting for the heavy precipitation, ranges from $1.6 to $4.7 billion. These costs do not include the damage to infrastructure, including major highways (see this stunning drone footage of I-40 turned into a river, for example) or dams; or payouts from the National Flood Insurance Program.

In Robeson County, sections of I-95 at the Lumber River remain under water in the wake of Hurricane Florence. Credit: NC DOT.

5) Early fears about the risks of coal ash ponds leaking toxic waste and hog lagoons flooding and contaminating waterways have become realities. Initial reports show that some of Duke Energy’s coal ash ponds near the H.F. Lee coal plant have been breached, potentially contaminating the Neuse River near Goldsboro. A coal ash pond at the Sutton power plant near Wilmington has also been flooded, potentially contaminating Sutton Lake, a public lake. Meanwhile, NC Department of Environmental Quality data show that over a hundred hog lagoons are either already discharging waste into waterways or are in danger of doing so. The Waterkeeper Alliance is tracking these hog lagoon and coal ash spills.

6) Low-income communities, communities of color, and rural communities have been particularly hard-hit by the flooding from Florence. News reports detail Florence’s impacts on public housing; on livelihoods of hourly wage workers; and on the rural poor. Many communities that were affected by flooding from Hurricane Matthew two years ago have found themselves again in the midst of devastating flooding. In comments earlier this week, Governor Roy Cooper of North Carolina said, “One thing that this storm puts a spotlight on is the issue of affordable housing, which is there even without a storm…” He went on to say “We’re going to approach this rebuilding effort with an emphasis on affordable housing.” Families around the state will be counting on this promise to be fulfilled.

Looking ahead to recovery and rebuilding

Ahead of Hurricane Florence’s landfall, President Trump issued disaster declarations for North Carolina and South Carolina and an emergency declaration for Virginia. This has authorized federal disaster assistance, including coordination from FEMA, to help supplement state, local and tribal emergency response and recovery efforts.

FEMA teams are on the ground in the affected states and a list of resources is available here. The National Guard, the US Coast Guard, the US Army, FEMA teams, NC emergency management personnel, nonprofits and thousands of volunteers are working to help safely evacuate and shelter people. Individual assistance programs for families and public assistance for state, local and tribal entities are available.

A Coast Guard Air Station Clearwater MH-60 Jayhawk aircrew searches for survivors of Hurricane Florence in Elizabeth City, North Carolina, Sept. 18, 2018. Credit: U. S. Coast Guard photograph by Auxiliarist Trey Clifton.

In the weeks to come, Congress will also need to step in with supplemental disaster aid, as has happened with previous major disasters. It will be critical for this aid to flow quickly to the communities that are hardest hit and have the fewest resources to cope. A major concern is that the public housing stock in many locations has taken a hard hit. Affordable housing is already scarce, and this hurricane will make it even more difficult for families looking for a safe place to live. Unfortunately, in past storms a lack of affordable housing has forced some to leave their communities and move far away, or to incur hardships because they had to move to places farther from jobs and schools. The Department of Housing and Urban Development’s (HUD’s) Community Development Block Grant-Disaster Recovery (CDBG-DR) funds are critical to rebuilding resilient and affordable housing where people need it most. Congress must allocate adequate funds to this program.

FEMA funding is also vital for communities’ rebuilding efforts. It’s important to ensure that rebuilding is done in a resilient way that will help protect homeowners and communities from future storms. This is also a good time to make funding for voluntary home buyout programs available so homeowners who live in areas at high risk of flooding can choose that option and move to safer ground. Recovery will take a long time and we can’t lose sight of that reality even after the storm drops out of the headlines. Yesterday marked one year since Hurricane Maria hit Puerto Rico—and it’s clear there is still so much to do to help communities get back on their feet. Some families in North Carolina were still waiting for federal assistance in recovering from Hurricane Matthew when Florence hit.

A resilient future must take account of climate change and equity considerations

It’s unmistakable that climate change is contributing to the risk of more intense hurricanes and worsening flooding. Higher sea levels and increased heavy rainfall exacerbate the risks of catastrophic flooding. The human and economic toll of these extreme events is high. And even in the absence of storms, sea level rise is worsening tidal flooding and is a grave risk to coastal communities.

Meanwhile, as we’ve seen repeatedly with recent hurricanes—Katrina, Harvey, Maria, Irma and now Florence—low income communities and communities of color bear the brunt of the harmful impacts in the wake of disasters.

As communities recover and rebuild from these terrible disasters, we must keep these facts front of mind and ensure that we’re building for a more climate-resilient future for all.

If you would like to support local recovery efforts for Hurricane Florence, please consider these resources assembled by frontline communities on the ground: A Just Florence Recovery. Thank you.


Sea Level Rise: New Interactive Map Shows What’s at Stake in Coastal Congressional Districts

A new interactive map tool from the Union of Concerned Scientists lets you explore the risk sea level rise poses to homes in your congressional district and provides district-specific fact sheets about those risks. Explore the interactive map.

No matter where you live along the coast, chances are that rising seas will begin to reshape your community to one degree or another in the coming decades. Communities that want to be prepared for the changes to come will need representatives in Congress who will advocate for the research, funding, and policies we need to address sea level rise and coastal flooding head-on. As we head into the midterm elections this fall, this tool provides a resource for both visualizing your community’s future as sea level rises and engaging with congressional candidates around the issue of climate change.

In this post you’ll learn how to explore this tool, how to get facts about sea level rise specifically for your congressional district, and how to take action within your community in light of the upcoming elections.

Explore how homes in your congressional district will be affected by sea level rise

The mapping tool is fairly simple. Clicking on any coastal congressional district in the contiguous United States will bring up information on the number of homes at risk of chronic inundation (flooding, on average, every other week) as sea level rises. You’ll also get information about how much those at-risk homes are collectively worth, an estimate of the number of people living in those homes, and their current contribution to the property tax base.

Each district also has an accompanying district-specific fact sheet with statistics and information about chronic inundation risks. You can explore both near-term and long-term projections for a scenario with relatively rapid sea level rise and one with more moderate sea level rise.

While some districts have more homes at risk than others, every coastal district faces some degree of risk. New Jersey’s 2nd District (encompassing roughly the southern quarter of the state, including Atlantic City, Ocean City, and Cape May) is among the most exposed districts in the country, with more than 45,000 homes at risk of chronic inundation within the next 30 years.

Florida’s 26th District, which covers the southernmost tip of Florida and the Florida Keys, is also highly exposed with more than 12,000 homes at risk of chronic inundation by 2045.

More than 45,000 of the existing homes in New Jersey’s 2nd District are at risk of chronic inundation in the next 30 years.

 

In a state with some of the highest overall exposure to sea level rise, Florida’s 26th District stands out as being acutely at risk of chronic inundation.

Fact sheets available for every coastal Congressional district in the lower 48

With more than 10,000 homes at risk of chronic inundation in South Carolina’s 1st District, there’s a lot at stake. Our district-specific fact sheets can help you to assess just how much is at risk in your district as sea level rises.

When you click on any district in the map, the accompanying pop-up window includes a “Learn more” link, which brings you to a two-page fact sheet for that district. Included in each fact sheet is information about the number and value of homes at risk over the near-term (by 2045) and long-term (by 2100). There are also statistics about the percentage of homes that could potentially avoid chronic inundation if we limit future warming to below 2 degrees Celsius and future loss from land-based ice is limited.

The second page of the fact sheet highlights the implications of chronic flooding more broadly, and includes recommended policies for local, state, and federal policymakers.

Candidates running for Congress in coastal districts need to know the risks of rising seas

This tool enables people and policymakers along the coast to better understand when and to what extent sea level rise and coastal flooding will impact their communities. But what we do with that understanding is critical, particularly when it comes to ensuring that coastal congressional candidates fully recognize and acknowledge the risk, and have a plan for addressing it.

Here are four ways you can take this information to the candidates in your district—and ask them what they’re going to do about it:

  1. Reach out to candidates on social media. If you’re on Twitter, tweet at the candidates in your district. Include a key fact or two on rising seas in your district, link to the map or fact sheet, and ask them what their plans are to address the issue. Make sure you include candidates’ Twitter handles in your tweet so that candidates or their staff see it—you can find information about candidates on your ballot, including their Twitter handles, here. (Note that Twitter is including a special election label on each candidate’s official account to help you verify the correct Twitter handle to include.)
  2. If you’re on Facebook, follow candidates’ Facebook pages and comment on posts that can be connected to the risks of rising seas (property development, community protection, etc.), or create your own Facebook post highlighting the risks to your congressional district and share it.
  3. Attend a candidate forum or event. Ask candidates about their plans to address sea level rise and climate change. Cite the facts about homes in your district that are at risk of chronic flooding and ask candidates how they will support your community in efforts to build resilience to flooding. Because this problem will not be limited just to your community, ask about candidates’ plans to advocate for reductions in global warming emissions at the federal level, knowing that nationwide more than 80 percent of the homes at risk of chronic inundation this century could potentially avoid such a fate if we were to rapidly reduce emissions and limit future warming. You can also email candidates directly. The official websites for most candidates include “Contact Us” information, which typically provides an email form, address, or other way to write directly to the candidate.
  4. Write a letter to the editor for your local paper. Candidates monitor local news sources, so writing a letter to the editor (LTE) can be a great way to let them know about the issues that are important to you and your community, including rising seas. And including specific statistics about homes at risk of chronic inundation can help your LTE pack an extra punch and make it more likely to be published. Papers don’t publish every LTE they receive, but your chances are better if you’re writing in response to an article the paper already published. You do have to be quick, though; you don’t want days to go by between the original article and your LTE. But the good news is that LTEs are usually required to be 200 words or less, or 1 to 2 paragraphs, so the writing usually goes pretty quickly. Check with your paper about its specific requirements when it comes to submitting LTEs.

We hope that you find this new tool useful and look forward to hearing how you’re using it!

California Ready to Take Action on Clean Transportation after Climate Summit

With last week’s Global Climate Action Summit in San Francisco all wrapped up, it’s time to get down to the business of turning words into actions.  And next week, California is poised to do just that.  The California Air Resources Board agenda for next Thursday and Friday is chock-full of transformative policies that, if adopted, will accelerate deployment of electric cars and transit buses, increase electric charging and hydrogen refueling infrastructure, bring more low carbon alternatives to diesel and gasoline to the state, and ensure consumers in California and the 12 other states that follow California’s standards continue to have cleaner, more efficient vehicle choices.

Transportation emissions – the pollution from cars, trucks, buses, planes, ships and trains – are proving to be stubborn. They’ve been increasing and becoming a larger portion of economy-wide emissions. They are now over 40 percent of California’s climate pollution. They are stubborn in part because vehicles stay on the road for a long time. So even though standards that bring more efficient gasoline vehicles and EVs to market are very effective, they only apply to new vehicles. And new cars aren’t purchased like cell phones. Cars can last 15 years or more, which means replacing all the cars on the road today with new ones takes time. Looking beyond passenger vehicles is also essential. About 70% of transportation emissions in California are from passenger cars and trucks. The rest come from other types of vehicles and the fuels they burn.

California transportation emissions are more than 40% of the state’s total and are on the rise

Source: California’s Emissions Trends Report 2000-2016

There is no silver bullet policy for transportation, so a combination of coordinated and complementary policies is our best bet. The issues before the California Air Resources Board this month demonstrate this multi-pronged approach in action. Here are three of them:

  1. Extension of the Low Carbon Fuel Standard to 2030

The Low Carbon Fuel Standard requires gasoline and diesel fuel providers to reduce the carbon content of the fuel they sell in California. The current standard requires reducing the carbon intensity by 10 percent by 2020.  The board is set to vote on September 27th to strengthen the standard to require a 20% reduction in carbon intensity by 2030.  What’s the big deal?  This policy isn’t just about blending lower carbon biofuels like ethanol or renewable diesel into petroleum-based fuels. It’s also about expanding cleaner fuel choices like electricity and hydrogen that are needed to power zero emission vehicles.
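For readers unfamiliar with how the standard’s percentages translate into fuel-level numbers, here is a minimal sketch; the baseline carbon intensity below is a round, hypothetical figure rather than the official CARB benchmark, and it is meant only to show the shape of the calculation.

# Illustrative only: converting an LCFS percentage target into a carbon
# intensity (CI) number. The baseline CI here is hypothetical.
baseline_ci = 100.0                        # gCO2e per MJ of fuel energy
target_ci_2020 = baseline_ci * (1 - 0.10)  # 10% cut -> 90 gCO2e/MJ
target_ci_2030 = baseline_ci * (1 - 0.20)  # 20% cut -> 80 gCO2e/MJ

# Fuels with a CI below the target (electricity used for EV charging, for
# example) earn credits; fuels above it create deficits that providers
# must offset by buying credits.
print(target_ci_2020, target_ci_2030)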

The board isn’t just considering raising the bar on this policy, but considering some important changes designed to accelerate deployment of electric vehicle solutions, including:

  • Establish a statewide rebate program for electric vehicles funded by the clean fuel credits earned through vehicle charging. This comes at a critical time when some companies like Tesla and GM are starting to hit the cap on the federal EV tax credit.

  • Support electric vehicle charging and hydrogen fueling station deployment by providing financial incentives to station developers. This will help accelerate investments and help get California on the path to reach Gov. Brown’s goal of 250,000 vehicle chargers and 200 hydrogen stations by 2025.

My colleague Jeremy Martin explains all of this in his recent blog post about how the Low Carbon Fuel Standard is clearing the roadblocks to electric vehicles. But the bottom line is that the Low Carbon Fuel Standard ensures that the fuels powering our transportation system become cleaner over time and, in the process, provides direct incentives for the clean vehicles and fueling infrastructure we need to make it happen.

  2. Requiring electric transit buses

Ever ride on a battery electric transit bus? If you’ve ridden a bus in China, the answer is likely ‘yes.’ They’ve deployed more than 400,000 electric buses over the last few years. Modern battery electric and fuel cell powered buses are starting to gain traction in the U.S., and several transit agencies are making moves to deploy the technology. The Innovative Clean Transit regulation being heard by CARB on September 28th is aimed at accelerating that transition and making every bus in California either hydrogen or electricity powered by 2040. That seems like a long way off, but it means transit agencies need to start buying electric buses now, and before 2030, 100% of their new bus purchases will need to be zero-tailpipe-emission buses. This regulation will ensure that transit agencies in California are all moving forward together and that transit riders around the state get the benefits of a quieter, cleaner bus ride. The communities where these buses operate get the benefit of zero tailpipe emissions. It will also help further advance electric drive in the heavy-duty vehicle sector, paving the way for more electric trucks.

My colleague Jimmy O’Dea covers the finer details in his recent blog post and UCS’s recent Got Science Podcast on electric buses.

  3. Defending California clean car standards from Trump administration attacks

California has its own vehicle standards for cars and trucks, which 12 other states and the District of Columbia follow. California has had vehicle emission standards for decades, bringing huge benefits to the state as well as to other states that follow the same rules. The rest of the country as a whole has also benefited as clean car technology, driven by California’s leadership, has spread nationwide (the catalytic converter comes to mind). The federal clean car standards are currently very similar to California’s standards and, as a result, California has accepted automaker compliance with federal standards as compliance with its own.

The board is proposing a change to California vehicle standards to further clarify that California will only accept compliance with the federal standards as they are currently written. This is not a change in policy. California never signed up to throw its authority to regulate vehicle emissions out the window by accepting compliance with federal standards, whatever they may be. And now that the Trump administration has made its intention to freeze the standards at 2021 levels clear, California is simply clarifying that California standards will indeed be enforced.

Ideally, federal and California standards would remain aligned and continue to push forward on making new cars and trucks cleaner, more efficient and more affordable to drive. But barring an unforeseen change in the Trump administration’s anti-science agenda, that seems unlikely.  Making this regulatory language clarification makes it crystal clear that California intends to exercise its right to protect its residents from car and truck pollution as it always has.

The way forward

As with any change, there is resistance. Oil companies have long attacked the low carbon fuel standard and automakers have resisted vehicle standards for decades. Many transit agencies are cautious about making the shift to electric buses. But make no mistake: these changes are feasible and they are necessary if we are to succeed in preventing the worst consequences of climate change. The proposals before the Air Resources Board are based on extensive analysis, have been thoughtfully developed and deliberated, and should be advanced.

There are over 25 million cars on the road in California – the vast majority of which are filled up with gasoline or diesel. Transitioning to a clean, modern, low-emissions transportation system isn’t going to be easy. There’s just no “one and done” strategy. Each of the items before the board next week is substantial on its own, and taken together they are a big step forward in reshaping California’s transportation system to deliver the clean air and stable climate California needs, while setting an example the rest of the country and the world can benefit from and follow.


Here’s What Agriculture of the Future Looks Like: The Multiple Benefits of Regenerative Agriculture Quantified

Crops and livestock integrated in a regenerative agricultural system. Photo: Farmland LP

At the Union of Concerned Scientists, we have long advocated agricultural systems that are productive and better for the environment, the economy, farmers, farmworkers and eaters than the dominant industrial system. We refer to such a system as our Healthy Farm vision. Based on comprehensive science, we have specified that healthy farm systems must be multifunctional, biodiverse, interconnected and regenerative.

The scientific case for agricultural systems that renew rather than diminish resources is comprehensive, and research demonstrates the productivity and agronomic feasibility of such systems. Yet, economically viable real-world examples are necessary to spur acceptance and adoption of such schemes. Further, we need to overcome the limitations of economic thinking and measures that were developed in the 19th century—when it seemed that the Earth’s resources and its capacity to absorb waste were inexhaustible—and improve them to create more modern assessments, appropriate for the 21st century and beyond. A new report from our colleagues at Farmland LP, Delta Institute and Earth Economics will make a major contribution toward this end.

Healthy Farmland Vision – Click the graphic for an interactive web feature.

Economists view agriculture as a primary sector of the economy, meaning that without the activity of that sector, the remainder of the economy (such as manufacturing and service) could not be developed. Together with other primary economic enterprises such as mining and forestry, agriculture has generally been practiced and acknowledged as an extractive industry. Whereas mining is visibly extractive, agriculture is less so, because degradative processes such as soil erosion, fertility loss, and water and air pollution are not as obvious as mountaintop removal and strip mining. Yet, as practiced industrially, agriculture is both extractive and more extensive than mining.

 

Source: Our World in Data.

Extractive agricultural practices are abetted by strategies such as importing nutrients to compensate for loss of native soil fertility and by the fact that we value the gains from the extraction but don’t discount the losses. For example, we measure crop and animal yield and translate that to sales and profit, but don’t subtract from the ledger the soil, nutrients, air and water quality lost to produce crops and livestock. One superficial reason for this is that we don’t know the “cost” of those resources, but that is simply a polite way to say that historically we don’t value them. This is a perfect example of the nostrum that we measure what we care about and care about what we measure.

Yet, agriculture need not be inherently extractive. Through practices that build soil, recycle nutrients and store water, it can become a regenerative system while still providing abundant food and other agricultural products. A key to shifting from extractive to regenerative mode is building a more complete picture of the total benefits and costs associated with agricultural management. For nearly a decade, the investment firm Farmland LP has been managing thousands of acres with regenerative techniques, thereby providing an opportunity for scientists and economists to assess the value of these practices to soil, water, climate, energy and social sectors. The Delta Institute and Earth Economics, with grant support from the Department of Agriculture’s Natural Resources Conservation Service, worked with Farmland LP on just such a project.

Based on a comprehensive review of scientific literature examining the value of various ecosystem services, the researchers applied the rigorous methodologies of Ecosystem Services Valuation and Greenhouse Gas Accounting to assess the effects of farm management on items such as soil formation and quality, water capture and quality, pollination and seed dispersal, climate stability, disaster risk reduction, air quality and biological control. Using Colorado State University’s COMET-Farm model and the USDA’s Revised Universal Soil Loss Equation (RUSLE), the researchers evaluated the effect of regenerative techniques on farmed and non-farmed land under Farmland LP’s management. They compared these model outputs with those from land managed conventionally to construct a comprehensive impact balance sheet.
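For readers unfamiliar with it, the Revised Universal Soil Loss Equation estimates average annual soil loss per unit area as the product of a handful of site factors:

A = R × K × LS × C × P

where A is the predicted soil loss, R is rainfall-runoff erosivity, K is soil erodibility, LS captures slope length and steepness, C is the cover-management factor, and P is the support-practice factor. Roughly speaking, practices like the cover crops, perennial plantings, and buffers described in this post reduce the C and P factors, which is how a RUSLE-based analysis would credit them with reduced erosion.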

The sums cited in this report are astounding, ascending into the millions of dollars of added ecological value from regenerative processes—against millions of dollars of ecological losses due to standard industrial practices. The practices Farmland LP implements are well known, backed by science and practice, and accessible to all farmers and farm managers with an interest in managing whole systems to increase returns to management. Examples include integrated crop and livestock production, crop rotation, biodiverse annual and perennial mixes, stream buffers, grassed waterways, organic fertilizers, biological pest control and uncultivated land to provide ecological services (erosion control, water capture, habitat and refugia for beneficial organisms). The combination of these regenerative methods generated net value while industrial methods destroyed value—all while performing comparably on the dominant indicator of agricultural yield.

Ecological Service Value of farmed and non-farmed areas by impact metric – Delta Institute (see report for methods, context and further data.)

This assessment affirms the concrete value and effectiveness of multifunctional regenerative approaches. Since many of these ecosystem services are not currently quantified—much less traded—on markets that would remunerate farmers, the benefits are primarily experienced by way of a cleaner environment, lower costs of production and added value of agricultural land. This is because land managed with regenerative practices will produce bountifully, at lower cost and for an indeterminate period of time, whereas the value of industrially managed land depends on false and brittle economies, such as access to government subsidies and the availability of cheap industrial fertilizer.

In fact, the main business of Farmland LP, a real estate investment trust, is to add long-term value to agricultural land for landowners and investors. A remarkable aspect of this strategy and business model, in addition to more faithfully reflecting actual ecological economics, is how quickly Farmland LP management has been able to produce results. In addition to demonstrating the effectiveness of regenerative methods, these findings indicate the kinds of practices that should be more broadly adopted across all of agriculture to assure our livelihood at present and far into the future.

The skilled agronomists and farm managers at Farmland LP, together with the rigorous scientists and economists who have developed and used the ecosystem evaluation technique, are demonstrating that regenerative agriculture is not an aspirational figment. It is real, it is possible, it is productive, it is profitable and it is environmentally beneficial. These things can all exist with one another. A successful business model is predicated on this. As long as reliable scientific information influences decisions and behavior, this report provides a beacon toward more viable, ethical and realistic agricultural practice for the long term.


The Department of Interior Does Not Care What You Think About Endangered Species

Photo: NCinDC/CC BY-ND 2.0 (Flickr)

The Department of Interior simultaneously announced three deeply flawed proposals that would radically transform how the Endangered Species Act functions, and gave the public just 60 days to provide feedback. Yesterday, without providing any reasoning, the department denied a request from UCS to extend the comment period. That means you have six more days to file a comment (Rule 1, Rule 2, Rule 3). This guide from UCS can help you craft an effective comment on one or all of these rules.

The way the government uses science to manage endangered species like the greater sage grouse may change significantly for the worse under a series of new proposals. Photo: USFWS

Public comments are more than a symbolic exercise. As part of the official record, they can be used by organizations that wish to challenge the government's interpretation of a law in court. They are sometimes used by members of Congress conducting oversight of the work of federal agencies. And they can guide future administrations in how they carry out environmental and public health laws.

In a letter requesting a comment period extension, I noted the following:

These proposals could profoundly change the implementation of the Endangered Species Act and the public, including the scientific community, needs sufficient time to better evaluate the impacts of the proposed rule in conjunction with the other two administrative proposals to provide comprehensive and meaningful feedback on it…

Given the critical and comprehensive nature of this proposal, the current timeframe is wholly inadequate and will not allow for thorough public input on these proposed rules and their impact on FWS’s ability to fulfill its mission to conserve, protect and enhance fish, wildlife and plants and their habitats for the continuing benefit of the American people. 

When the EPA tried a similar stunt with its proposal to restrict the use of public health science in its work, the agency ultimately agreed to extend the public comment period by more than two months. By this measure, then, the Department of Interior is even worse than Scott Pruitt’s EPA.

Still, we must work with the limited democratic tools we have left. Earlier this year, 1,500 scientists signed a letter that helped inform public understanding of the threats that Congress and the Trump administration pose to endangered species. Keep that momentum going by submitting a public comment by September 24, 2018. You can also sign on to a more general comment developed by UCS before the deadline.

Photo: NCinDC/CC BY-ND 2.0 (Flickr)

Mass. Gas Explosions: What Can We Do About Home Fossil Fuels?

The calls and texts from my kids’ school started coming in at 5:11 p.m. last Thursday: “Evacuate campus buildings immediately.” Some of the messages included mention of a gas leak. The northern Massachusetts headlines about gas leaks, fires, and explosions were scary, and this was my own family potentially in harm’s way.

After events like that, it’s easy to imagine wanting to be done with fossil fuels. Not just because of their climate change, broader environmental, or public health impacts, but also because of the problems, even rare ones, that can arise from having those fuels right where we live.

But where might that fossil fuel reduction plan happen on the home front? Here are a few ideas.

Getting beyond fossils

Photo: Pixabay/Magnascan

Even with all the thinking I do about moving away from fossil fuels and toward clean energy, those messages from school brought an immediacy to the need for transition that I hadn’t felt before. When my family got past the emergency stage, I found that last week’s events were prompting me to rethink the role that fossil fuels play in my own life, knocking out of me the remnants of my that’s-the-way-it-is-because-that’s-the-way-it’s-always-been mindset.

The first part of addressing our home fossils is understanding where they are (in fuel form in this case—not, say, as plastic plates, polyester quick-dry clothing, or vinyl siding). The second part is understanding the options for dealing with them.

In my house, that first part comes down to space heating, water heating, cooking, and transportation. No small list. For the second part, though, the catalogue of options is definitely up to the challenge.

Here’s a look at the opportunities in each of those areas, and where my own journey stands.

Cutting fossil use in space heating

Our winters are cold, and in New England, space heating accounts for 60% of household energy use. While natural gas is the fuel for more than half, homes in these parts also use fuel oil in a big way; in Massachusetts, heating oil accounts for close to 30%.

Moving from oil to gas cuts down on carbon pollution, using high-efficiency gas furnaces and boilers takes it to the next level, and insulating homes better can cut down on any fuel. But none of those results in ditching on-site fossils altogether.

Fortunately, heat pumps do. The two options are ground-source/geothermal, which take advantage of the constant temperature underground, and air-source, which miraculously harvest heat from even really cold air (and have gotten good in recent years at handling frigid northern temperatures). And they both do it with electricity as the only input.



I haven’t gotten to that stage yet. After becoming a homeowner a while back, I upgraded my heating equipment to the highest-efficiency units I could find. But heat pumps weren’t really on my radar screen. So there’s room for progress there.

Cutting fossil use in water heating

The next big category for fossil use in our houses is water heating; it accounts for 16% of home energy use in Massachusetts. Efficiency is an opportunity here, too, but again, only a partial solution if it’s a natural gas- or oil-fired unit.

The options for fossil freedom lie in electricity and the sun. For our home, I put in a solar water heating system, with a backup to boost it as needed. That booster is gas-fired, but could have been electric.

Another option is electric with, again, heat pumps to the rescue. Like their space-heating brethren, heat-pump water heaters draw heat from their surroundings. In this case, that adds up to water getting heated two to three times as efficiently as it would with a conventional electric (resistance) water heater.
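To put that two-to-three-times figure in rough numbers, here's a minimal sketch (in Python) comparing annual electricity use for a conventional resistance water heater and a heat-pump unit. The hot-water demand, efficiency values, and electricity price below are illustrative assumptions, not figures from this post:

# Rough comparison of electric resistance vs. heat-pump water heating.
# All inputs are illustrative assumptions, not data from this post.

HOT_WATER_KWH_PER_YEAR = 3000   # assumed annual heat delivered to the water, kWh (thermal)
RESISTANCE_EFFICIENCY = 0.95    # assumed: nearly all electricity becomes heat
HEAT_PUMP_COP = 2.5             # assumed coefficient of performance (the 2-3x range)
PRICE_PER_KWH = 0.22            # assumed electricity price, $/kWh

resistance_kwh = HOT_WATER_KWH_PER_YEAR / RESISTANCE_EFFICIENCY
heat_pump_kwh = HOT_WATER_KWH_PER_YEAR / HEAT_PUMP_COP

print(f"Resistance heater: {resistance_kwh:,.0f} kWh, ${resistance_kwh * PRICE_PER_KWH:,.0f}/yr")
print(f"Heat-pump heater:  {heat_pump_kwh:,.0f} kWh, ${heat_pump_kwh * PRICE_PER_KWH:,.0f}/yr")

With those assumed numbers, the heat-pump unit uses roughly 60% less electricity to deliver the same hot water.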

My solar heat as of Monday. No sense wasting all that sunshine.

Cutting fossil use in cooking

Most of the time these days, the choice for frying eggs or roasting potatoes is between gas and electricity. Gas devotees like its responsiveness (though electrics may often actually have the edge in performance).

When we updated our old kitchen a few years back, we switched from gas to electric, but not a standard one. Instead of a resistance (glowing coil) kind, we went with an induction cooktop—electric, efficient, and really responsive (and able to boil water in no time flat).

Cutting fossil use in transportation

Our tour of the house in search of fossil fuels shouldn't ignore the garage, and gasoline. The obvious solution is electric vehicles, and it's an option that's so much more real than it was when I drove EVs back in the 1990s. Add in walking, biking, and electric buses, and you're cruising without carbon (onsite).

My ride, when I’m not on my bike or a train, is efficient (a 2001 first-gen hybrid, still, at 196,000 miles, getting 45 miles to the gallon), but still gas-powered. My wife’s, though, is pure electric—not a gas gallon in sight.

Home fossil use beyond the home

There’s actually one more entry for this list: electricity. This might seem like an odd thing to discuss when we’re talking about fossil fuels in the home, but let’s face it: A lot of the approaches above involve switching to a plug, and we’re not necessarily interested in just exporting our fossil problem with the out-of-sight-out-of-mind approach.

Fortunately, there are options here, too. One is to make your own fossil-free power, particularly with solar electric (photovoltaic) panels. If you've got the wherewithal and the roof, for example, you can look into putting up a solar array (and maybe even adding batteries). Or you can see about joining with neighbors in a community solar system.

A more broadly available fossil fix is to buy green power. If your utility gives you the option, you can choose a fossil-free mix in place of whatever default electric mix might otherwise supply you. Or you can buy an equivalent amount of renewable energy credits (RECs) to green up your power supply.

We went solar two years ago, and generate enough to cover all of our home electricity use and a portion of the car. For the rest of our usage, my utility had been offering a REC option, and I’d been a loyal customer till that program went away; I’ll be looking to find a successor option.

Continue the journey

Home sweet lower-fossil-fuel home (Photo: J. Rogers)

Not all of these opportunities are available to all of us (think renters, for example). Money, too, is a consideration, and not all the options above are cheap (though some can actually save you money). But times like these call for do-what-you-can and beyond-the-wallet thinking—not least because of the costly investments someone will have to make, after events like these, to improve safety even without nixing the fossil fuels.

Last Thursday, many of us in my area were lucky. My boys and their schoolmates evacuated and waited it out in a field. We picked up our kids, and adopted for the night a couple of extra kids who were more affected by the gas fires. Our town is supplied by a different gas network than the affected communities, and we have a different power company, so we didn't lose power when service got cut off elsewhere.

But fossil fuels are certainly a part of our lives as much as they were for those who got hit by last week’s events: We have natural gas in our home, gasoline in our garage, and neighbors who heat with oil. So the journey continues.

When it comes to fossil fuel use in our homes, we've got options, and plenty of reasons to exercise them. Fossil fuels' days are numbered. Accelerating that phase-out is in our hands… and looking better all the time.

What’s for Dinner? A Preview of the People, Process, and Politics Updating Federal Dietary Guidelines

Photo: grobery/CC BY SA 2.0 (Flickr)

Months behind schedule, two federal departments have officially kicked off the process for writing the 2020-2025 iteration of the Dietary Guidelines for Americans. Updated and reissued every five years, these guidelines are the nation’s most comprehensive and authoritative set of nutrition recommendations. And although the process is meant to be science-based and support population health—and has historically done so, with some notable exceptions—there are plenty of reasons to believe that the Trump administration is preparing to pitch a few curveballs.

First, a little background: The two agencies responsible for issuing the guidelines are the US Department of Agriculture (USDA) and Department of Health and Human Services (HHS). Earlier this month, the agencies released a call for nominations to the advisory committee that will review current nutrition science and write recommendations for the new guidelines. For the first time, the guidelines will include recommendations for maternal nutrition and for infants and toddlers through 24 months—meaning we may see a larger advisory committee and some extra work put into developing these recommendations from scratch.

And that won’t be the only change since the last cycle. There was a bitter political battle over the 2015-2020 Dietary Guidelines, in which the advisory committee made mention of environmental sustainability, noting that plant-based diets that include plenty of foods like fruits, vegetables, and whole grains are good for both our health and the future of our food supply. These recommendations were ultimately omitted, and the episode culminated in Congress writing new legislation to limit the scope of the guidelines and mandate a so-called critical review of their scientific integrity. The full impact of this anti-science legislation, which was tacked onto a 2016 appropriations bill (despite strong opposition from public health and nutrition groups), will be brought to bear during the coming months.

All that said, there’s one thing that’s likely to remain the same: the industries that wielded influence over the 2015-2020 Guidelines haven’t gone anywhere. On the contrary, they may be emboldened by an administration that has repeatedly given preference to corporate interests, sidelining science and sacrificing the public good in the process.

The People: What will become of the Scientific Advisory Committee in the Trump era?

Typically, the first major step in developing new Dietary Guidelines is to identify the group of nutrition and health experts who will form the Dietary Guidelines Advisory Committee (or DGAC). These nominees will be well-known in their fields, and will bring with them more than a decade each of experience as medical or nutrition researchers, academics, and practitioners. Members of the DGAC serve on the committee for two years, after which they submit a final scientific report to the USDA and HHS with their recommendations.

This part of the process is happening in real-time. The 30-day call for nominations is now open and will close on October 6. (Read more about the criteria for nominees here.)

Photo: USDA

But the negligence the Trump administration has shown in maintaining existing scientific advisory committees is concerning, to say the least. An analysis by my colleagues here at the Union of Concerned Scientists shows that, during the administration’s first year in office, federal science advisory committees met less frequently than in any other year since 1997, when the government began tracking this data. A majority of the committees are meeting less than their charters require, and committee membership has also decreased—with some agencies disbanding entire advisory committees altogether.

Furthermore, what happens after the public submits nominations to the DGAC happens largely behind closed doors. Nominations will be reviewed by USDA and HHS program staff, and the slate of chosen nominees will be evaluated and vetted internally. Formal recommendations for the committee will then be reviewed and approved by the USDA and HHS secretaries. Per their most recent communication, the agencies hope to announce the 2020-2025 DGAC by early next year.

If you’re thinking that the committee selection lacks a certain element of transparency, you’re not the only one.

In one of two reports released last year examining the Dietary Guidelines process (the result of the aforementioned legislation, passed as a 2016 appropriations rider), the National Academy of Medicine recommended that the public have the opportunity to review the provisional committee for bias and conflicts of interest before it's approved.

It’s worth repeating that the selection of committees in recent DGA cycles has successfully brought a wealth of knowledge and expertise to the process—resulting, for the most part, in strong evidence-based recommendations. But in an administration where the “D” in USDA has come to stand for DowDuPont, concerns about undue influence on the committee selection may be well warranted. (See “The Politics” below.)

The Process: More to do, and twice as fast

After the advisory committee is appointed, the committee begins to review the current body of nutritional science to generate its recommendations. The recommendations are based on a “preponderance of scientific evidence,” which means they consider a variety of research and study designs. (Though randomized controlled trials are typically the gold standard in science, this type of study is incredibly difficult to do with diet.)

The committee won’t review everything—there are certain topics that are selected each cycle, based on what new evidence has emerged and what issues are of greatest concern to public health. And here’s the first place you’ll see the 2020-2025 DGAs break from tradition: rather than identifying topics of interest after the committee is selected, USDA and HHS have developed a list of topics first, soliciting public comments in the process. You can read their list here.

There are immediate glaring absences in the topic list, including fruits, vegetables, and whole grains—some of the staples of what we consider a healthy diet. This may just mean that the committee won’t be revisiting these topics, and will instead default to existing recommendations—but the lack of clarity here is disconcerting. A brief note at the end of the topic list, perhaps meant to explain the omissions, has left public health and nutrition groups scratching their heads: “Some topics are not included above because they are addressed in existing evidence-based Federal guidance. In an effort to avoid duplication with other Federal efforts, it is expected that these topics will be reflected in the 2020-2025 Dietary Guidelines by referencing the existing guidance. Thus, these topics do not require a review of the evidence by the 2020 Dietary Guidelines Advisory Committee.”

Photo: USDA

Meanwhile, the topics that have been explicitly named include added sugars; beverages, such as dairy, sugar-sweetened beverages, and alcohol; the relationship between certain diets (think: Mediterranean Diet, vegetarian, etc.) and chronic disease; and different dietary patterns across life stages, including infancy and toddlers through 24 months. What didn’t make the cut? A mention of red meat or processed meats—which have been linked to certain types of cancer and other health risks. The agencies (predictably) sidestepped this issue, making reference only to types of dietary fats.

If this sounds like a lot to sort through, it will be. And the tentative timeline that the agencies have proposed is ambitious. After the committee is announced in early 2019, it will have just over one year to deliberate before releasing its scientific report. During that time, the committee will hold approximately five public meetings (last cycle, there were seven) and offer an extended period of open public comment. After the DGAC scientific report is released, the public will also have one final opportunity to comment.

But if there’s anything we learned from the last DGA cycle, it’s that what can happen during that gap—between the release of the DGAC scientific report and the issuance of the DGAs—is critical, and it isn’t always clear. Enter “The Politics.”

The Politics: When money talks

What happened during the 2015-2020 DGA cycle?

The DGAC advisory report, submitted in February 2015, included recommendations for plant-based diets that supported both human health and environmental sustainability—an unprecedented move. Per the report: “A diet higher in plant-based foods, such as vegetables, fruits, whole grains, legumes, nuts, and seeds, and lower in calories and animal-based foods is more health promoting and is associated with less environmental impact than is the current U.S. diet.”

But eight months later, the writing was on the proverbial wall, in the form of a blog written by former USDA Secretary Vilsack and HHS Secretary Burwell: sustainability, they wrote, was outside the scope of the DGAs and would not be included.

Two months after that, the 2016 appropriations bill was passed, stating that any revisions to the Dietary Guidelines for Americans be limited in scope to nutritional and dietary information.

By all appearances, the key concern seemed to be that science-based sustainability recommendations were outside the scope of the DGAs. But you don’t have to read too far between the lines to see that many were more concerned about sales—as in, sales of foods that aren’t central to a plant-based diet. Like, for example, meat and dairy.

At a Congressional hearing on the matter, Rep. Mike Conaway, current chair of the House Agriculture Committee, put it this way: “[the inclusion of sustainability] could result in misguided recommendations that could have ill effects on consumer habits and agricultural production.”

Rep. Glenn Thompson, current chair of the House Agriculture Subcommittee on Nutrition, put a finer point on his interests: “What can we do to remove policies that hinder milk consumption, and to promote policies that could enhance milk consumption?”

It’s hardly a stretch to imagine that what happened during the 2015-2020 DGA cycle—and to the advisory committee’s recommendations that were seemingly lost in translation—was a direct product of industry influence.

And though efforts to communicate the science behind more sustainable, plant-based diets have been all but stymied, there is still plenty at stake for industry groups in the 2020-2025 DGA cycle. Expect to see some of the usual suspects make an appearance, including the meat industry, dairy industry, and sugar-sweetened beverage associations, as well as formula companies, which will have vested interest in shaping the new recommendations for infants and toddlers. (This may be happening in real-time, too. Just this spring, Gerber announced it would join its parent company, Nestle, at its headquarters in Rosslyn, Virginia—just a stone’s throw from the capitol.)

As this process unfolds, the Union of Concerned Scientists will be there—watchdogging and waiting. Stay tuned to learn more about how you can help us stand up for science and make the 2020-2025 Dietary Guidelines for Americans the strongest, most health-promoting edition yet.

Photo: grobery/CC BY SA 2.0 (Flickr)

Even More Than 100% Clean: California’s Audacious Net-Zero Carbon Challenge

Governor Brown signing SB-100 into law. Photo: Governor's Office

At the end of a summer that was marked by dramatically destructive natural disasters, including massive fires throughout the entire western U.S., killer heat waves, fires, and floods in Asia and Europe, and now Hurricane Florence landing on the Carolinas, California is offering a ray of hope for a planet that is facing increasingly terrible impacts from global warming.  Governor Jerry Brown has convened an international climate summit in San Francisco that demonstrates the huge number of jurisdictions both nationally and from around the world, in addition to businesses and industries, religious groups, climate justice advocates, and a lot of scientists, among many others, who are working hard for climate action.

Brown began the week by demonstrating that California is not resting on its impressive climate action laurels but significantly increasing its commitment to reducing emissions. I was lucky enough to attend the ceremony where Governor Brown signed SB 100, a bill that had taken its legislative author, State Senator Kevin de León (whose legislative tenure has been distinguished by successfully championing historic clean energy and climate action), a grueling two years to pass the state legislature.  SB 100 commits California to 60% renewable energy by 2030 (up from the current 50% requirement) and a goal of fully 100% clean electricity by 2045.  While we have much work to do to achieve this goal, we are now committed to a path toward a fully decarbonized electricity system.  I was proud to represent UCS’s incredible staff who made uniquely valuable contributions to this coalition effort to get the bill passed.

In a remarkable piece by NPR, UCS’s  energy analyst Laura Wisland talked about the many challenges that remain for California to achieve this goal, but it is clearly something we can do.  Thanks to over 15 years of previous renewable electricity and energy efficiency policies, electricity emissions now account for a relatively low 16% of California’s greenhouse gas (GHG) inventory. But the last emissions reductions will be the hardest to achieve. We will need to grapple with how to lower the vast amount of natural gas used to generate electricity and make room for cleaner, carbon-free sources of energy.  We also have a big challenge ahead to grapple with transportation and industrial emissions to meet ambitious state 2030  GHG reduction limits.  But in the last twelve years California has shown that it can succeed- ahead of schedule- in meeting carbon reduction goals while growing the state economy from the eighth to the fifth largest in the world, a feat that belies alarmists who say that reducing emissions will damage the  economy.

A surprise announcement of huge ambition

Coming at the end of Governor Brown’s remarkable 8-year tenure, the bill signing ceremony this week contained a surprise. With no fanfare or previous signaling of his intentions, Governor Brown included an additional action in this week’s bill signing – a new executive order, B-55-18, that creates an economy-wide carbon neutrality goal for California by 2045.  He is directing the state to strive for net zero carbon emissions in less than 30 years. This must be done using a combination of zero-emission technologies to power the electric grid, transportation, homes, buildings and industries,  with other practices and technologies that sequester carbon, or take it out of the atmosphere.

This will require an extraordinary effort that will affect every Californian. The state will not only have to meet its ambitious new 100% clean electricity goal on top of its very ambitious 2030 statewide GHG reduction limit, but also virtually halt nearly all remaining emissions in the mere 15 years that follow. Let's take a moment to appreciate that goal.

For California to achieve net zero carbon emissions, it will require a staggering change to some of the basic elements driving our economy. California will need to eliminate the single biggest tranche of carbon emissions: those from the vehicles that transport people and goods throughout the state. This means electric vehicles powered by our clean electric grid for nearly everyone. It will require carbon-free fuels for industrial processes, hugely advanced efficiency in buildings and appliances, and likely the development of truly reliable ways of storing carbon in plants, soils, and geologic formations. As yet, no one has developed a real plan for how we could get to net zero emissions, and doing so successfully is a very tall order. David Roberts at Vox wrote this early analysis of the ins and outs of what could happen, including a warning that implementing net zero could include measures and policies that are more symbolic than substantive.

But here’s the thing. Once the Governor of the nation’s most populous state, a serious and credible leader with a remarkably successful eight-year tenure, has made carbon neutrality the goal, a lot of people may start to take it seriously enough to figure out how to meet it. What Brown has done is to challenge us to think about what it would really take to get carbon emissions down fast enough and thoroughly enough to reduce the risks we are seeing multiply so rapidly, and help ensure a viable future, within three decades. It is a huge undertaking.

A vision for the future, both audacious and necessary

A few cynics may argue that this is a non-binding announcement to help the Governor’s visibility for his San Francisco Global Climate Action Summit that is happening this week, but I believe that this order could have tremendous value. Executive Orders in California are part of state policy, even if they do not have the force of law. And whatever mix of reasons Brown had for doing this, he is sending a strong science-based message to the world –namely, that we need to reduce global warming pollution much further and faster than we previously thought to avoid the worst impacts of climate change. It will be up to the next governor and the legislature to carry out Brown’s order. Luckily, California has good examples of turning Executive Orders into law.

So here is my unsolicited advice to California’s next Governor and the Legislature starting in 2019: 1) do the hard work of ensuring we get to 100% clean energy by 2045; 2) start now on fully implementing new regulations and laws that will rapidly take the carbon out of our transportation and industrial sectors, to ensure we meet our 2030 goal of reducing GHGs to 40% below 1990 levels; 3) get our best minds in science and technology to work together to produce an economically sound blueprint that would get us to net zero by 2045; and 4) start to implement the next generation of policies that would get us to net zero.

California has shown that a world-class economy can reduce its carbon emissions rapidly while continuing to grow. While we can’t guarantee that further decarbonization will go as smoothly, the ongoing, accelerating, costly, and deadly destabilization of our climate is much too urgent a matter to approach timidly. We owe a great deal to the leadership of people like State Senator Kevin de León and Governor Jerry Brown, who have shown what is possible. Neither will be serving as a leader in Sacramento after this year. It is now up to us to build on their bold leadership, ensure that we succeed in making their visions real, and help provide examples and lessons to the nation and the world on how it can be done.

The EPA Can’t Stop Polluters When the Trump Administration Cuts Enforcement Staff

The primary task of the US Environmental Protection Agency is to protect public health and the environment. To do so, the agency must ensure that everyone, whether in the private sector or in government, complies with our nation’s laws and regulations. These safeguards are in place to protect health and safety for everyone, anywhere in the country. Their enforcement is also a matter of fairness—all entities that might adversely impact our health and environment are supposed to follow the rules. So it is particularly disturbing that the EPA Office of Enforcement and Compliance Assurance (OECA) has taken a major hit in staffing over the past 19 months of the Trump administration.

Here at UCS, we filed a Freedom of Information Act request to help us identify changes in the number of EPA staff working on enforcement and compliance. It took a while to get the answer, but the overall results are even worse than we suspected: In EPA headquarters, at least 73 OECA staff left the office and only 4 were hired between the start of the Trump administration and late July 2018. Among those 73 departures were 17 environmental protection specialists and at least 10 scientists or engineers. The scant hires include Assistant Administrator Susan Bodine, a former lobbyist, and Deputy Assistant Administrator Patrick Traylor, a lawyer who previously defended the Koch brothers, among other industry clients.

EPA has also lost further enforcement staff (not included in the OECA list) at the regional level. Region 5, for example, lost five employees in their enforcement support section, including three investigators, while Region 7 lost several employees in its enforcement coordination office.

Those departure numbers are BIG. They mean that many fewer people are out there ensuring that pollution is monitored and that polluters are living up to their responsibilities under the law. In addition to reductions in staff focused on pollution prevention, it also means reductions in staff who work on environmental cleanup, such as at Superfund sites.

There is also a critically low number of criminal investigators working for the EPA. Even though the law requires the agency to have at least 200 “special agents,” there are only 140 on staff, according to Public Employees for Environmental Responsibility.

This goes along with big reductions in EPA staff across all offices, as reported in the Washington Post this past weekend. According to the Post, at least 1,600 staff have left EPA since January 2017, with fewer than 400 new employees hired. UCS’s own data shows that at least 670 of the losses have been at the 10 regional offices (with just 73 hires to offset them). Notably, when EPA has been hiring, it generally hasn’t been hiring scientists. According to a December 2017 New York Times piece, the administrator’s office was the only unit with more hires than departures that year, adding 73 new employees against only 53 departures.

My colleague Kathleen Rest and I warned about the dangers of such staff attrition at the beginning of the year. In both those articles, there were warnings from former EPA staff, aligning with our own government experience, that cutting off new hiring sends all the wrong signals to young professionals about opportunities to spend part of their careers in public service—while also threatening the capacity of our federal agencies to address current and future risks. From the statistics we obtained, even the number of student trainees in the regional offices has been slashed, with only five hired (all in Region 5, the upper Midwest) but 48 lost from the other regional offices.

Our survey of federal scientists confirmed that morale is desperately low, and that many offices no longer feel they have the staff to do their jobs.

These staff reductions fly in the face of Congressional action that appropriated funds for the EPA to maintain these programs. Indeed, it seems that around the country, the Trump Administration has gone ahead and made cutbacks not only in enforcement, but also in areas such as the Chesapeake Bay Program, the Great Lakes Program, the Gulf of Mexico program, and others—despite the fact that Congress provided the funding for them.

What does all this mean? It means the Environmental Protection Agency has taken a step backwards on protecting health and the environment for workers, communities, families—for you. New agency directions call for changing enforcement priorities and for dropping a special focus on oil and gas extraction and concentrated animal feeding operations, because the agency says issues with these industries have been largely resolved. Seriously. That will be news to neighbors of drilling, pipelines, and animal waste disposal sites.

And the leadership of the EPA wants to turn away from enforcement overall, toward encouraging compliance through voluntary measures and “compliance assistance.” Pardon our skepticism here.

And so it goes with what seems to be an ongoing industry takeover of our premier public health agency. First they roll back regulations, then they roll back enforcement so there are fewer consequences for those who put the public’s health at risk, and then they reduce the professional staff so new rules can’t be put back in place.

It is time to stand up and say STOP. Enough is enough. We need the EPA. And that means we need it to be a vibrant, well-staffed, professional agency—not a political punching bag or a piñata of goodies for regulated industries.

 

 

Photo: EPA

Mr President, More Than 3,000 Deaths is Not an “Incredible, Unsung Success”

Photo: Juan Declet-Barreto

Last year, I thought throwing rolls of paper towels at victims of Hurricane María in Puerto Rico was the lowest that President Trump could go in disrespecting and failing the people of Puerto Rico in the midst of the climatic catastrophe that was personal to me and my family on the island.

But this morning he went even lower with his tweets denying the death toll from Hurricane María in Puerto Rico, adding insult to injury to an enormous disaster exacerbated by a failure to prepare and to help the island recover. The day before that, in his characteristic self-congratulatory tone, he touted his administration’s handling of Hurricane María as an “incredible, unsung success.” The dead don’t sing, Mr. President. The aftermath of the storm left Puerto Rico without power for months, unleashed a humanitarian crisis for more than 3 million US citizens, and was responsible for more than 3,000 deaths.

Once again, Mr. Trump has shown callous disregard for human life, minimizing the toll of human suffering during and after the hurricane. The President’s falsehoods are rebutted by scientific reports that present evidence that lack of electricity to power hospitals, medical equipment, and refrigerate insulin, combined with a collapsed public health system and inadequate protocols to ascribe deaths to post-hurricane conditions, contributed to the estimates that more than 3,000 Puerto Ricans lost their lives because of Hurricane María (I previously reported on that here).

Besides being an affront to human dignity, the President’s statements and tweets also show a callous disregard for the truth and the importance of accurate information in our democracy. Although this administration has attacked or ignored science since day one, this is one of the most extreme examples—we can’t expect to solve problems if our leaders choose to deny facts and attack the evidence.

Sobering fact: Just two days ago, a whopping nine simultaneous tropical storms were shown in a satellite image composite. As my colleague Kristy Dahl recently reported, we already know that the expected storm surge from storms like Hurricane Florence will be amplified because of sea-level rise, that our atmosphere can hold more moisture, and that the potential for extreme rainfall during hurricanes is increased. It is also expected that coastal, rural, and low-income communities in the Carolinas and Virginia will be among the hardest hit by Florence.

I watch with worry as Florence continues to barrel toward the US Southeastern coast and Typhoon Mangkhut in the Pacific threatens 10 million people in the Philippines. I wonder how the President’s tweeted falsehoods will impact our federal agencies’ capacity to respond to what is likely a disastrous situation for millions of Americans. María taught us many things, among them that when the President uses Twitter to minimize disasters, ignore science, and disparage people in harm’s way, it effectively lowers the urgency with which federal agencies respond to those disasters.

As a new hurricane season threatens the US, we need to be able to trust that the government is willing and able to acknowledge the facts and act to protect us. President Trump’s offensive and false tweets today undermine public trust.

 

Juan Declet-Barreto

The Price of Large-Scale Solar Keeps Dropping

Photo: NREL

The latest annual report on large-scale solar in the U.S. shows that prices continue to drop. Solar keeps becoming more irresistible.

The report, from Lawrence Berkeley National Laboratory (LBNL) and the US Department of Energy’s Solar Energy Technologies Office, is the sixth annual release about the progress of “utility-scale” solar. For these purposes, they generally define “utility-scale” as at least 5 megawatts (three orders of magnitude larger than a typical residential rooftop solar system). And “solar” means mostly photovoltaic (PV), not concentrating solar power (CSP), since PV is where most of the action is these days.

Here’s what the spread of large-scale solar looks like:

Source: Bolinger and Seel, LBNL, 2018

In all, 33 states had solar in the 5-MW-and-up range in 2017—four more than had it at the end of 2016. [For a cool look at how that map has changed over time, 2010 to 2017, check out this LBNL graphic on PV additions.]

Watch for falling prices

Fueling—and being fueled by—that growth are the reductions in costs for large-scale projects. Here’s a look at power purchase agreements (PPAs), long-term agreements for selling/buying power from particular projects, over the last dozen years:

Source: Bolinger and Seel, LBNL, 2018

And here’s a zoom-in on the last few years, broken out by region:

Source: Bolinger and Seel, LBNL, 2018

While those graphs show single, “levelized” prices, PPAs are long-term agreements, and what happens over the terms of the agreements is worth considering. One of the great things about solar and other fuel-free electricity options is that developers can have a really good long-term perspective on future costs: no fuel = no fuel-induced cost variability. That means they can offer steady prices out as far as the customer eye can see.

And, says LBNL, solar developers have indeed done that:

Roughly two-thirds of the contracts in the PPA sample feature pricing that does not escalate in nominal dollars over the life of the contract—which means that pricing actually declines over time in real dollar terms.

Imagine that: cheaper over time. Trying that with a natural gas power plant would be a good way to end up on the losing side of the contract—or to never get the project financed in the first place.
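To see why a flat nominal price amounts to a declining real price, here's a minimal Python sketch; the contract price, term, and inflation rate are assumptions for illustration, not values from the LBNL sample:

# A flat nominal PPA price, deflated to real (inflation-adjusted) dollars.
# Contract price, term, and inflation rate are assumed for illustration.

PPA_PRICE_NOMINAL = 30.0   # $/MWh, held flat for the contract term (assumed)
TERM_YEARS = 20            # assumed contract length
INFLATION = 0.02           # assumed annual inflation rate

for year in (1, 5, 10, 15, TERM_YEARS):
    real_price = PPA_PRICE_NOMINAL / (1 + INFLATION) ** (year - 1)
    print(f"Year {year:>2}: ${real_price:5.2f}/MWh in year-1 dollars")

With 2% inflation, a price that never escalates in nominal terms is worth about a third less, in real dollars, by year 20.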

Here’s what that fuel-free solar steadiness can get you over time, in real terms:

Source: Bolinger and Seel, LBNL, 2018

What’s behind the PPA prices

So where might those PPA price trends be coming from? Here are some of the factors to consider:

Equipment costs. Solar equipment costs less than it used to—a lot less. PPAs are expressed in cost per unit of electricity (dollars per megawatt-hour, or MWh, say), but solar panels are sold based on cost per unit of capacity ($ per watt). And that particular measure for project prices as a whole also shows impressive progress. Prices dropped 15% just from 2016 to 2017, and were down 60% from 2010 levels.

Source: Bolinger and Seel, LBNL, 2018
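Since project prices are quoted in dollars per watt while PPAs are quoted in dollars per megawatt-hour, here's a back-of-the-envelope Python sketch of how one maps to the other. The installed cost, capacity factor, and lifetime are assumed, and the calculation ignores financing, O&M, degradation, and tax credits—so this is not the LBNL methodology, just a rough illustration:

# Back-of-the-envelope conversion from installed $/W to a rough lifetime $/MWh.
# Ignores financing, O&M, degradation, and tax credits; all inputs are assumptions.

INSTALLED_COST_PER_WATT = 2.0   # assumed installed project cost, $/W
CAPACITY_FACTOR = 0.27          # assumed average capacity factor
LIFETIME_YEARS = 30             # assumed project lifetime
HOURS_PER_YEAR = 8760

lifetime_mwh_per_watt = CAPACITY_FACTOR * HOURS_PER_YEAR * LIFETIME_YEARS / 1e6
cost_per_mwh = INSTALLED_COST_PER_WATT / lifetime_mwh_per_watt

print(f"Lifetime output: {lifetime_mwh_per_watt:.3f} MWh per watt of capacity")
print(f"Rough capital-only cost: ${cost_per_mwh:,.0f}/MWh")

Under those assumptions, every 10% drop in the installed $/W cost translates directly into a 10% drop in the capital portion of the per-MWh price.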

The federal investment tax credit (30%) is a factor in how cheap solar is, and has helped propel the incredible increases in scale that have helped bring down costs. But since that ITC has been in the picture over that whole period, it’s not directly a factor in the price drop.
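As a rough illustration of how the credit flows into project economics—the project cost below is assumed, and financing details are ignored—here's what a 30% ITC, and the 10% floor it eventually drops to for large-scale projects, does to net capital cost:

# How an investment tax credit reduces the net capital cost of a project.
# Project cost is an assumption for illustration only.

PROJECT_COST = 100_000_000   # assumed gross cost of a utility-scale project, $

for itc_rate in (0.30, 0.10):   # current credit vs. the eventual floor for large projects
    net_cost = PROJECT_COST * (1 - itc_rate)
    print(f"ITC at {itc_rate:.0%}: net capital cost = ${net_cost:,.0f}")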

Project economies of scale. Bigger projects should be cheaper, right? Surprisingly, LBNL’s analysis suggests that, even if projects are getting larger (which isn’t clear from the data), economies of scale aren’t a big factor, once you get above a certain size. Permitting and other challenges at the larger scale, they suggest, “may outweigh any benefits from economies of scale in terms of the effect on the PPA price.”

Solar resource. Having more of the solar happen in sunnier places would explain the price drop—more sun means more electrons per solar panel—but sunnier climes are not where large-scale solar’s growth has taken it. While a lot of the growth has been in California and the Southwest, LBNL says, “large-scale PV projects have been increasingly deployed in less-sunny areas as well.” In fact:

In 2017, for the first time in the history of the U.S. market, the rest of the country (outside of California and the Southwest) accounted for the lion’s share—70%—of all new utility-scale PV capacity additions.

The Southeast, though late to the solar party, has embraced it in a big way, and accounted for 40% of new large-scale solar in 2017. Texas solar was another 17%.

But Idaho and Oregon were also notable, and Michigan was one of the four new states (along with Mississippi, Missouri, and Oklahoma) in the large-scale solar club. (And, as a former resident of the great state of Michigan, I can attest that the skies aren’t always blue there—even if it actually has more solar potential than you might think.)

Capacity factors. More sun isn’t the only way to get more electrons. Projects these days are increasingly likely to use solar trackers, which let the panels tilt to face the sun directly over the course of the day; 80% of the new capacity in 2017 used tracking, says LBNL. Thanks to those trackers, capacity factors have remained steady in recent years even as growth has moved into less-sunny locales.
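For a concrete sense of what capacity factor means, here's a minimal Python sketch that computes it from annual generation and nameplate capacity; the generation figures for the fixed-tilt and tracking cases are assumed for illustration, not taken from the report:

# Capacity factor = actual energy delivered / energy if the plant ran at
# full nameplate capacity all year. Generation figures below are assumed.

NAMEPLATE_MW = 100
HOURS_PER_YEAR = 8760

annual_generation_mwh = {
    "fixed-tilt (assumed)": 210_000,            # MWh/yr
    "single-axis tracking (assumed)": 250_000,  # MWh/yr
}

for layout, mwh in annual_generation_mwh.items():
    cf = mwh / (NAMEPLATE_MW * HOURS_PER_YEAR)
    print(f"{layout}: capacity factor = {cf:.0%}")

The point of a tracker, in these terms, is to raise the numerator—more energy from the same nameplate capacity—which is how projects in less-sunny places can keep fleet-wide capacity factors steady.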

What to watch for

This report looks at large-scale solar’s progress through the early part of 2018. But here are a few things to consider as we travel through the rest of 2018, and beyond:

  • The Trump solar tariffs, which could be expected to raise costs for solar developers, wouldn’t have kicked in in time to show up in this analysis (though anticipation of presidential action did stir things up even before the tariff hammer came down). Whether that signal will clearly show in later data will depend on how much solar product got into the U.S. ahead of the tariffs. Some changes in China’s solar policies are likely to depress panel prices, too.
  • The wholesale value of large-scale solar declines as more solar comes online in a given region (a lot of solar in the middle of the day means each MWh isn’t worth as much). That’s mostly an issue only in California at this point, but something to watch as other states get up to high levels of solar penetration.
  • The investment tax credit, because of a 2015 extension and some favorable IRS guidance, will be available to most projects that get installed by 2023 (even with a scheduled phase-down). Even then it’ll drop down to 10% for large-scale projects, not go away completely.
  • Then there’s energy storage. While the new report doesn’t focus on the solar+storage approach, that second graphic above handily points out the contracts that include batteries. And the authors note that adding batteries doesn’t knock things completely out of whack (“The incremental cost of storage does not seem prohibitive.”).

And, if my math is correct, having 33 states with large-scale solar leaves 17 without. So another thing to watch is who’s next, and where else growth will happen.

Many of the missing states are in the Great Plains, where the wind resource means customers have another fabulous renewable energy option to draw on. But solar makes a great complement to wind. And the wind-related tax credit is phasing out more quickly than the solar ITC, meaning the relative economics will shift in solar’s favor.

Meanwhile, play around with the visualizations connected with the new release (available at the bottom of the report’s landing page), on solar capacity, generation, prices, and more, and revel in solar’s progress.

Large-scale solar is an increasingly important piece of how we’re decarbonizing our economy, and the information in this new report is a solid testament to that piece of the clean energy revolution.

Photo: NREL
