UCS Blog - The Equation (text only)

A Toxic Nomination Hangs in the Balance: Who Will Stand Up to Finally Topple Michael Dourson?

Photo: www.epw.senate.gov

Three weeks ago, North Carolina’s Republican senators, Richard Burr and Thom Tillis, announced their opposition to the nomination of Michael Dourson to run the office of chemical safety in the Environmental Protection Agency. Only one more vote is needed to doom his nomination, assuming unified opposition from all 48 Democrats and Independents.

The question is, who will have the courage to step forward next?

It should take no courage at all, if science and public health matter. Dourson is already in the EPA, serving as an adviser to Administrator Scott Pruitt. But, given Dourson’s outrageous record of working to undermine science-based standards for toxic chemicals on behalf of the chemical industry, he is clearly unfit to lead the office overseeing chemical safety at the federal level.

Belittling the health effects of dangerous chemicals

Dourson’s private research firm has represented companies such as Dow, Monsanto, and PPG Industries, and has had some research funded by Koch Industries.

Michael Dourson helped set a state safety standard for the chemical PFOA 2,000 times less strict than the level deemed safe by the EPA. Photo: pfoaprojectny

He and his firm have routinely judged chemicals to be safe at levels hundreds of times greater than the current standards issued by the EPA. Among those chemicals whose health effects he has tried to belittle is perfluorooctanoic acid (PFOA), which is used in the manufacture of nonstick cookware such as Teflon and stain-resistant household products such as carpets. Dourson helped the state of West Virginia set a safety standard for the chemical 2,000 times less strict than the level deemed safe by the EPA.

That decision alone threatens the health of many Americans. In 2012, research by scientists at Emory University found that workers at a West Virginia DuPont PFOA plant were roughly three times as likely to die from mesothelioma or chronic kidney disease as other DuPont workers, and faced similarly elevated risks for kidney cancer and other non-cancer kidney diseases. A more recent study, published in the International Journal of Hygiene and Environmental Health, linked reductions in PFOA exposure across the country to a sharp decline in pregnancy-related problems, including low-birth-weight babies.

In North Carolina, Gen X, an as-yet-unregulated chemical meant to replace PFOA as a nonstick agent, has already been found at significant levels in the Cape Fear River. And the state is still reeling from nearly 1 million people having been exposed to drinking water at Camp Lejeune that was contaminated with chemicals such as benzene, vinyl chloride, and trichloroethylene (TCE) from the 1950s through the 1980s. The Obama administration established a $2.2 billion disability compensation program for Camp Lejeune veterans suffering from cancer.

Serious concerns from North Carolina

Especially troubling: if confirmed, Dourson would be responsible for oversight of the 2016 Toxic Substances Control Act. In its final months, the Obama administration selected the first 10 chemicals to be reviewed under the new act for their “potential for high hazard.” Of those 10, Dourson has claimed in his research that several are safe at levels far exceeding the science-based standards currently established by the EPA. They include solvents linked to cancer such as 1,4-dioxane, 1-bromopropane, and TCE, the last of which was among the chemicals found in the contaminated water at Camp Lejeune.

The Senate Environment and Public Works Committee advanced Dourson’s nomination to the full Senate in late October on a party-line 11-10 vote. But the nominee’s record was too troubling for Burr and Tillis, despite the fact that both voted to confirm EPA Administrator Scott Pruitt. Burr said of Dourson in a statement, “With his record and our state’s history of contamination at Camp Lejeune as well as the current Gen X water issues in Wilmington, I am not confident he is the best choice for our country.”

Tillis’s office seconded that with a statement saying, “Senator Tillis still has serious concerns about his record and cannot support his nomination.”

Issues of great importance in Maine

In the immediate aftermath of Burr’s and Tillis’s rejection of Dourson, it seemed that Maine Senator Susan Collins might quickly follow suit. She said, “I certainly share the concerns that have been raised by Senator Burr and Senator Tillis. I think it’s safe to say that I am leaning against him.”

Collins has said nothing since then. Her office did not respond to repeated requests this week from the Union of Concerned Scientists on her latest position. And Dourson’s nomination stands in limbo, presumably as the Republican leadership worries that they may not have the votes in the full Senate to confirm him. In theory, Collins’ concerns should mirror those of Burr and Tillis because Maine has dealt with its share of water and soil pollution at military bases such as the former Brunswick Naval Station and Loring Air Force Base, both Superfund sites. She has also been active in bipartisan efforts to deal with cross-state air pollution.

Collins was the only Republican to vote against Pruitt’s nomination to run the EPA. Pruitt, who repeatedly sued the EPA on behalf of industry as attorney general of Oklahoma, is aggressively attempting to relax chemical regulations and reverse Obama-era rules such as the Clean Power Plan. Under him, the EPA has moved to drop products containing PFOA from being studied for their lasting impact on the environment and has refused to ban the pesticide chlorpyrifos, which has been linked to damage to the developing brains of fetuses and young children.

When she announced her opposition to Pruitt, Collins said, “I have significant concerns that Mr. Pruitt has actively opposed and sued EPA on numerous issues that are of great importance to the state of Maine, including mercury controls for coal-fired power plants and efforts to reduce cross-state air pollution and greenhouse gas emissions. His actions leave me with considerable doubts about whether his vision for the EPA is consistent with the Agency’s critical mission to protect human health and the environment.”

If Collins truly maintains those concerns, she surely would not want to augment the problems of Pruitt’s already disgraceful tenure by supporting Dourson. But even if she for some reason shies away from a no vote, there are many other Republican senators whose states also have military installations with rampant pollution affecting adjacent communities.

Many more Republican senators should be unnerved

With Camp Lejeune as a haunting example of military pollution of its own soldiers and adjacent communities, the US armed forces are in the midst of investigating potential water contamination at nearly 400 such active and shuttered sites. That fact should unnerve many more Republicans, even those who generally support Pruitt’s actions. According to a Politico report three weeks ago, Senators Jeff Flake and John McCain of Arizona, Pat Toomey of Pennsylvania, and Bob Corker of Tennessee were noncommittal about supporting Dourson’s nomination.

Toomey’s office released a statement also reported by the Bucks County Courier Times saying he “remains concerned about the PFOA issue” in towns next to closed military bases in the Philadelphia area, where compounds from firefighting foams may have leached into drinking water sources. Elevated levels of pancreatic cancer have been found in the area.

With so much concern about elevated levels of cancer around the nation linked to water pollution, this is not the time to put someone in charge who made a career out of downplaying the risks of chemicals. It is bad enough that Dourson is already at EPA, advising Pruitt. But that remains a long way from actually having his hand on the pen that can help sign away people’s safety.

He should never hold that pen.

Concerned? 

Call your senator today and ask him or her to oppose the confirmation of Michael Dourson!

The Future of Solar is in the President’s Hands. It *Should* Be an Easy Call

Installing solar panels in PA Photo: used with permission from publicsource.org

The saga of the would-be solar tariffs that just about nobody wants is continuing, and I can’t help but be struck by the disconnect between some of the possible outcomes and the administration’s purported interest in rational energy development for America. If President Trump believes what he says, deciding not to impose major tariffs shouldn’t be a tough decision.

Here’s the thing: in March 2017, the president issued an executive order about “undue burdens on energy development,” which said (emphasis added) that it was:

…in the national interest to promote clean and safe development of our Nation’s vast energy resources, while at the same time avoiding regulatory burdens that unnecessarily *encumber* energy production, *constrain* economic growth, and *prevent* job creation.

Solar’s future: Progress or pain? It’s his call.

Encumbering, constraining, preventing. Remember those verbs as we go through some of the key facts of this case.

The players

The trade case, brought by two US solar panel manufacturers that are on the rocks, or whose foreign parents are, involves a little-used (and failure-prone) provision of US trade law. And it has met with almost universal rejection, from a whole host of industry, political, security, and conservative and really conservative voices (Sean Hannity, anyone?).

Even the US International Trade Commission (USITC) tasked with making recommendations in response to the petition couldn’t agree, with the four commissioners coming up with three different proposals.

As we said at the time, on the one hand it was good that the USITC recommendations weren’t as drastic as what the petitioners had asked for. On the other hand, anything that slows down our solar progress is bad news for America.

The (pre-Trump) progress

Solar has been on an incredible trajectory for years now, producing energy, cutting pollution, increasing energy security, and helping homes and businesses. The first nine months of 2017, for example, saw solar producing 47% more electricity than in the same period of 2016, with the biggest gains among the top 10 states for solar generation being in Georgia, Texas, and Utah.

Solar has also been an incredible job-creating machine. Some 260,000 people worked in the solar industry by the end of 2016, almost 2.5 times 2011’s solar job count. One in every 50 new American jobs last year was created by the solar industry. And those have been in different pieces of the industry—R&D, manufacturing, sales, project development, finance, installation—and all across the country.

The problem and presaging

Credit: J. Rogers

Some of those gains have taken place during the Trump presidency, and maybe he can rationalize taking credit for them by pointing out that he at least didn’t stop those good things from happening.

That benign neglect may be about to change, though, and we’re already seeing the effects of the uncertainty that the president’s rhetoric around issues of solar and trade has created.

The trade case has continued. While not part of the specified process for this type of proceeding, the White House invited the public to submit comments to the US trade representative, and recently held a public hearing.

The next deadline is January 26, the end of the period for President Trump to make up his mind about the USITC recommendations—accepting one of the sets of proposals, doing something else, or rejecting the idea of tariffs and quotas.

In the meantime, the effects are already hitting: Utility-scale solar costs had dropped below $1 per watt for the first time in history earlier this year. Now those costs have climbed back above that mark as developers have scrambled to get their hands on modules ahead of whatever’s coming.

Large-scale solar projects are faltering (as in Texas) because of the inability of developers and customers to absorb the risk of substantially higher solar costs. That’s investment in projects on American soil, on hold.

But those setbacks could be just a taste of what’s to come.

The point: Encumbering, constraining, preventing

That brings us back to the March executive order, which boldly professed an intention to do away with burdens holding back US industry, and was decidedly anti-interventionist (in the regulatory sense).

And yet here we are, a few short months later, talking about doing that exact thing—messing with the market, and going against our national interests. Encumbering energy production by driving up the costs of the cells and modules that have powered so much growth. Constraining economic growth by making it harder for American homes and businesses and utilities to say yes to solar. Preventing job creation—even causing job losses—by shrinking the market for what our nation’s vibrant solar industry has been offering so successfully.

Credit: J. Rogers

The pain

While provisions in the tax bill being worked out in Congress would do no good for renewables, the president’s actions could have much more direct impacts on American pricing and competitiveness. A lot of smart people are pointing out that any bump-up in US solar module manufacturing jobs will be way more than offset by job losses elsewhere in the industry, including elsewhere in solar manufacturing.

If the president chooses to ignore the many voices clamoring for rational policy on this, if he chooses—and remember he alone can fix this—to impose major tariffs or quotas, he’s going to own their impacts.

Every net American job lost because of higher module prices will have his name on it.

Every US solar panel manufacturer that doesn’t magically take off behind his wall of protectionism will be evidence of the misguidedness of his approach.

Every small or large US solar project cancelled—jobs, investments, and all—because of the speedbumps, roadblocks, and hairpin turns on his energy vision-to-nowhere will be a Trump-branded monument to his lack of foresight and unwillingness to accept the changing realities of energy, innovation, and ingenuity.

The path

The solar industry, though, has offered President Trump a way out. They’ve proposed an import licensing fee approach that would support expanded US manufacturing while letting solar continue to soar (all else being equal).

That’s fortunate for the president, and for just about all of the rest of us. Because if he’s truly about unencumbering energy production, about removing constraints to economic growth, and stopping the prevention of job creation, killing American solar jobs would be a funny way to show it.


New Transmission Projects Will Unleash Midwestern Wind Power—And Save Billions

As we look ahead to our clean energy future, a key piece of the puzzle is building the transmission system that will carry utility-scale renewable energy from where it’s generated to where it’s consumed. A recent study from the Mid-Continent Independent System Operator (MISO) shows that, when done right, transmission projects integrated with renewable energy can pay huge dividends. They decarbonize our electricity supply, improve efficiency, and lower costs to the tune of billions of dollars in benefits to electricity customers.

A long journey to get it right

Transmission projects can cost-effectively accelerate our clean energy transition. But they must be done right, with proper planning, stakeholder engagement, and diligent analytics.

Ensuring that long-term investments in our transmission system provide benefits to customers is a lengthy process. In 2003, MISO—which operates the electricity transmission system and wholesale electricity markets across much of the central US—began to explore a regional planning process that would complement the local planning and activities of the utilities, states, and other stakeholders operating in its territory.

After several years of scoping, planning, analysis, and legal wrangling, a set of 17 “multi-value” transmission projects (MVPs) were approved in 2011 based on their projected ability to (1) provide benefits in excess of costs, (2) improve system reliability, and (3) provide access to renewable energy to help meet state renewable energy standards.

Even six-plus years after approval, most of these projects are still under construction, since transmission projects typically take several years to move through approval, permitting, siting, and construction. But even as these projects are being developed, MISO has continued to evaluate them against the most recent information available—making sure that they are still expected to deliver the benefits originally projected.

The most recent review, fortunately, shows that they are truly living up to their “multi-value” moniker. And like a fine wine, they seem to be getting better with time.

Latest review shows benefits increasing compared to original projections

Overall, the latest review shows a benefit-to-cost ratio ranging from 2.2 to 3.4—meaning these projects are expected to deliver economic benefits on the order of $2.20 to $3.40 for every dollar of cost. This is an increase over the original projection of 1.8 to 3.0. The latest cost/benefit analysis translates to total net economic benefits of between $12.1 billion and $52.6 billion over the next 20 to 40 years. The figure below shows how the multiple values projected from these projects add up.

The chart above shows the categories – and projected value – of benefits (columns one through six) that MISO considers in identifying and approving projects. When stacked up, the total benefits range from $22.1 billion to $74.8 billion. When total costs are also considered, net benefits (the last column on the right) to the MISO system and the customers that rely on it drop to between $12.1 billion and $52.6 billion. Source: MISO

As shown in the figure, the bulk of economic benefits flowing from the MVPs are from relieving congestion and saving on fuel costs (shown in column 1). These are typically characterized as increasing “market efficiency” by opening up wholesale electricity markets to more robust competition and spreading the benefits of low-cost generation throughout the region—essentially allowing cheap energy to flow where there’s demand. Because renewable energy has zero fuel cost, enabling more of it onto the grid allows the overall system to operate more cheaply. These savings ultimately flow to ratepayers that are typically on the hook for fuel costs incurred by their utility.

And the amount of wind energy that is being brought onto the system because of these MVPs is significant. This latest review by MISO estimates that the portfolio of projects, once completed, will enable nearly 53 million megawatt-hours of renewable energy to access the system through 2031. To put that in perspective, a typical home uses about 10 megawatt-hours per year. So that’s enough energy to power 100,000 households for more than 50 years!
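
If you want to check that back-of-the-envelope math yourself, here is a minimal sketch in Python. It uses only the rounded figures quoted above (53 million megawatt-hours enabled through 2031, and roughly 10 megawatt-hours per household per year), not values taken directly from the MISO review, so treat it as an illustration rather than an official calculation.

# Rough check of the "100,000 households for more than 50 years" claim,
# using the rounded figures quoted in this post (not the underlying MISO data).
wind_energy_mwh = 53_000_000       # renewable energy enabled by the MVP portfolio, in MWh
household_use_mwh_per_year = 10    # approximate annual electricity use of a typical US home, in MWh

household_years = wind_energy_mwh / household_use_mwh_per_year
print(f"{household_years:,.0f} household-years of electricity")           # 5,300,000
print(f"about {household_years / 100_000:.0f} years for 100,000 homes")   # about 53 years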

A lot more than just electricity

When put together, the combination of well-thought-out transmission investments and renewable energy development in the Midwest also provides a host of additional social benefits, including:

  • Enhancing the diversity of resources supplying electricity to the system
  • Improving the robustness of the transmission system that decreases the likelihood of blackouts
  • Increasing the geographic diversity of wind resources, thereby improving average wind output to the system at any given time
  • Supporting the creation of thousands of jobs and billions of dollars in local investment
  • Reducing carbon emissions by 13 to 21 million tons annually

Let’s think about this for one second more…

Through proper planning, stakeholder engagement, and diligent analytics, here in the Midwest we are building a portfolio of transmission projects that will significantly lower carbon emissions, enable billions of dollars in investment and thousands of new jobs, make our electricity supply more reliable, and provide billions in economic benefits to ratepayers.

Maybe we should think about it for one more second. Or maybe we should start thinking about what’s next?


The Penn State Science Policy Society: Filling the Gap Between Science and Community

Graduate school. It’s where generations of scientists have been trained to become independent researchers. More than 60 hours per week spent in lab, countless group meetings, and innumerable hours crunching data and writing manuscripts and proposals filled with scientific jargon.

Unfortunately, it’s this jargon that prevents scientists from effectively communicating their science to the non-technical audiences that need it. Penn State’s Science Policy Society aims to bridge this gap by helping current graduate students and post-doctoral fellows learn how to bring their research into the community.

We occupy an important niche at Penn State as we continue to educate members of the Penn State community about the connection between our research and public policy, with a dedicated focus on science advocacy. We are helping our future scientists translate their stories and make connections with community members and policy makers.

Identifying a gap between science and community

Penn State researcher Dr. Michael Mann discussing the science behind climate change at Liberty Craft House in downtown State College.

Early on, we recognized a growing disconnect between the local State College community and the groundbreaking research occurring at Penn State. A desire to act on it grew within the Science Policy Society: our members wanted to help our fellow community members, but we didn’t have the skills or the relationships within the community. We began to plan events to address this problem, looking to others who have fostered strong community ties as guides.

We began our relationship with the Union of Concerned Scientists (UCS) in March 2016 when Liz Schmitt and Dr. Jeremy Richardson came to Penn State to discuss UCS’s efforts to promote science-community partnerships. In May 2016, SPS members traveled to Washington D.C. to meet with UCS staff for science advocacy training. With the help of UCS, we have been able to begin to build our own community relationships. We started with Science on Tap, a monthly public outreach event designed to showcase Penn State science in a casual downtown bar setting. By having leaders in science-community partnerships to guide us, we have been able to begin our own journey into outreach.

Science & Community: A panel event

While our Science on Tap events were successful, we still felt there was a gnawing gap between Penn State science and our local community. The local news was filled with science-related issues in State College and the surrounding central Pennsylvania region, but it wasn’t obvious how science was being used to help decision makers. We recognized an urgent need to learn how other scientists use their science to help, or even become, activists who fight for their local communities.

The Science Policy Society panel discussion on Science & Community. From left to right: Dr. David Hughes, Dr. Maggie Douglas, and Dr. Thomas Beatty.

On September 14, 2017, the Science Policy Society partnered with the Union of Concerned Scientists to organize an event called “Science & Community.” Taking place at the Schlow Centre Region Library, the event was a panel discussion focused on how scientists and community activists can work together. The event featured three Penn State researchers: Dr. Maggie Douglas and Dr. David Hughes from the Department of Entomology, and Dr. Thomas Beatty from the Department of Astronomy and Astrophysics. Dr. Douglas works closely with local beekeepers and farmers to promote pollinator success, while Dr. Hughes is a leading member of the Nittany Valley Water Coalition, an organization that aims to protect the water of State College and the farmland it flows under. Dr. Beatty is a member of Fair Districts PA and speaks across central Pennsylvania about gerrymandering.

All three of these scientists saw problems in their community and decided to take action. Even more remarkable, most of these issues are outside their areas of scientific expertise. Astronomers typically aren’t trained in political science, but that did not stop Dr. Thomas Beatty from applying his statistical toolset to impartial voter redistricting. Same with Drs. Hughes and Douglas, who took their expertise into the community to help farmers and beekeepers protect their livelihoods.

Lessons learned

Easily the most important lesson that we learned from this Science & Community panel event was how hard it is for scientists to move into the local community and begin these conversations and partnerships. There was an overwhelming sense that the majority of the scientists in attendance did not feel comfortable using their scientific expertise to engage on local community issues. The reasons were numerous, but seemed to focus on (1) not knowing how to translate their science so that it is useful for non-specialists and (2) not having enough room in their schedule.

Moving forward, the Science Policy Society is aiming to address these concerns as we work toward filling the gap between Penn State science and the surrounding communities. For example, we will be hosting science communication workshops to train scientists on how to strip jargon from their stories of scientific discovery. Additionally, a panel event currently being planned for Spring 2018 aims to discuss how science and religion are not mutually exclusive, and will show how scientists can work with religious organizations and leaders to promote evidence-based decision-making.

Graduate students looking to help their communities are not given the tools they need to do so. Hours spent in lab and at conferences talking only in scientific jargon leave many unable to talk about their science with the general public. The Science Policy Society is filling this need by providing an outlet for scientists to learn communication and advocacy skills and to begin building relationships with community members and policy makers. With help from scientists and science outreach professionals, we are fostering science and community partnerships in State College and throughout central Pennsylvania.

 

Jared Mondschein is a Ph.D. candidate in the Department of Chemistry at Pennsylvania State University, where he studies materials that convert sunlight into fuels and value-added chemical feedstocks. He was born and raised near New York City and earned a B.S. in chemistry from Union College in 2014. You can find him on Twitter @JSMondschein.

Theresa Kucinski is a Ph.D. candidate in the Department of Chemistry at Pennsylvania State University, where she studies atmospheric chemistry. She was born and raised in northern New Jersey, earning her A.S. in chemistry from Sussex County Community College in 2014 and her B.A. in chemistry from Drew University in 2016.

Grayson Doucette is a Ph.D. Candidate in the Department of Materials Science and Engineering at Pennsylvania State University. He was born into a military family, growing up in a new part of the globe every few years. He earned his B.S. in Materials Science and Engineering at Virginia Tech in 2014, continuing on to Penn State’s graduate program. At PSU, his research has focused on photovoltaic materials capable of pairing with current solar technologies to improve overall solar cell efficiency. You can find him on Twitter @GS_Doucette.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

 

Why You Shouldn’t Feel Bad About Recycling Old Appliances

Let’s face it: Deep down inside you, or maybe much closer to the surface, you’ve been wanting a new refrigerator, dishwasher, washer, or dryer. You’ve had your eye on that sweet little white/black/stainless beauty of a machine, and you’ve seen the holiday sales (pick a holiday, any holiday) come and go, with their “Save $200!… Free delivery!… Act now!” enticements… And yet you’ve stayed on the sidelines.

If what’s been holding you back is concern about what happens to old appliances, landfills and all, I’ve got great news for you: chances are good that you’re better off upgrading. Energy efficiency progress means you can save plenty of money, and it means all of us are better off too, because your upgrade also cuts emissions, even when you take the bigger picture into account.

New appliances make financial sense

It should be really clear that new appliances can save you a bunch of money by saving energy (and more). Federal efficiency standards for fridges that took effect in 2014 meant electricity savings of 20-25% for most models, and units qualified under the ENERGY STAR program offer at least another 9% in savings.

For washing machines, ENERGY STAR-rated ones use 25% less energy and 45% less water than their conventional brethren, which means less money spent on both energy and water. Upgrading from a standard washing machine that’s 10 years old can actually save you more than $200 a year.
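
Curious how a number like that could come together? Here is a purely illustrative sketch in Python. The per-load costs below are round-number assumptions of mine (roughly combining electricity, hot-water heating, and water/sewer charges), not ENERGY STAR figures, so the point is the shape of the calculation rather than the exact dollars.

# Illustrative only: the per-load running costs are assumptions, not ENERGY STAR data.
loads_per_year = 300          # a commonly used figure for a typical household's annual laundry loads
old_cost_per_load = 0.95      # assumed total running cost of a ~10-year-old washer, $ per load
new_cost_per_load = 0.25      # assumed total running cost of a new ENERGY STAR washer, $ per load

annual_savings = loads_per_year * (old_cost_per_load - new_cost_per_load)
print(f"Roughly ${annual_savings:.0f} saved per year under these assumptions")  # ~$210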

New appliances make environmental sense, too

So that’s the financial side of things. And we both know that’s important.

But we also both know that you’re about much more than that. You’re thinking about how that dishwasher doesn’t just magically appear, about how the old one doesn’t just vanish. You’re thinking about the implications from each stage of its life. So what about the carbon emissions, you say.

Thinking about what goes into producing and disposing of something makes a lot of sense, as long as you’re thinking about what goes into operating that same something during that long period between production and disposal (the life of the product).

And it makes even more sense to use data to help that thinking. (You’re a Union of Concerned Scientists type of person, after all; you just can’t help it.)

Fortunately, we’ve got that. Cooler Smarter, UCS’s book on where the carbon emissions come from in our lives—which of our consumer decisions have the most impact on how much CO2 we emit—has just the data you need. (In the appendices; we didn’t want to scare off other people.)

And what Cooler Smarter’s data tables show is that the emissions associated with producing and disposing of a range of appliances add up to less than the emissions associated with their use. A lot less actually: Using them can take 10-25 times as much energy as getting them there and getting rid of ‘em.
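
To see what that ratio means in practice, here is a minimal sketch in Python that simply converts the 10-to-25-times figure quoted above into the share of lifetime energy (and, to a first approximation, emissions) that comes from using an appliance rather than making and disposing of it. The exact split varies by appliance; this just restates the ratio.

# Convert the "use takes 10-25 times as much energy as production and disposal"
# range from Cooler Smarter into use's share of lifetime energy.
for ratio in (10, 25):
    use_share = ratio / (ratio + 1)
    print(f"use:embodied = {ratio}:1 -> use is about {use_share:.0%} of lifetime energy")
# Prints roughly 91% and 96%, which is why efficiency in use dominates the equation.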

Getting Cooler Smarter about where the emissions come from, with data. Turns out that “Use Emissions” are usually the big piece. (Source: Cooler Smarter)

What that means is that if you can upgrade an appliance to one that’s more efficient, and particularly if your existing helper is more than a few years old, it’s probably really worth it not just from a financial perspective, but also in terms of carbon pollution.

That same principle, by the way, holds true for other energy users around your house: think lighting, for instance, where CFLs (compact fluorescent lights) or even newer LEDs (light-emitting diodes) in place of incandescent light bulbs can really quickly save you a bundle and pay back the emissions that went into making them. Or think vehicles, where recent years’ efficiency gains have been really impressive.

As it says in Cooler Smarter:

When there are highly efficient options for appliances, equipment, and vehicles, for instance, it almost always makes sense to junk energy hogs in favor of the most efficient models you can afford.

Four decades of progress in a box: Bigger fridges, more features, a lot less energy. (Source: ACEEE)

Old appliances can be reborn

For the disposal piece of the carbon equation, one key to making the math work for an appliance’s afterlife is to dispose of it the right way. While photos of piles of old appliances might be eye-catching—and disheartening—your old faithful dishwasher, washing machine, dryer, or fridge doesn’t have to suffer that ignominious end.

In fact, it’s a whole lot better if it doesn’t, and there are lots of ways to make it so. ENERGY STAR has a useful set of webpages on recycling old appliances: refrigerators, clothes washers, other appliances, and more. It suggests, for example, that recycling can be done through the store you’re buying the new appliance from, through your local utility, through your city or town, or via a scrap dealer.

How your old fridge gets new life (with the help of a Hammond B3 organ soundtrack) (Source: ENERGY STAR)

As for where the old appliance goes/how the materials find new life: Fridges are a useful, complex array of materials that provide useful insights (and fodder for graphics). ENERGY STAR has a handy video about all the pieces and how they get reborn. (The shredding part about two-thirds of the way through isn’t for the faint of heart, particularly the appliance-loving heart, but just remember that it’s all for the greater good.) And the efficiency program in top-ranked Massachusetts not only gives the lowdown on fridge recycling (and a cool infographic), but offers free removal and $50 to boot.

That new-life-for-old idea can work for other things, too. If it’s lights you’re swapping out, here are a few ideas on what to do with old incandescent light bulbs (sock-darning, for example). For vehicles, check out UCS’s cradle-to-grave analysis.

Don’t you deserve lower costs, more comfort, less pollution, more…?

A new washer and dryer set might not fit under the Christmas tree, but that shouldn’t keep you from upgrading. Neither should concerns about what happens to the old one, or where the new one comes from.

As Cooler Smarter‘s section on “stuff we buy” lays out, there’s a lot to be said for buying less, and buying smart. But efficiency gains change the equation for some things.

If you feel you deserve new appliances, you just might be right. And if you think that upgrading to much higher efficiency ones and recycling the old might be a good move, you’d definitely be right.

Energy efficiency truly is the gift that keeps on giving, for both the wallet and the planet.

So act now—retailers are standing by!

Automakers’ Long List of Fights Against Progress, and Why We Must Demand Better

Vehicle pollution is a major issue for human health and the environment.

Today, we are releasing a report documenting the long, sordid past of the auto industry, which has fought regulation tooth and nail at every turn. From pollution control to seatbelts and air bags to fuel economy, the industry has spent the vast majority of the past seven decades doing whatever it can to wriggle out of government regulations, at the expense of the American public.

Cars have drastically improved, but not without a fight

Time for a U-turn looks at the tactics that automakers consistently deploy to fight against federal rules and standards that deliver better cars to the nation, tactics like exaggeration, misinformation, and influence. It also outlines concrete actions that automakers can take to leave behind their history of intransigence, and ensure that their industry rises to the challenges of the 21st century.

There is no doubt that the cars built today are significantly improved over the vehicles from the 1950s:

  • today’s safety standards require not just airbags and seatbelts but also features like crumple zones which help to minimize occupant injury;
  • tailpipe pollution standards have dramatically reduced the emissions of soot and smog-forming pollutants like volatile organic compounds and nitrogen oxides; and
  • fuel economy and global warming emissions standards have saved consumers about $4 TRILLION in fuel, sharply reducing both the demand for oil and the impact on climate change.

It’s clear that, when put to the task, automotive engineers have been more than capable of meeting whatever challenge is laid in front of them, resulting in a tremendous positive impact for the public.

Unfortunately, the industry has a long history of putting its lobbyists to work instead, promoting misleading claims and interfering politically to weaken or delay the standards that protect the public.

Automotive Chicken Little and a “Can’t Do” Attitude

One of the most frustrating aspects of the volumes of research I did for this report was the sheer repetition of the arguments.  According to the auto industry, any type of regulation would force them out of business…and yet they are still here.  Here are a few examples:

“[I]f GM is forced to introduce catalytic converter systems across the board on 1975 models . . . it is conceivable that complete stoppage of the entire production could occur, with the obvious tremendous loss to the company, shareholders, employees, suppliers, and communities.” – Ernie Starkman (GM) in his push to weaken the 1975 tailpipe emissions standards put in place by the Clean Air Act.

Not only was Starkman wrong that catalytic converters would shut down GM, but they proved so popular that GM actually used them in its advertising in 1975!

“Many of the temporary standards are unreasonable, arbitrary, and technically infeasible. . . . [If] we can’t meet them when they are published we’ll have to close down.” – Henry Ford II (Ford), responding to the first motor vehicle safety standards.

Clearly, Ford did not have to close down.  In fact, Ford proved more than capable of meeting these “unreasonable” requirements by using features like safety glass and seat belts, which are commonplace today.

“We don’t even know how to reach [35 miles per gallon by 2020], not in a viable way.  [It] would break the industry.”  — Susan Cischke (Ford), discussing the requirements of the Energy Independence and Security Act (EISA) that have led to the strong standards we have today.

Not only have strong fuel economy standards not broken the industry, but today it is thriving, with three consecutive years of sales over 17 million, an historic first for automakers.  And because of standards that drive improvements across all types of vehicles, we are not only on track to meet the requirements of EISA but are doing so in spite of a growing share of SUVs and pick-ups.

Fighting the Science

Of course, even worse than the repetitive “sky is falling” attitude that has proven false at every turn is the assault on science that automakers have used in the past, seeking to eliminate policy action by diminishing either the solution or the problem:

“We believe that the potential impact of [fuel economy standards] on the global issue of planetary warming are [sic] difficult to demonstrate.” – Robert Liberatore (Chrysler)

Believe it or not, after James Hansen’s Congressional testimony in 1988, there was bipartisan support on the Hill to address climate change, including from transportation-related emissions.  Mr. Liberatore used an argument straight out of today’s Heritage Foundation claiming that fuel economy standards in the United States won’t have an impact on a global problem.  This flew in the face of science then, just as it does now.

“The effects of ozone are not that serious . . . what we’re talking about is a temporary loss in lung function of 20 to 30 percent.  That’s not really a health effect.” – Richard Klimisch (American Automobile Manufacturers Association).

In 1996, the EPA was moving forward to strengthen air quality standards for ozone (related to smog) and soot (particulate matter).  In order to push back on this solution, automakers campaigned against there even being a problem to address, claiming that a little loss in lung function wasn’t a big deal.  Needless to say, the EPA ignored this ridiculousness and implemented stronger standards. However, even these stronger standards did not fully address the problem, pushing the Obama administration to move forward on strengthening the standards further still.

Breaking the Cycle?

After the Great Recession, automakers seemed to turn over a new leaf, working closely with the Obama administration to craft stringent fuel economy and emissions standards that would drive efficiency improvements across all types of vehicles, including SUVs and pick-up trucks.

“[The industry has] had a change of heart, but it’s fairly recent. We had data about consumers’ preferences about fuel economy, but we chose to ignore it; we thought it was an anomaly. But it’s by having a bias against fuel economy that we’ve put ourselves in the pickle we’re in now.”  — Walter McManus (ex-GM), speaking about a shift in automaker thinking.

Unfortunately, this awakening seems to have been short-lived, as automakers are now urging the current administration to weaken the standards with the same types of tactics we’ve seen before:

  • Automakers are using direct political influence, sending a letter to the Trump administration to withdraw EPA’s determination that the strong 2025 standards remain appropriate.
  • Automakers are again exaggerating the facts, claiming widespread catastrophe if the EPA does not alter the standards based on a widely debunked study and ignoring the findings of a more thorough (albeit still conservative) report they themselves funded because it doesn’t fit their messaging.
  • Industry is pushing to expand the midterm review to include lowering the 2021 standards, while acknowledging that doing so would have no impact on their product offerings and is simply a form of regulatory relief “any way we can get it” (Chris Nevers, Alliance of Automobile Manufacturers).

Despite talking a good game about being “absolutely committed to improving fuel efficiency and reducing emissions for our customers” (Bill Ford, 2017), Ford and other automakers are engaging in the same intransigence we’ve seen over the past seven decades.

It’s time for automakers to end this multidecadal war against regulation and start siding with progress.  To build back trust and leave this history behind, automakers must seize this opportunity and:

  • support strong safety and emissions standards and keep the promises they made to the American people to build cleaner cars;
  • distance themselves from trade groups that seek to undermine today’s standards, and make it clear that these groups do not speak for all automakers on issues of safety and the environment; and
  • cease spreading disinformation about the standards and their impacts.

Supermoons, King Tides, and Global Warming

The moon rises behind the US Capitol. Photo by: Jim Lo Scalzo, EPA

Were you, like me, dazzled by the supermoon this weekend? Did you also stare in a state of wonder at the bright and shiny orb of color illuminating the night? Supermoons happen when a full or new moon is at its closest point to Earth. While we can’t see them during the new moon, supermoons that occur during a full moon are indeed something to behold. They bring thoughts of the universe, of space, stars and planets.

Flooding in Boston wharf.
Photo by: MyCoast, Christian Merfeld

But while we are turning our heads to the sky, we may not realize what’s happening at our feet. The moon might be out in space, but its movement has real impacts here on Earth, specifically on the oceans. I am talking about tides.

Tides are all about massive bodies (the Moon, the Sun, and the Earth with its oceans) pulling on one another gravitationally. Tides are always higher at full and new moons — when the Moon, Earth, and Sun are aligned — and it follows that the gravitational pull is strongest when these bodies are at their closest during a supermoon. That’s why we saw some unusually high tides, called king tides, across the country (and beyond) at the same time that we experienced the supermoon.

So, while we may not realize it when looking at the supersized moon, it is causing a great deal of disruption to people’s lives in the form of tidal flooding, also called “nuisance flooding.” As stated in one of my colleague’s earlier blogs, this localized tidal flooding has been steadily increasing due to sea level rise. And climate change is behind the sea level rise rates being observed.

The recently released Climate Science Special Report (CSSR) states with very high confidence that “global mean sea level (GMSL) has risen by about 7–8 inches (about 16–21 cm) since 1900, with about 3 of those inches (about 7 cm) occurring since 1993,” and that the rise will continue throughout the rest of the century at accelerated rates. Rates of sea level rise in many locations along the coast of the U.S. have been higher than the global average, and nuisance flooding is now 300% to more than 900% more frequent than it was 50 years ago in many of those locations.

Many cities have initiatives to track tides and some are specifically geared toward monitoring king tides. Volunteers with “Catch the King,” an initiative by the Virginia Institute of Marine Science, can use a smartphone app to map flooded areas in Hampton Roads in real time. The group then uses the collected data to improve predictions and forecasts, and to better understand the risks from tidal flooding.

Similarly, the “My Coast” project asks Massachusetts residents to submit pictures of areas inundated by king tides to catalogue the effects of these events on the state’s coastal areas. Ultimately, these types of initiatives are geared towards improving resilience and preparedness, informing residents of impassable areas and floodwater reach.

The emissions already released into the atmosphere have committed us to a certain amount of sea level rise through midcentury, simply because these warming gases remain in the atmosphere for a long time. However, decisions made in the next few years will determine how much the sea will rise in the second half of the century – reducing emissions can slow the rate of rise and potentially save hundreds of coastal communities from tidal flooding.

So next time you look up at a supermoon (in January 2018), while still marveling at the incredible phenomenon you are witnessing, remember to also look down. It may just make you think about the moon in a completely different way – and how as a nation, we need to do more to reduce emissions and prepare for coastal flooding.

It’s World Soils Day: Celebrate Soil, Carbon, and the Opportunities Right Under Our Feet

These days, stories about soil health and regenerative farming seem to be catching on, so much so that it’s almost hard to keep up, at least for the avid soil geek.  The New York Times and the Huffington Post both featured op-eds just last week explaining why soil is worth getting excited about, while tales of soil health and science from North Dakota to New England were recently shared by other sources.  Yesterday, NPR hosted an hour-long panel on soil health. And that’s just a short list.

Maybe the rush of soil-slanted stories has something to do with today being World Soils Day. Or maybe it’s because soils and agriculture finally got some love at the latest climate convention.  Or perhaps it has to do with the growing list of states that are working towards healthy soils policies, or that the conversation surrounding the next Farm Bill has actually included soil health.

Or, just maybe, it’s because people are figuring out that the soils beneath our feet, and the farmers and ranchers that tend to them, need more of our attention.  After all, healthy soils are the living, breathing ecosystems that help grow our food, clean our water, store carbon, and reduce risks of droughts and floods.  Together, soils and their stewards can produce food while making agriculture part of the solution to several challenges (including climate change). Let me explain.

Soils stash carbon and deliver services

Some of the amazing features of soils that are finally being celebrated are not new. For some time, scientists have known that soils store a lot of carbon (about three times more than the atmosphere), and that carbon-rich soils tend to hold more water.  They have also known that soil varies a lot, even across small distances, that it changes over time, and that it is affected by management practices.  But we also know that there’s a lot we don’t know.  Thankfully, that’s starting to change.

Getting the numbers right on how soil can fight a changing climate (because we can’t afford not to)

Even just in the past year, soil science – including soil carbon science –  has advanced, pushed along by new tools, interests, and urgency.  A lot of the urgency has come as climate change picks up the pace. Today, scientists say that we can’t afford to choose between reducing emissions and sequestering carbon – we must do both.  That puts a spotlight (and pressure) on soils.

Fortunately, new science is rapidly uncovering more details about soils.  For example, pivotal papers have discussed how specific soil-based management practices could help mitigate climate change, and how soil carbon sequestration could be scaled up in the US and around the globe to achieve significant outcomes. Within the past months, key papers demonstrated that the majority (75%) of the organic carbon in the top meter of soil is directly impacted by management and that croplands may hold particular potential to be managed for carbon sequestration, but that soils continue to be at risk.

It’s important to note that while many studies have stressed opportunities in soils, others have questioned them.  For example, some research has suggested that soils may not be able to hold as much carbon as some scientists think, while other research has indicated that links between soil carbon and water are not as strong as previously thought.  Other research has questioned whether certain practices (e.g., abandoning cropland) can bring expected benefits.

In my opinion, all these studies just make more research more important.  Getting the numbers right will help us to find, and fine-tune, the best solutions for healthier, more resilient soil. But as we work out these details, we also need to act – and fast.

The role of farmers and ranchers in bringing out the best in soils, for better farms and futures

Fortunately, many farmers and ranchers already know how to build soil health (and carbon) on their land – and they are taking action (lucky for us, because the health of the soil is in their hands). Farmers and ranchers like Gabe Brown (ND), David Brandt (OH), Will Harris (GA), Ted Alexander (KS), and Seth Watkins (IA), just to name a few, have been experimenting for years with ways to build soil health for more resilient land.  New research from South Dakota shows that farmers are adopting cover crops and other practices in large part to build soil health.  And a growing list of companies and non-profits have supported a standardized definition of regenerative agriculture, suggesting that these healthy soils practices are gaining even more traction.

Recognizing the soils and stewardship beneath food “footprints”

As important as soil carbon, health, and stewardship are to ensuring farms are functioning at their best, it’s surprising that we think so little about them.  There is a larger discussion going on around sustainable diets and the notion that food has an environmental “footprint,” but the fact is that most of the studies that seek to quantify the carbon (or water, or land) footprints of food items haven’t accounted for the role of soil management and stewardship. Therefore, while the conversation about the impact of consumers’ food choices has been an important starting point, we also need to understand how the decisions made by farmers affect the world around us. That means bringing soil carbon to the table, and the sooner the better. With the growing appreciation for soil health science, practice, and story-telling, I think we might be getting somewhere.

P.S.  Prefer a little video inspiration? There’s plenty to choose from if you want to learn the basics of soil organic carbon, how “dead stuff” is key to the food chain, how healthy soils reduce flood risk, or more about the 4 per mille campaign, which puts soils at the forefront of climate change solutions.

Pruitt’s War on the Planet and the EPA—and What Congress Can Do About It

We have now endured almost a year with Scott Pruitt as the head of the Environmental Protection Agency (EPA). His tenure is unprecedented—a full frontal assault on the agency he heads, and a retreat from the mission he is charged by law to advance. And thus far, Administrator Pruitt has not had to account for his actions.

But an accountability moment is nearing: for the first time since his nomination, Mr. Pruitt will appear before Congress to offer an update on the status of work at the agency—first before the House Energy and Commerce Committee on December 7, and next before the Senate Environment and Public Works Committee on January 31. These oversight hearings offer a critical opportunity for leaders on both sides of the aisle to ask tough questions, demand responsive information rather than platitudes, and voice their disapproval about how Administrator Pruitt has run the EPA.

Here are key topics for our elected representatives to focus on:

Mr. Pruitt’s empty “back to basics” promise

During his nomination hearing last January, Administrator Pruitt knew he would be questioned about his commitment to EPA’s mission and his repeated lawsuits against EPA when he served as Oklahoma’s attorney general. He came equipped with a clever counter-narrative. He claimed that he would make EPA a more effective agency by de-emphasizing “electives” such as climate change. He promised to steer the agency “back to basics” by focusing on core responsibilities such as enforcing clean air and water laws and cleaning hazardous waste sites.

Members of Congress should compare that promise to Administrator Pruitt’s actions over the past year. Almost immediately after taking office, he signed off on a budget that would cut EPA by 31 percent, despite the absence of any financial exigency requiring such draconian action. A few weeks later, he approved plans to lay off 25 percent of the agency’s employees and eliminate 56 programs. The proposed budget cuts target not only items Pruitt may think of as electives, but also basic bread-and-butter functions. For example, he proposed to strip $330 million from the $1.1 billion Superfund program and cut funding for the Justice Department to enforce cases.

And, in a clear contradiction of his testimony that he would work more cooperatively and effectively with state environmental protection agencies, he proposed to cut the grants that EPA gives to states for enforcement by 20 percent.

We are already starting to see the results of this effort to hollow EPA out from within. Experienced and talented career staff are leaving the agency in droves. The Chicago EPA office, for example, has already lost 61 employees “who account for more than 1,000 years of experience and represent nearly 6 percent of the EPA’s Region 5 staff, which coordinates the agency’s work in six states around the Great Lakes.” This means, among other things, a smaller number of inspectors and likely an increased number of businesses operating out of compliance with clean air and water laws.

With a smaller and less experienced staff, it is no surprise that the EPA has seen a roughly 60 percent reduction in the penalties it has collected for environmental violations compared with the Obama, Bush, and Clinton administrations at comparable stages in their respective terms. And while the Obama administration cleaned up and de-listed 60 hazardous waste sites and added 142 sites over eight years, the EPA under Mr. Pruitt is so far well off that pace, deleting just two sites and adding only seven.

Perhaps most troubling, civil servants have been deeply demoralized by the combination of proposed cuts and constant statements by the president and Administrator Pruitt denigrating the agency as a job killer, which it is not. As one staffer said in a recent publication entitled EPA Under Siege: “I think there’s a general consensus among the career people that, at bottom, they’re basically trying to destroy the place.”

Said another: “Quite honestly, the core values of this administration are so divergent from my own, I couldn’t pass up the opportunity [for retirement]….I found it difficult to work for an agency with someone who is so disrespectful of what we do and why we do it.”

Members of Congress should question Mr. Pruitt about his “back to basics” promise. They should ask why he advocated for such deep budget cuts, layoffs, and buyouts, and demand that he explain with specificity how the agency can possibly do better with such drastically reduced resources. Congress should also require Mr. Pruitt to provide clear, apples-to-apples comparisons of the record of environmental enforcement during his tenure with that of his predecessors, as measured by inspections, notices of violation, corrective actions, fines and litigation.

Administrator Pruitt’s “Law and Order” charade

Administrator Pruitt put forth a second narrative during his confirmation hearing. He promised  to restore “law and order” to EPA, claiming that the EPA had strayed beyond its statutory authority during President Obama’s tenure.

The record tells a very different story. In less than a year, Mr. Pruitt’s actions have repeatedly been found by courts to be “unlawful,” “arbitrary,” and “capricious.”

One example is particularly instructive. At the end of the Obama administration, the EPA issued a final rule requiring operators of new oil and gas wells to install controls to capture methane, a highly potent contributor to global warming. The rule was set to go into effect in early 2017. Administrator Pruitt unilaterally put the rule on hold for two years to allow EPA to conduct a sweeping reconsideration. This, the court found, was blatantly illegal, because it attempted to change the compliance date of a rule without going through the necessary rulemaking process.

Unfortunately, this tactic has become a pattern, as Mr. Pruitt has sought to put on hold many other regulations he doesn’t care for, including rules intended to reduce asthma-causing ozone pollution and toxic mercury contamination in water supplies, and a requirement that state transportation departments monitor greenhouse gas emission levels on national highways and set targets for reducing them. Environmental nonprofit organizations and state attorneys general have had to sue, or threaten to sue, to stop this illegal behavior.

The EPA’s lawlessness is not confined to official acts, but also concerns the administrator personally. In an obvious conflict of interest, Mr. Pruitt played a leading role in the EPA’s proposed repeal of the Clean Power Plan, the nation’s first-ever limit on carbon dioxide pollution from power plants. Yet, just a few months before taking over at the EPA, Mr. Pruitt had led the legal fight against the rule as Oklahoma’s attorney general.

In effect, he played the role of advocate, then judge and jury, and ultimately executioner, all in a matter of a few months.

In addition, Administrator Pruitt is under investigation for misusing taxpayer dollars for $58,000 worth of private chartered flights, and has wasted $25,000 of taxpayer money to build himself a secret phone booth in his office.

Congress needs to ask Mr. Pruitt how he can be said to have restored respect for the law at the EPA when the agency (and perhaps Administrator Pruitt personally) has been flouting it. They need to ask him what role he played in the proposed repeal of the Clean Power Plan, and how he can square his conflicting loyalties to the state of Oklahoma (which he represented as an attorney) and to the American people (whom he is supposed to represent as head of the EPA). Congress should also investigate his personal use of taxpayer funds and his penchant for cutting corners on legally mandated processes.

An “Alice in Wonderland” approach to science

The EPA’s five decades of success rest on its longstanding commitment to the best available science, and to its well-trained professional scientists who deploy that science. Administrator Pruitt has taken a wrecking ball to this scientific foundation.

First, he ignores staff scientists when their conclusions do not support his deregulation agenda. On the crucial scientific question of our time—climate change and what is causing it—Mr. Pruitt says he does not believe carbon dioxide is a primary cause. Of course, this statement runs directly counter to the conclusions of EPA scientists (as well as those of the recently issued US Global Change Research Program Climate Science Special Report). And, in one of his first policy decisions, Administrator Pruitt overturned EPA scientists’ recommendation to ban a pesticide (chlorpyrifos) that presents a clear health risk to farmers, children, and rural families.

But Mr. Pruitt is not only ignoring staff scientists, he is also sidelining and suppressing advice from highly credentialed and respected scientists who advise the EPA. Last summer, he sacked most of the members of the Board of Scientific Counselors, a committee of leading scientific experts that advises the EPA about newly emerging environmental threats and the best use of federal research dollars. And he has used this as an excuse to suspend the board’s work indefinitely.

More recently, he issued a new policy stating that the agency’s key outside Science Advisory Board will no longer include academic scientists who have received EPA grants in the past, under the purported theory that the grants render them less objective. Yet Administrator Pruitt will fill these posts with scientists who are paid exclusively by industry, and with scientists who work for state governments that receive grants from the EPA. This new policy has enabled Mr. Pruitt to fill these boards with scientists who are clearly aligned with industry, scientists such as Michael Honeycutt, who has railed against EPA limits on soot and even testified before Congress that “some studies even suggest PM [particulate matter] makes you live longer.”

Administrator Pruitt’s attack on science also includes the EPA deleting vital information from agency websites. For example, the EPA has deleted key information about the Clean Power Plan, even though the agency is in the middle of a public comment process on whether to repeal that rule, and what to replace it with. The EPA has also eliminated information on the “social cost of carbon” and the record of its finding that the emission of greenhouse gases endangers public health.

These deletions seem designed to make it more difficult for the scientific community, and members of the public, to access the scientific information that stands in the way of Mr. Pruitt’s agenda.

Congress needs to probe deeply into these multiple ways that Administrator Pruitt has diminished the role of science at EPA. Representatives and senators should make him explain why he thinks he knows more about climate science and the harms of pesticides than his scientists do. They should demand that he explain why it is a conflict of interest for academic scientists who receive EPA grants to advise the EPA, but not for state and tribal scientists who receive these grants, or industry-paid scientists. And Congress must find out why so much valuable information about climate science, the social cost of carbon, and other matters has vanished from EPA websites.

Making the world safe for polluters

In December 2015, more than 190 countries, including the United States, approved an agreement in Paris to finally tackle the greatest challenge of our time—runaway climate change. Donald Trump pledged to pull the United States out of this agreement when he ran for office, but for the first six months of his term he did not act on the pledge, and there was an internal debate within his administration.

Mr. Pruitt led the charge for the US withdrawal from that agreement. He has followed up on this by going after almost every single rule the Obama administration had put in place to cut global warming emissions. This includes the proposed repeal of the Clean Power Plan, the “re-opening” of the current fuel economy standards that are now on target to roughly double cars’ fuel efficiency by 2025, the repeal of data gathering on methane emissions from oil and gas facilities, and tampering with how the EPA calculates the costs of carbon pollution, among many other actions.

But Administrator Pruitt’s rollback of safeguards is not limited to climate-related rules; it also includes cutting or undermining provisions that protect us all from more conventional pollutants. He has started the process of rescinding rules that limit power plants from discharging toxic metals such as arsenic, mercury and lead into public waterways; regulate the disposal of coal ash in waste pits near waterways; and improve safety at facilities housing dangerous chemicals.

The breadth and ferocity of these rollbacks is unprecedented. Congress needs to push back hard. For starters, representatives and senators need to demand that Mr. Pruitt explain how it fits within his job duties to lobby the president against one of the most important environmental protection agreements ever reached. Similarly, they need to highlight the impacts on human health and the environment from all of the rollbacks that Administrator Pruitt has initiated, and force him to explain how the EPA can be advancing its mission by lowering environmental standards.

Congressional oversight is needed now more than ever

Many aspects of Mr. Pruitt’s tenure are truly unprecedented. However, he’s not the first EPA administrator to display fundamental disrespect for the agency’s mission. As one legal scholar has noted, during the Reagan administration there were “pervasive” congressional concerns that former Administrator Anne Gorsuch and other political appointees at the agency “were entering into ‘sweetheart deals’ with industry, manipulating programs for partisan political ends, and crippling the agency through requests for budget reductions.”

Congressional oversight back then was potent: among other things, Congress demanded that the EPA hand over documents about the apparently lax enforcement of the Superfund law requiring cleanups of hazardous waste sites. When the EPA head refused to comply with those demands, Congress held Administrator Gorsuch in contempt. Senators, including Republicans such as Robert Stafford and John Chafee, publicly voiced their alarm. Eventually, President Reagan decided Ms. Gorsuch was a liability, and he replaced her with William Ruckelshaus, EPA’s first administrator under President Nixon, and a well-respected moderate who stabilized the agency.

These oversight efforts were “the decisive factor in causing Ms. Gorsuch, as well as most of the other political appointees at the agency, to resign.”

It may be too much to expect that the current, polarized Congress will exhibit the same level of tough, bipartisan oversight it did in the Reagan era. Yet, bipartisan support for vigorous environmental protection remains strong today and some Republican leaders have already called upon Administrator Pruitt to step down. It is high time for Congress to do what it can to ensure that Mr. Pruitt’s EPA does not continue to put the interests of a few industries ahead of the clean air, water, and lands that the agency is mandated to protect.

The EPA Knows Glider Trucks Are Dangerously Dirty: It’s Time to Keep Them Off the Road

That shiny new truck could have a 15-year-old engine that doesn’t meet today’s standards. Photo: Jeremy Rempel. CC-BY-ND 2.0 (Flickr)

Today, I am speaking at a public hearing at EPA to push back on the agency reopening a “zombie truck” loophole. I wrote about the political motivations behind the attack on public health previously, but we now have even more information about exactly how dirty these trucks are from an interesting source: the EPA itself.

A reminder about what is at stake

Glider vehicles are brand new trucks that are powered by a re-manufactured engine.  While they look like every other new truck on the outside, on the inside they have engines which were manufactured under weaker pollution standards than other new trucks. Because they are resurrecting these older, more highly polluting engines from the dead, they are sometimes referred to as “zombie trucks.”

While initially glider trucks were used to replace vehicles whose bodies had been damaged, more recently a cottage industry has sprung up selling about 20 times more trucks than historic levels solely to bypass pollution restrictions.

In the “Phase II” heavy-duty vehicle regulations, the EPA closed the loophole that allowed these awful pollution spewers to be manufactured in the first place. However, Scott Pruitt’s EPA has proposed repealing this action, reopening the loophole primarily to benefit a company with political ties.

Dirty science for dirty trucks

In support of this repeal, Fitzgerald Trucks (the manufacturer requesting the loophole be reopened) submitted the results of a slapdash series of tests it claimed were from independent researchers.  However, the tests were paid for by Fitzgerald and conducted using Fitzgerald’s equipment in Fitzgerald’s facilities.  The results of the tests were incomplete and indicated that the work was sub-standard. However, we didn’t know just how unscientific the research was until EPA technical staff posted a memo detailing a meeting with the researchers.  Here are just a few of the absurd shortcomings in the tests:

  • Researchers did not use the industry-standard test procedure, so their numerical results cannot be directly compared with regulatory requirements or with literally any other research in the technical literature.
  • Researchers did not actually take samples of soot during testing, despite the fact that soot is not just carcinogenic but is one of the specific pollutants at issue with these engines, responsible for much of their detrimental health impact.  Instead, they “visibly inspected” the test probe. Yes, you read that right–they just looked at it to see if it was dirty.
  • Researchers did not test under any “cold start” conditions. As when you first turn on your car, a cold start is when the engine emits elevated levels of pollution, which is why it is a standard part of regulatory tests for both cars and trucks.

Believe me when I tell you that I could not get my doctorate if my lab work were of that low quality.

Ignoring the EPA’s own technical data

While pointing to the subpar Fitzgerald / Tennessee Tech data, the EPA was aware that much higher quality testing was underway at its own facilities.  Instead of waiting for those tests to be completed, the politicos at EPA moved forward with the proposed repeal anyway.

Well, the results from those tests are in, and they are at least as bad as the EPA’s technical staff feared.  In fact, they may be even worse:

  • According to the test results, it appears that these engines exceed even the legal limits they were originally designed to meet.  This means that the “special programming” Fitzgerald claims to do to the engines may result in greater fuel economy, but it results in greater pollution, too.
  • The amount of soot exhausted by these engines is so large that it caused a fault in the EPA’s equipment, after which the EPA had to adjust the throughput.  A good comparison is when you have the volume adjusted for a TV program you like and then suddenly a really loud commercial comes on…except now imagine that commercial just blew out your speakers.
  • The two collectors on the left of the accompanying image are what happened when the EPA first tried to collect the pollution from these vehicles; the two collectors on the right are what they looked like before the test.  Now imagine what that experience must be like for the lungs of a child with asthma.

The EPA had already projected that every year of production of glider vehicles at today’s levels would result in as many as 1600 premature deaths–this new data suggests that number could be even higher.

The science is clear, so closing this loophole should be the easy thing to do.

I am speaking at today’s hearing against this repeal because I want to make sure EPA listens to its own scientists and closes this loophole, abiding by its mission statement to protect human health and the environment.  And today I will be among a chorus of dedicated citizens reminding the agency of its mission.


Vehicle Fuel Economy Standards—Under Fire?

Photo: Staff Sgt. Jason Colbert, US Air Force

Last year, transportation became the sector with the largest CO2 emissions in the United States. While the electricity industry has experienced a decline in CO2 emissions since 2008 because of a shift from coal to natural gas and renewables, an equivalent turnaround has not yet occurred in transportation. Reducing emissions in this sector is critical to avoiding the effects of extreme climate change, and the Corporate Average Fuel Economy (CAFE) and Greenhouse Gas (GHG) emissions standards are an important mechanism to do so.

The most recent vehicle standards, which were issued in 2012, are currently undergoing a review. The Department of Transportation (DOT) is initiating a rulemaking process to set fuel economy standards for vehicle model years 2022-2025. At the same time, DOT is also taking comments on its entire policy roster to evaluate their continued necessity (including the CAFE standards).

A number of criticisms have been raised about fuel efficiency standards, some of which are based more in confusion and misinformation than fact. An intelligent debate about the policy depends on separating false criticisms from those that are uncertain and those that are justified.

In fact, as new research I did with Meredith Fowlie of UC Berkeley and Steven Skerlos of University of Michigan shows, the costs of the standards could actually be significantly lower than other policy analyses have found.

Costs and benefits of the regulations

What my co-authors and I have found is that automakers can respond to the standards in ways that lower the costs and increase the benefits.

Many policy analyses do not account for the tradeoffs that automakers can make between fuel economy and other aspects of vehicle performance, particularly acceleration. We studied the role that these tradeoffs play in automaker responses to the regulations and found that, once they are considered, the costs to consumers and producers were about 40% lower, and reductions in fuel use and GHG emissions were many times higher.

The study finds that the ability of automakers to trade off fuel economy and acceleration makes both consumers and producers better off. A large percentage of consumers care more about paying relatively lower prices for vehicles than about having faster acceleration. Selling relatively cheaper, more fuel-efficient vehicles with slightly lower acceleration rates to those consumers allows manufacturers to meet the standards with significantly lower profit losses. Consumers who are willing to pay for better acceleration can still buy fast cars.

Debunking some common criticisms

One common criticism is that the regulations mandate fuel economy levels that far exceed any vehicles today. This misconception stems from the frequently quoted figure when the regulations were first issued that they would require 54.5 mpg by 2025. But, the regulations do not actually mandate any fixed level of fuel economy in any year. The fuel-economy standards depend on the types of vehicles that are produced each year. If demand for large vehicles is up, the standards become more lenient; if more small vehicles are sold, they become more strict. The 54.5 mpg number was originally estimated by EPA and DOT in 2012 when gas prices were high. EPA has since revised it to 51.4 mpg to reflect lower gas prices and higher sales of large vehicles. Taking into account flexibilities provided in the regulations and the fact that this number is based on EPA’s lab tests, which yield higher fuel economy than drivers experience on the road, the average target for 2025 is equivalent to approximately 36 mpg on the road. Fueleconomy.gov lists 20 different vehicle models that get at least this fuel economy today.
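
To see how this sliding scale works in practice, here is a rough, purely illustrative Python sketch. The sales figures and per-vehicle targets are invented, and this is a simplification of the real compliance calculation, in which each model gets a footprint-based target and the fleet requirement is a sales-weighted (harmonic-mean) average of those targets:

    # Illustrative sketch, not the official CAFE formula: each model gets a
    # footprint-based mpg target, and the fleet-wide requirement is the
    # sales-weighted harmonic mean of those targets, so the standard floats
    # with the mix of vehicles actually sold.  All numbers are invented.

    def fleet_target(models):
        """Sales-weighted harmonic-mean mpg target for a fleet."""
        total_sales = sum(m["sales"] for m in models)
        return total_sales / sum(m["sales"] / m["target_mpg"] for m in models)

    car_heavy_mix = [
        {"name": "compact car", "sales": 700_000, "target_mpg": 55.0},
        {"name": "large SUV", "sales": 300_000, "target_mpg": 40.0},
    ]
    suv_heavy_mix = [
        {"name": "compact car", "sales": 300_000, "target_mpg": 55.0},
        {"name": "large SUV", "sales": 700_000, "target_mpg": 40.0},
    ]

    print(f"car-heavy sales mix target: {fleet_target(car_heavy_mix):.1f} mpg")
    print(f"SUV-heavy sales mix target: {fleet_target(suv_heavy_mix):.1f} mpg")
    # The SUV-heavy mix produces a lower (more lenient) fleet target, which is
    # why the oft-quoted 54.5 mpg figure was an estimate, never a fixed mandate.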

Another common but unjustified criticism of the standards is that they push consumers into small vehicles. The regulations were specifically designed to reduce any incentive for automakers to make vehicles smaller. The standards are set on a sliding scale of targets for fuel economy and GHG emissions that depend on the sizes of the vehicles. As a result, an automaker that sells larger vehicles has less stringent fuel economy and emissions targets than one that sells smaller vehicles. Research has shown that the policy likely creates an incentive for automakers to produce bigger vehicles, not smaller.

Two easy ways to strengthen the fuel economy standards

There are, of course, advantages and drawbacks to any policy, including today’s vehicle standards, which focus entirely on improving the efficiency of new vehicles.  Fortunately, there are improvements that can be made to the CAFE and GHG regulations to increase their effectiveness and lower costs.

The first is ensuring that automakers that violate the standards pay very high penalties. Companies that cheat steal market share from those that follow the standards, effectively raising the regulatory costs for the automakers that are playing fair.

The second improvement involves the way automakers are able to trade “credits” with each other.  These credits were created to equalize regulatory costs across companies. So, if one automaker finds it relatively easy to reduce emissions, it can reduce more than its share and sell credits to another automaker having trouble reducing emissions. This trading is currently negotiated individually by each pair of automakers, which raises the costs of the transaction. Creating a transparent market to trade these credits would help to achieve the target emission reductions at lower costs.
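
As a toy example of why trading helps, suppose (with invented numbers) that one automaker can over-comply cheaply while another faces high compliance costs. A credit sale leaves the total emission reductions unchanged but shifts them to the company that can achieve them most cheaply:

    # Toy example with invented costs; not modeled on any actual automakers.
    shortfall_b = 100        # credits automaker B is short of its target
    cost_a_per_credit = 30   # A's cost to over-comply by one credit ($)
    cost_b_per_credit = 80   # B's cost to comply entirely on its own ($)
    trade_price = 50         # negotiated price per credit ($)

    cost_without_trading = shortfall_b * cost_b_per_credit                 # B goes it alone
    cost_a = shortfall_b * cost_a_per_credit - shortfall_b * trade_price   # A abates, sells credits
    cost_b = shortfall_b * trade_price                                     # B buys credits instead
    print(f"without trading: ${cost_without_trading}")                     # $8,000
    print(f"with trading:    ${cost_a + cost_b} total")                    # $3,000
    # The same reductions happen either way; a transparent market would simply
    # make finding that trade price cheaper than one-off negotiations.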

The Department of Transportation (DOT), which implements the Corporate Average Fuel Economy (CAFE) standards, is currently soliciting comments on regulations “that are good candidates for repeal, replacement, suspension, or modification.” The comment period ends December 1.

 

Dr. Kate Whitefoot is an Assistant Professor of Mechanical Engineering and Engineering and Public Policy at Carnegie Mellon University. She is a member of the NextManufacturing Center for additive manufacturing research and a Faculty Affiliate at the Carnegie Mellon Scott Institute for Energy Innovation. Professor Whitefoot’s research bridges engineering design theory and analysis with that of economics to inform the design and manufacture of products and processes for improved adoption in the marketplace. Her research interests include sustainable transportation and manufacturing systems, the influence of innovation and technology policies on engineering design and production, product lifecycle systems optimization, and automation with human-machine teaming. Prior to her current position, she served as a Senior Program Officer and the Robert A. Pritzker fellow at the National Academy of Engineering where she directed the Academy’s Manufacturing, Design, and Innovation program.

 

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Abnormal and Catastrophic 2017 Hurricane Season Finally Over

The official end of the 2017 North Atlantic hurricane season, November 30th, has finally arrived.  In the season’s wake many are mourning the loss of loved ones, repairing their homes, and still waiting for electricity to return.

Figure 1. North Atlantic hurricane and tropical storm tracks of the 2017 season. Preliminary, as the November storm tracks are not yet updated.

2017 North Atlantic Hurricane season was not normal

The first named storm of the 2017 Hurricane Season, tropical storm Arlene, began in early April.  Harvey, Irma, and Maria are the names communities will remember long after they became major hurricanes.

Six of the ten hurricanes were major (category 3 or stronger).  Recalling the headlines and seeing the damages, the season was catastrophic (Figure 1).  Crunching the numbers on a measure of power – Accumulated Cyclone Energy (ACE) – confirms that impression.  September 2017 ACE was more than three times the historical September average over 1981-2000.  Scientists are piecing together the factors that contributed to such an intense hurricane season.  Attribution studies (studies that attribute the relative roles of human and natural factors in the occurrence of extreme weather) have already been published about a specific hurricane from 2017.
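
For readers who want the mechanics, ACE is computed by squaring a storm’s maximum sustained wind (in knots) at each six-hour advisory while the system is at tropical-storm strength or above, summing those values, and dividing by 10,000. A minimal sketch, with invented wind histories:

    # Minimal ACE sketch.  Wind values are invented for illustration only.
    def ace(six_hourly_winds_kt):
        """Accumulated Cyclone Energy from 6-hourly max sustained winds (knots)."""
        return sum(v ** 2 for v in six_hourly_winds_kt if v >= 35) / 1e4

    short_lived_storm = [35, 40, 45, 40, 30]
    long_major_hurricane = [50, 75, 100, 130, 150, 145, 120, 90, 60]

    print(f"short-lived storm ACE:    {ace(short_lived_storm):.2f}")
    print(f"long major hurricane ACE: {ace(long_major_hurricane):.2f}")
    # Because wind enters squared and every six-hour period counts, long-lived,
    # intense storms like Irma and Maria dominate a season's total.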

Some extraordinary conditions of this hurricane season:

Figure 2. Warmer-than-average (relative to 1985-2012) sea surface temperatures (SSTs) at the time tropical storm Ophelia transitioned into a hurricane south of the Azores Islands.

Warmer Seas – A big factor contributing to the intensification of Harvey, Irma, and Maria was warmer-than-average sea surface temperature (SST) conditions.  Another surprising consequence of the warmer-than-average SSTs was that the region favorable for hurricanes extended beyond the typical hurricane regions of the North Atlantic Ocean.  This allowed hurricane Ophelia to thrive at highly unusual latitudes and longitudes, making it the easternmost hurricane to date (see storm track number 15 in Figure 1).  The extratropical storm Ophelia made landfall in Ireland and brought waves that battered the UK coast, drenched northern Europe, and blew winds that fueled lethal wildfires in southern Europe.  Research suggests that heat-trapping emissions can extend the SST region favorable for hurricanes and increase the chances for these storms to head toward western Europe.

Figure 3. Record-breaking precipitation dropped along the Texas and Louisiana coastal region.

Record-Breaking Precipitation – Hurricane Harvey dropped a whopping 60 inches of rain on Nederland, Texas, east of Houston, breaking the 1950-2017 record for state maximum precipitation from tropical cyclones and their remnants. Hurricane Harvey’s average accumulated rainfall over Houston (840 mm or 33 inches) was exceptional.  There was so much floodwater in Houston that it sank the landscape by 2 centimeters (~0.8 inch) in some places.  Assuming the precipitation area of individual hurricanes remains similar, Kerry Emanuel found that a storm with average accumulated precipitation greater than 500 mm (19.7 inches) was a once-in-100-years event over 1981-2000.  It becomes a once-in-16-years event by 2017 and a once-in-5.5-years occurrence by the end of this century under an unabated emissions scenario.
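
Return periods are easier to grasp as odds over a fixed horizon. Under the standard assumption that each year is independent, the chance of at least one such event in N years is 1 - (1 - 1/T)^N. The short sketch below applies that formula to the return periods quoted above over an illustrative 30-year span (the horizon is my choice, not the study's):

    # Converting "once in T years" into the chance of at least one occurrence
    # over an N-year horizon, assuming independent years.  The 30-year horizon
    # is an illustrative choice (roughly a mortgage), not from the study.
    def chance_of_at_least_one(return_period_years, horizon_years):
        annual_p = 1.0 / return_period_years
        return 1.0 - (1.0 - annual_p) ** horizon_years

    for label, T in [("1981-2000 climate", 100),
                     ("2017 climate", 16),
                     ("end of century, unabated emissions", 5.5)]:
        p = chance_of_at_least_one(T, 30)
        print(f"{label}: {p:.0%} chance of a >500 mm storm within 30 years")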

Catastrophic Wind – A hurricane category is defined by sustained winds, and the associated consequences are described by words such as “devastating damage” for category 3 and “catastrophic damage” for categories 4 and 5.  Hurricanes Irma and Maria had unusually high peak 1-minute sustained winds, placing them among the North Atlantic hurricanes with the strongest winds in the historical record.  Those on the ground during landfall withstood ferocious winds.  Hurricane Maria was the first category 5 (157 miles per hour or higher sustained winds) hurricane to make landfall in Dominica, a small Caribbean island southeast of Puerto Rico. It made landfall yet again, but this time as a category 4 (130-156 miles per hour sustained winds), in Puerto Rico.  Similarly, hurricanes Harvey and Irma made landfall as category 4 storms in Texas and Florida, respectively.

 

How does an abnormal hurricane season become disastrous?

The Intergovernmental Panel on Climate Change has pointed to three major factors that can combine to influence the risk of an extreme event disaster.  A weather and climate event plus communities exposed to that event plus the social vulnerability all combine to influence disaster risk.

Social vulnerability refers to the resilience of communities when confronted by external stresses.  A few examples follow of how exposure and social vulnerability intersected with hurricanes that are changing in a warming world.

Many Caribbean residents were among those exposed to these powerful hurricanes, which made repeated landfall on numerous islands this year.

People have lived on Barbuda for over three centuries, and for the first time the risk was so grave that the entire island’s population fled to avoid exposure to Hurricane Irma.  They now confront rebuilding a community and civilization on Barbuda.

It is estimated that 3.7 million people, over a million households, and nearly 260 billion dollars in assets in Puerto Rico were exposed to wind impacts from Hurricane Maria. Houston, the fourth most populated U.S. city in 2017, was exposed to the precipitation deluge (see Figure 3) from Hurricane Harvey.

An entire metropolitan region, island or county might be exposed to an abnormal hurricane season, but not all of those in the path of a storm are equally vulnerable to its effects.

Differences in vulnerability have already emerged in the aftermath of the 2017 season that reflect in part the history of a place, people, and infrastructure.  Additional factors include communication about the hurricane risks, first response and long-term disaster management. For example, elderly people perished in a Florida nursing home after days of being stuck in sweltering heat following the power outage caused by Hurricane Irma.

The U.S. Health Assessment found that people with chronic medical conditions are more likely to have serious health problems during excessive heat than healthy people.  The elderly in this case depended on others for their care.  As the USA Today Editorial Board put it, “In such a climate, air conditioning is not a luxury for elderly people; it’s a necessity.”

The tragic loss of life from Hurricane Maria in Puerto Rico is estimated to be similar to that from Hurricane Katrina. This large toll is in part due to the vast numbers of U.S. citizens and residents in Puerto Rico still suffering from a lack of safe drinking water or still without access to power.

Families are piecing together their lives after the devastating loss of a family member, or coping with the absence of a child who had to evacuate to continue school in a safer place during a protracted recovery period.

2017 is the most expensive Atlantic hurricane season to date, with damages already racking up to over $200 billion.  The epic disasters of the 2017 hurricane season hold important lessons, which should be taken into account when planning steps to better protect lives from hurricanes and their aftermath.  In turn, those recovering from this disastrous season can draw on the lessons already learned from Hurricanes Sandy, Katrina, and Andrew.

These lessons can help communities rebuild toward climate resilience with principles that are scientifically sound, socially just, fiscally sensible, and adequately ambitious.

Sources: NOAA Climate.gov; NOAA National Weather Service; NOAA tweet http://bit.ly/2AkUySt. Figures created by Brenda Ekwurzel with NASA imagery, a U.S. Air National Guard photo by Staff Sgt. D.J. Martinez, U.S. Air Force, and U.S. Dept of Homeland Security images.

Virginia’s Gerrymander Is Still Alive—and a Deadly Threat to Environmental Justice

This week, Virginia’s Board of Elections certified results from the November 7th elections, paving the way for three crucial recounts that will determine control of the Virginia House. The Democratic Party would need to take two of those seats for a majority, having already defeated more than a dozen incumbent Republicans and flipped three seats. If this wave is enough to push the Democratic Party over the 50-seat mark, many in the press will declare that the Virginia GOP’s gerrymandered districting plan is no more. But they will be wrong. The value of some Virginians’ votes is still diluted, as it was before the election. In turn, voting inequalities continue to bias the legislature’s responsiveness to environmental and health threats.

Virginia’s gerrymander has proven durable over the decade. Majorities of voters have supported the Democratic Party over the last four election cycles, only to win about one third of legislative seats. This bulwark against majority rule was engineered after the 2010 Census, by an incumbent party with absolute control over redistricting the assembly. Despite earning a substantial (nine-percent) majority of votes over incumbent Republicans this year, Democrats still have less than a 50/50 chance of gaining majority control, and if they do it will be by one seat. The fact that there is any uncertainty over whether a party with a near 10-point majority vote will control the chamber is proof of just how durable the gerrymander is. What happened on November 7th in Virginia was near historic, but it did not breach the gerrymander.

2017 Democratic district vote shares (blue), sorted by 2015 Republican vote shares (red). Democratic vote shares in 2015 uncontested GOP districts are sorted by 2017 Democratic vote share.

Democratic voters wasted far more votes in uncontested safe districts, 26 in fact, compared to 11 overwhelmingly Republican districts where Democrats did not field candidates. This is illustrated in the accompanying graphic with full blue bars (left), indicating uncontested Democratic seats, and bars that are filled red with no blue, indicating uncontested Republican seats.  While Democrats tend to reside in higher-density, urban regions, one of the most powerful gerrymandering tactics is to pack opposition voters into districts so that their surplus votes (over 50%) are wasted. This year, extensive mobilization efforts, coupled with a gubernatorial campaign tainted with racist overtones, provided the bump that Democrats needed in the most competitive districts (around the 50% mark). The middle of the graph depicts the contests where Democrats reached 50% or higher, reaching into the competitive districts held by GOP incumbents (and several open seats).
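
For readers unfamiliar with the bookkeeping, a vote is “wasted” if it is cast for a losing candidate or exceeds the bare majority a winner needs. Here is a minimal sketch, with invented vote totals, showing how packing piles up wasted votes on one side:

    # Invented vote totals; illustrative only.
    def wasted_votes(dem_votes, rep_votes):
        """Return (Democratic, Republican) wasted votes in one district."""
        majority = (dem_votes + rep_votes) // 2 + 1
        if dem_votes > rep_votes:
            return dem_votes - majority, rep_votes
        return dem_votes, rep_votes - majority

    districts = [
        ("packed Democratic district", 85_000, 15_000),
        ("competitive district A", 48_000, 52_000),
        ("competitive district B", 49_000, 51_000),
    ]
    for name, dem, rep in districts:
        dw, rw = wasted_votes(dem, rep)
        print(f"{name}: Democrats waste {dw:,}, Republicans waste {rw:,}")
    # Packing tens of thousands of surplus Democratic votes into one safe seat
    # lets the map-drawing party spread its own voters efficiently across the
    # neighboring competitive seats.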

In districts that were contested in both cycles, Democratic candidates gained an average of 9.6 points (with a 5-point standard deviation). Democrats also contested far more districts than in 2015 (the solid red area with blue bars), picking off several seats against incumbents where they had not previously fielded candidates. Had the wave reached into districts where Republicans typically win by 15-20 points, we would have seen the type of gerrymander backfiring that occurred in Congress in the late 1800s. In 1894, for example, a vote shift of less than 10 points against the Democratic Party cost them more than 50% of their seats, the largest loss in Congressional history.

The Democratic wave was enough to sweep away the GOP’s supermajority, but not enough to reverse the tides. Unless the Democratic Party can repeat their impressive turnout effort in 2019, it will be impossible to hold on to those marginal seats. Of course, under a fair system, a party with a nine-point statewide lead would have a cushion of several seats for close legislative votes. Even if Democrats do gain control, that one seat majority is vulnerable to being picked apart by the same powerful actors that helped engineer this electoral malpractice in the first place, at a great cost to Virginians.

Probably the single most powerful player is Dominion Energy. Consistently one of the largest donors to state election campaigns, Dominion greatly benefitted from a gerrymander engineered in large part by one of its biggest supporters, Appropriations Chair S. Chris Jones. Since 2011, Dominion has been remarkably successful at pushing through a rate freeze law that allowed it to hold on to over $100 million it would have paid back to customers, limiting the growth of clean energy technologies like solar power, and avoiding regulatory oversight of the toxic pollutants that it dumps into Virginia waterways. Remarkably enough, several of the successful Democratic challengers in this election made Dominion’s political influence central to their campaigns, refusing to accept its contributions.

The Dominion rate freeze passed the VA House on a 72-24 vote, so it’s not clear that even a fair districting plan would have stopped it, but it definitely would have changed the terms of negotiation. And because it has still insulated the legislature from an accurate representation of public support, the Virginia gerrymander weakens voters’ ability to protect themselves against current and impending health threats. For example, measured by the amount of toxic chemicals discharged into them, Virginia’s waterways are among the worst in the nation. Hundreds of companies are allowed to legally discharge toxins into waters upstream from recreational places where people regularly swim and fish. Arsenic levels up to 400 times greater than what is safe for residential soil have been measured along the James River.

Dan River coal ash spill. Photo: Appalachian Voices

According to a University of Richmond study, eight coal ash disposal sites along major rivers are significant hazards to nearby communities. Yet Virginia’s legislative oversight and regulatory programs are “bare boned and fragmented”, with utilities failing to provide adequate information about the amount, condition and stability of toxic chemicals and containment.

Nor do Virginians bear this burden equally. 76 percent of Virginia’s coal-fired plants are located in low-income communities or communities of color, including Possum Point, Spruance Genco and the Clover Power Station. Cumulative chemical exposure in such communities increases the risk of cancer, lung, and neurological diseases. The cancer rate in rural Appalachian Virginia is 15% higher than the national average, reflecting both environmental threats and lack of access to health care.  Earlier this year, an effort to expand Medicaid was killed on a party-line vote.

And as the impact of climate change becomes more pronounced, Virginia is on the front lines. A UCS analysis of the impact of tidal flooding showed that cities like Norfolk could see four times the frequency of flooding by 2030, while they already spend $6 million a year on road improvement, drainage and raising buildings. In places like Hampton Roads, sea level has already risen by more than a foot over the last 80 years. Yet members of the Virginia House, entrenched in power, continue to deny even the existence of sea level rise. Unfortunately, even a gerrymander as durable as Virginia’s cannot stop actual rising tides.

For their own safety, and the future of the Commonwealth, Virginians must continue the fight to have their full voting rights restored. Many are already suffering, and many more will pay a heavy price for policies that are unresponsive to public needs. Political equality and the integrity of the electoral process are prerequisites to evidence-based policy making that is in the public interest.

More Electric Vehicle Infrastructure Coming to Massachusetts

The Massachusetts Department of Public Utilities today approved a proposed $45 million investment in electric vehicle charging infrastructure.

The investments in electric vehicle infrastructure come as part of a complicated rate case that involves a number of important issues related to rate design, energy efficiency and solar energy. But at least on the electric vehicle part, the utilities and the DPU got it right.

Why do we need more investments in electric vehicle infrastructure?

Electric vehicles are a critical part of Massachusetts’ climate and transportation future. Under Massachusetts’ signature climate law, the Global Warming Solutions Act, the state is legally required to reduce our emissions of global warming pollution by 80 percent by 2050.

Transportation is the largest source of pollution in Massachusetts, and it’s the one area of our economy where emissions have actually grown since 1990. Achieving our climate limits will require the near-complete transition of our vehicle fleet to electric vehicles or other zero-emission vehicle technologies.

The good news is electric vehicles are here, they are fun to drive and cheap to charge, and when plugged in to the relatively clean New England grid, they get the emissions equivalent of a 100 mpg conventional vehicle. EV drivers in the Boston area can save over $500 per year in reduced fuel costs. Electric vehicle technology has advanced to the point where mainstream automakers and countries like China and France are now openly talking about the end of the internal combustion engine.
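
The fuel-cost advantage is simple arithmetic. Here is a back-of-the-envelope sketch, using assumed prices and mileage rather than figures from the DPU filing, that shows how the savings can top $500 a year:

    # All inputs are assumptions for illustration, not numbers from the DPU order.
    miles_per_year = 12_000
    gas_price = 2.60           # $/gallon, assumed
    gas_mpg = 25               # typical conventional car, assumed
    electricity_price = 0.20   # $/kWh, roughly a Massachusetts retail rate, assumed
    ev_kwh_per_mile = 0.30     # typical EV efficiency, assumed

    gas_cost = miles_per_year / gas_mpg * gas_price
    ev_cost = miles_per_year * ev_kwh_per_mile * electricity_price
    print(f"gasoline: ${gas_cost:,.0f}/yr  electricity: ${ev_cost:,.0f}/yr  "
          f"savings: ${gas_cost - ev_cost:,.0f}/yr")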

But while the future for EVs is bright, electric vehicles are still a very small share of the overall vehicle fleet. Nationally, EVs represent less than half of one percent of new vehicle sales. In 2012, Massachusetts committed to a goal of putting 300,000 electric vehicles on the road by 2025. Five years later, we are still about 288,000 EV sales short of that goal.

What investments are coming?

One of the biggest challenges facing the growth of electric vehicles is limited infrastructure. People are not going to buy an EV if they don’t know where to plug it in. A survey of Northeast residents conducted last year found that limited access to charging infrastructure is one of the biggest obstacles to EV purchases.

We have had over a hundred years – and billions in public subsidies – to build the infrastructure of refineries, pipelines, and gas stations that service the internal combustion engine. New investments in charging infrastructure are critical to making EVs as convenient as filling up at a gas station.

Today’s decision will speed the transition to electric vehicles by making investments in charging infrastructure. These investments include more funding for charging infrastructure for people who live in apartment buildings, more fast-charging infrastructure along highways, increased charging infrastructure in low-income communities, and greater access to workplace charging.

Overall, the proposal anticipates the construction of 72 fast-charging stations and 3,955 “Level-2” home and workplace charging ports over the next 5 years. Of those charging ports, 10 percent will be in low-income communities, where utilities will also provide consumers with a rebate for charging stations. These investments will provide thousands of Massachusetts residents with access to EV charging stations.

The DPU did deny Eversource the right to use ratepayer funds for education and outreach. This is unfortunate, as our survey also found that most Northeast residents are not aware of the many incentives available for EV customers, both here in the Northeast and at the federal level.

What more needs to be done?

One big question is left out of today’s decision: how do we best manage EV charging to maximize the potential benefits to the electric grid?

The key issue is when EV charging takes place. If most people charge their EVs at night, or during times of high production of renewable electricity, then the transition to electric vehicles can make our electric system more efficient and speed the transition to renewables. This would mean significant cost savings.

On the other hand, if EV charging mostly happens during “peak” hours (such as morning and early evening), then adding more EVs onto the grid could strain existing electricity infrastructure and require additional investments in pipelines and power plants. This would both raise emissions and cost ratepayers money.

There’s a simple way to address this issue: provide a financial incentive for EV drivers to charge their vehicles during periods of low demand, a policy known as Time of Use Rates. The DPU decision today punts this issue, accepting the utility position that it will take time and additional data to determine how to best implement TOU rates. While we agree with the DPU that the most important priority is to get the charging infrastructure installed, this is an issue that we and others in the clean transportation community will be watching closely over the next few years.
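
To see why the rate design matters to a driver’s wallet, here is a simple sketch comparing a flat rate with a time-of-use rate. The rates and charging profiles are assumed for illustration and are not the rates at issue in this proceeding:

    # Assumed rates and charging behavior, for illustration only.
    FLAT_RATE = 0.20                               # $/kWh
    TOU_RATES = {"peak": 0.32, "off_peak": 0.11}   # $/kWh
    DAILY_CHARGE_KWH = 10

    def annual_tou_cost(kwh_peak, kwh_off_peak):
        return 365 * (kwh_peak * TOU_RATES["peak"]
                      + kwh_off_peak * TOU_RATES["off_peak"])

    flat = 365 * DAILY_CHARGE_KWH * FLAT_RATE
    evening_plug_in = annual_tou_cost(kwh_peak=8, kwh_off_peak=2)   # plugs in at 6 pm
    overnight_timer = annual_tou_cost(kwh_peak=0, kwh_off_peak=10)  # charges after midnight

    print(f"flat rate:            ${flat:,.0f}/yr")
    print(f"TOU, evening charge:  ${evening_plug_in:,.0f}/yr")
    print(f"TOU, overnight timer: ${overnight_timer:,.0f}/yr")
    # The same 10 kWh/day costs far less off-peak, and the grid is spared from
    # serving that load during its most expensive hours.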

Photo: Steve Fecht/General Motors

Great Lakes’ Great Changes: Temperatures Soar as the Climate Changes

Grand Haven pier extends into Lake Michigan, where average summer surface temperatures have risen markedly over recent decades. Photo: Rachel Kramer/Flickr

Lake Michigan is not yet a hot tub, but the warming of this Great Lake gives you much to sweat about.

In his office at the University of Wisconsin Milwaukee, Paul Roebber, a Distinguished Professor in atmospheric sciences and a former editor of the journal Weather and Forecasting, showed me his most recent climate change lecture slides. The most arresting graphics compare current surface water temperatures of the Great Lakes with those three and a half decades ago. The average summer surface temperatures have risen 8 degrees Fahrenheit since 1980.

Particularly stark was Roebber pointing out a spot where a monitoring buoy floats way out in the middle of 100-mile-wide Lake Michigan, at a latitude between Milwaukee and Chicago. Two decades ago, average mid-July to September surface water temperatures in southern Lake Michigan ranged between 61 and 71 degrees. In 2016, they ranged between 67 and 77 degrees. On three separate days in 2016, temperatures hit 80. Surface water temperature changes near Milwaukee and Chicago were just as remarkable. On August 1, 1992, surface water temperatures were 61 and 65 degrees, respectively. On August 1, 2010, both were in the mid-70s.

“We’re starting to talk bath tub water and that is saying something about the changes,” Roebber said.

The future is almost unthinkable

Roebber’s comments certainly say something to me as a native of Milwaukee. I have vivid memories of childhood winters a half-century ago. We first- and second-graders were so acclimated to consecutive subzero days that when the high was 5 above, we’d walk to school with our coats flying open unzipped.


Today, scientists predict a future climate unthinkable for a region where Green Bay Packers fans romanticize their home-team advantage in a stadium nicknamed the Frozen Tundra.

Roebber said that the modern lake warming has occurred with a rise of only a single degree in the air temperature over the Great Lakes over the last 30 years. But air temperatures are about to soar in scenarios where little or nothing is done to fight climate change. Researchers all around the Great Lakes and analysts at the Union of Concerned Scientists predict that the average summer highs of Milwaukee, currently about 80 degrees, could rise as high as 92 over this century.

The UCS analysis predicted that by 2100, Milwaukee would have nearly two months’ worth of days 90 degrees or higher, including three weeks’ worth of 100-degree scorchers. There would be at least one heat wave a summer with the sustained oppressive temperatures that killed hundreds of people in Chicago in 1995. Overall air quality would deteriorate as well, exacerbating asthma and other respiratory conditions.

In fact, the Upper Midwest region—including Milwaukee, Chicago, and Minneapolis—could collectively experience regular deadly heat waves with temperatures on the same scale that killed an estimated 70,000 people across Europe in 2003. “Under the higher-emissions scenario a heat wave of this magnitude would occur at least every fifth year by mid-century and every other year toward the end of the century,” the UCS analysis concluded.

 Under worst-case scenarios, northern Illinois will have the climate of Dallas and southern Illinois will have the temperatures of Houston by the end of this century. As for Illinois’ neighbor to the north, Roebber notes, “Our climate in Wisconsin will look like Arkansas.”

Change is underway in the world’s largest surface freshwater system

It’s scary to contemplate what Lake Michigan could be compared to a century from now. The five Great Lakes comprise the world’s largest surface freshwater system, in a basin serving 30 million people. While many long-range projections of climate change along America’s eastern seaboard focus on chronic inundation from rising ocean levels, the lakes offer a different set of perplexing dilemmas.

Perhaps most perplexing is the year-to-year unpredictability of conditions. The general scenario of recent decades has been less ice cover in winter, which has allowed more water to evaporate and resulted in unprecedented low lake levels. But there can also be years where that trend is punctuated by ice-choked Great Lakes as the warming Arctic ironically creates a wavier jet stream.

The overall long-term trends, according to the University of Wisconsin Sea Grant Institute, point to all the bodies of water in the state being at risk.

“Longer, hotter, drier summers and increasing evaporation will result in warmer and shallower rivers, shrinking wetlands, and dried-up streams, flowages and wild rice beds,” the institute says. “Algal blooms will create anoxic conditions for aquatic life in ponds and many lakes.”

“These conditions will reduce the amount of suitable habitat available for trout and other cold-water fishes, amphibians and waterfowl. A two-degree rise in temperature could wipe out half of Wisconsin’s 2,700 trout streams. Hot dry conditions, coupled with more frequent thunderstorms and lightning, will increase the chance of forest fires. Red pine, aspen and spruce trees will disappear from our northern forests.”

A joint report by the University of Wisconsin and the state’s Department of Natural Resources predicts more climate-change losers than winners among fauna. As populations of European starlings, Canada goose, and gray squirrels grow, those of the purple martin, black tern, American marten, common loons, and various species of salamanders, frogs, and prairie birds may decline or disappear.

“This will result in a net loss to the state’s biodiversity and a simplification of our ecological communities,” the report said.

As for commercial activities, Roebber said there may be more ice-free days to allow more winter shipping, but fluctuating lake levels may play havoc with lakeshore-dependent businesses during the rest of the year, from expensive marina dredging operations to beach erosion in resort communities. Water quality may be degraded if low lake levels expose harmful chemicals. An additional wild card is the prospect of Wisconsin facing more weather extremes with heavy rains and floods dancing with more frequent short-term droughts.

“It’s not clear how much lower the lake will go, but the levels will become more variable,” Roebber said.

Sitting on our hands

This month, 13 federal agencies released the government’s latest major assessment that human activities are “the dominant cause” of the warmest period “in the history of modern civilization.” That report predicts a 9.5-degree rise in average temperatures in the Midwest under continued high-emission scenarios, the greatest rise of any region in the contiguous United States.

But it is not clear how much researchers will be able to refine their predictions. The Trump administration, despite approving the release of the congressionally mandated report, is in the midst of an unprecedented attack on climate change research. Climate change experts in the Interior Department have been reassigned. The Environmental Protection Agency has banned some scientists from speaking at climate change conferences. The Trump administration has proposed hundreds of millions of dollars of cuts to NASA and NOAA planetary and weather research that relates to climate change.

The assault is also underway at the state level. Last year, Wisconsin governor Scott Walker ordered state agencies not to comply with President Obama’s Clean Power Plan, and his DNR removed language from its website saying human activities are the root cause of climate change. Despite its prior partnering with university researchers, the DNR currently says, “The earth is going through a change. The reasons for this change at this particular time in the earth’s long history are being debated and researched by academic entities outside the Wisconsin Department of Natural Resources.”

In this environment, exacerbated by years of prior Congressional budget cuts that constrict the chances of winning federal research grants, Roebber fears for the further erosion of the nation’s ability to protect lives and livelihoods with science.

Destructive weather events are virtually certain to increase. A report this fall by the Universal Ecological Fund calculates that weather events that currently cost the US $240 billion a year will increase to $360 billion annually over the next decade, the latter cost being equal to 55 percent of the current growth of the US economy.

“Facts used to be something we used to solve difficult things and innovate,” Roebber said. “Why the political process is now so destructive to such an important function of society and why the (political) climate has almost become antagonistic toward education is troubling. We’re sitting on our hands instead of accelerating the things we need to do.”

Hyping US Missile Defense Capabilities Could Have Grave Consequences

In response to North Korea’s latest ballistic missile test, which flew higher and farther than any of its previous launches, President Trump told Americans not to worry. “We will take care of it,” he said. “It is a situation that we will handle.”

The big question is how. Unfortunately, Trump’s assertion may rest on his unwarranted confidence in the US missile defense system. During a recent interview with Fox News host Sean Hannity about the threat posed by a potential North Korean nuclear strike, he declared that the United States has “missiles that can knock out a missile in the air 97 percent of the time.”

The facts, however, tell a different story.

The reality is that the US Ground-based Midcourse Defense (GMD) system has succeeded in destroying a mock enemy missile in only 56 percent of its tests since 1999. And, as I’ll explain, none of the tests approached the complexity of a real-world nuclear launch.

What’s more, ever since the George W. Bush administration, the GMD program has been exempt from routine Pentagon oversight and accountability procedures. The result? Fifteen years later, all available evidence indicates that it is still not ready for prime time, and may never be.

Of course, Trump is prone to exaggeration. In fact, he has averaged more than five lies per day since taking office. But it is critical to understand the potential ramifications of this particular Trumparian boast: It could lull Americans into a false sense of security and, even more alarming, embolden Trump to start a war. As veteran military reporter Fred Kaplan pointed out, if the president truly believes the US missile defense system is infallible, “he might think that he could attack North Korea with impunity. After all, if the North Koreans retaliated by firing their nuclear missiles back at us or our allies, we could shoot them down.”

Such wishful thinking could clearly lead to a disastrous miscalculation. And what’s worse, Trump just may believe his preposterous claim because he’s not the only one making it.

If You Repeat a Lie Often Enough…

Missile defense advocates have a long history of hyperbole. A 2016 report by the Union of Concerned Scientists included an appendix with a selected list of some three dozen statements administration and military officials have made extolling the GMD system’s virtues. They are incredibly consistent, and given the facts, consistently incredible.

In March 2003 — before the GMD system was even deployed — then-Undersecretary of Defense Edward Aldridge assured the Senate Armed Services Committee that its “effectiveness is in the 90 percent success range” when asked if it would protect Americans from the nascent North Korean threat.

Seven years later, in December 2010, then-Missile Defense Agency Director Lt. Gen. Patrick O’Reilly told the House Armed Services Committee’s strategic forces subcommittee that “the probability will be well over in the high 90s today of the GMD system being able to intercept” an Iranian intercontinental ballistic missile (ICBM) targeting New York City.

Fast forward to April 2016, when Brian McKeon, principal deputy undersecretary of defense for policy, testified before the Senate Armed Services Committee’s strategic forces subcommittee. “The US homeland,” he maintained, “is currently protected against potential ICBM attacks from states like North Korea and Iran if it was to develop an ICBM in the future.”

Wrong, wrong, and yet again, wrong. As Washington Post “Fact Checker” columnist Glenn Kessler wrote in mid-October, the claim that the GMD system has a success rate in the “high-90s” is based on “overenthusiastic” math. The system has succeeded only 56 percent of the time over the last two decades, but the calculation is predicated on a hypothetical, never-been-tested launch of four GMD interceptors with a 60-percent success rate producing a 97-percent chance of destroying one incoming ICBM. If one interceptor missed because of a design flaw, however, the other three would likely fail as well. “The odds of success under the most ideal conditions are no better than 50-50,” Kessler concluded, “and likely worse, as documented in detailed government assessments.”
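
The arithmetic behind the claim is easy to reproduce, and so is the reason it falls apart. A short sketch follows: the 60 percent per-interceptor figure is the one cited in the fact check above, while the 50/50 split between shared and one-off failure modes is purely an assumption for illustration:

    # Reproducing the "overenthusiastic" salvo math, then relaxing the
    # independence assumption it relies on.
    p_single = 0.60   # per-interceptor kill probability cited in the fact check
    salvo = 4         # interceptors fired at one incoming ICBM

    # Independence assumption: the salvo fails only if all four miss separately.
    p_independent = 1 - (1 - p_single) ** salvo
    print(f"assuming independent misses: {p_independent:.1%}")    # ~97.4%

    # Assume half of all misses stem from a shared cause (a design flaw, or a
    # decoy the kill vehicle cannot discriminate) that defeats every interceptor.
    p_shared = 0.5 * (1 - p_single)                        # shared-mode failure
    p_fluke = (1 - p_single - p_shared) / (1 - p_shared)   # one-off miss, given no shared failure
    p_correlated = (1 - p_shared) * (1 - p_fluke ** salvo)
    print(f"allowing shared failure modes: {p_correlated:.1%}")   # ~79.7%
    # Each interceptor still hits 60 percent of the time on average, but the
    # salvo no longer comes anywhere near 97 percent.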

No surprise, defense contractors also wildly overstate the GMD system’s capabilities.

This September on CNBC’s Squawk Box, Leanne Caret, president and CEO of Boeing’s Defense, Space & Security division, stated unequivocally that the GMD system would “keep us safe” from a North Korean attack. The system is “doing exactly what is needed,” Caret said, but added that it will ultimately require even more rocket interceptors from her company, the prime GMD system contractor since 1996. There are currently 40 interceptors in underground silos at Fort Greely in Alaska and four at Vandenberg Air Force Base in Southern California, all made by Boeing.

Raytheon CEO Thomas Kennedy, whose company produces the “kill vehicle” that sits atop Boeing’s interceptor, was equally sanguine about the GMD system when he appeared on Squawk Box the following month. “I say relative to the North Korean threat, you shouldn’t be worried,” Kennedy said. “But you should ensure that you’ve talked to your congressman or congresswoman to make sure they support the defense budget to the point where it can continue to defend the United States and its allies.”

Given such glowing reviews, it’s no wonder President Trump asked Congress for $4 billion for the GMD system and other programs, such as the ship-based Aegis system, designed to intercept short- to intermediate-range missiles. In a November 6 letter to lawmakers, Trump wrote: “This request supports additional efforts to detect, defeat, and defend against any North Korean use of ballistic missiles against the United States, its deployed forces, allies, or partners.”

The House of Representatives apparently is even more enthused about the GMD system’s much-touted capabilities. It passed a $700-billion defense authorization bill on November 14 that includes $12.3 billion for the Missile Defense Agency — more than triple what Trump requested. Some of that money would cover the cost of as many as 28 additional GMD interceptors, but lawmakers asked Defense Secretary Jim Mattis to develop a plan to add 60, which would increase the overall number of interceptors to 104.

Unrealistic, Carefully Scripted Tests

If members of Congress bothered to take a closer look at the GMD system’s track record, they would hopefully realize that committing billions more is throwing good money after bad. Even the most recent test, which the Missile Defense Agency declared a success, would not inspire confidence.

That test, which took place on May 30, resulted in a GMD interceptor knocking a mock enemy warhead out of the sky. At a press conference afterward, then-Missile Defense Agency Director Vice Adm. James Syring claimed it was “exactly the scenario we would expect to occur during an operational engagement.”

Not exactly. Yes, the Pentagon did upgrade its assessment of the GMD system in light of the May exercise, but — like previous tests — it was not held under real-world conditions.

In its 2016 annual report, the Pentagon’s Operational Test and Evaluation office cautioned that the GMD system has only a “limited capability to defend the U.S. homeland from small numbers of simple intermediate range or intercontinental ballistic missile threats launched from North Korea or Iran.” The “reliability and availability of the operational [interceptors],” it added, “are low.” After the May test, however, the office issued a memo stating that “GMD has demonstrated capability to defend the US homeland from a small number of intermediate-range or intercontinental missile threats with simple countermeasures.”

Despite this rosier appraisal, Laura Grego, a Union of Concerned Scientists (UCS) physicist who has written extensively about the GMD system, is not convinced that the latest test represents a significant improvement. After analyzing an unclassified Missile Defense Agency video of the May 30 exercise, she concluded that it was clearly “scripted to succeed.”

As in previous tests, system operators knew approximately when and where the mock enemy missile would be launched, its expected trajectory, and what it would look like to sensors, she said. And, like the previous tests, the one in May pitted one GMD interceptor against a single missile that was slower than an ICBM that could reach the continental United States, without realistic decoys or other countermeasures that could foil US defenses.

The key takeaway? The GMD system has destroyed its target in only four of 10 tests since it was fielded in 2004, even though all of the tests were held under improbably ideal conditions. If the tests had been more realistic, the deployed GMD system likely would be zero for 10. Moreover, the system’s record has not improved over time. Indeed, it flunked three of the four tests preceding the one in May, and not because the Missile Defense Agency made the tests progressively more difficult.

According to the 2016 UCS report Grego co-authored, a primary reason for the GMD system’s reliability problems is not funding, but lack of oversight. In its rush to get the system up and running, the George W. Bush administration exempted the program from standard military procurement rules and testing protocols. That ill-advised decision has not only run up the system’s price tag, which to date amounts to more than $40 billion, but has also produced a system that is incapable of defending the United States from a limited nuclear attack.

“Regardless of what President Trump and other missile defense boosters want us to believe, the data show that we can’t count on the current system to protect us,” said Grego. “We need to reduce the risk of a crisis escalating out of control. Only diplomacy has a realistic chance of doing that.”

Photo: Department of Defense

You Might Be Wasting Food, Even If You’re Not Throwing It Away


When I was a child, I was often told not to waste food. I heard phrases like “Clean your plate or no dessert,” “Just cut out that little spot. It’s a perfectly good banana,” and “Don’t put that in the back of the fridge. It’ll spoil and then we’ll have to throw it out.”

Now, half a century later, food waste has grown from family stories into a worldwide policy issue. A common estimate is that 40% of food is wasted. Scientific papers analyze consumers’ feelings about the sensory and social qualities of meals, and reducing waste is becoming just as much a concern as local, organic, and community-supported. This issue is critical. Yet an important part of the food waste problem remains unseen.

This additional waste involves not the food that is thrown out because no one eats it, but the food we do eat.

Recent studies by an international group of researchers led by Peter Alexander of the University of Edinburgh have shown just how important this additional kind of waste is. Alexander and his colleagues have published a series of papers that give detailed, quantitative analyses of the global flows of food, from field to fork and on into the garbage can. The results are striking. Only 25% of harvested food, by weight, is consumed by people. (Measuring food by its energy values in calories or by the amount of protein it contains, rather than by its dry weight, does increase the numbers but only a bit—to 32% and 29% respectively.)

But beyond these overall figures, Alexander and colleagues point to the importance of two kinds of waste in the ways in which we do eat our food, but in an extremely inefficient way. One is termed “over-consumption,” defined as food consumption in excess of nutritional requirements. (For the purposes of this discussion, I am referring to food consumption in excess of caloric requirements. However, it is critical to note that calories consumed only tells a small part of the story. A complete analysis would include the quality of the foods consumed and the many systemic reasons why we “over-consume”—including the structure of the food industry, the affordability of and access to processed foods relative to healthier foods, etc. But that is the subject for several books, not one blog post.)

Even using a generous definition of how much food humans require—e.g. 2342 kcals/person/day, compared to the 2100 kcal used in other studies—Alexander et al. find that over-consumption is at least comparable in size to the amount of food that consumers throw out (“consumer waste”). This is shown in the graphic below, in which the uppermost part of each bar (dark purple) represents over-consumption and the second-to-the-top section (light purple) shows consumer waste.

Losses of harvested crops at different stages of the global food system. The four columns represent different ways to measure the amount of food: from left to right, by dry weight, calories, protein, and wet weight. Source: Figure 4 of Alexander et al., 2017, Agricultural Systems; DOI: 10.1016/j.agsy.2017.01.014.

So, it turns out that for many people, reducing consumption could improve health while also saving food, and with it the many resources that go into growing and distributing it.

But neither over-consumption nor consumer waste is the largest way we waste the resources that can be used to produce food. That turns out to be livestock production—the dark red sections in the graphic above. Livestock are an extremely inefficient way of transforming crops (which they use as feed) into food for humans, with loss rates ranging from 82% (in terms of protein) up to 94% (by dry weight) once all of the feed they consume during their lifespans is considered. It’s not food that goes into our garbage or landfills, but it represents an enormous loss to the potential global supply of food for people just the same.

The reasons have to do with ecology: when we eat one level higher on the food web we’re losing about 90% of the edible resources from the level below.
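
To make those percentages concrete, here is a minimal sketch (in Python, purely illustrative) of what the quoted loss rates imply. The 82 percent (protein) and 94 percent (dry weight) figures come from the analysis described above; the 1,000 kg starting quantity is an arbitrary example amount.

```python
# Illustrative feed-to-food conversion using the loss rates quoted above.
feed = 1000.0  # kg of harvested crops fed to livestock (arbitrary example amount)

for measure, loss in [("protein", 0.82), ("dry weight", 0.94)]:
    edible = feed * (1 - loss)
    print(f"{feed:.0f} kg of feed ({measure} basis) -> roughly {edible:.0f} kg of food for people")
# Prints roughly 180 kg on a protein basis and 60 kg on a dry-weight basis.
```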

Achieving the ultimate goals of reducing food waste—for example, reducing environmental harm and ensuring more people have access to foods that meet their nutritional requirements—will of course require additional and critical steps. For example, additional food doesn’t help if it isn’t nutritious or can’t be accessed by the people who need it. Likewise, spared land doesn’t help if that land isn’t managed in a way that contributes to a healthier environment. However, thinking more about all types of food waste can help us find better ways to protect our natural resources while producing and distributing healthy food for all.

The results of these new analyses should expand what we think of when we hear the words “food waste.” Yes, it includes the food we buy but don’t eat—the vegetables we leave on our plates and the bananas we throw into the compost bin—and it’s very important to develop habits and policies to reduce this waste. But we also need to confront the wastefulness in what we do eat, by asking: how much and what kind of food should we be buying in the first place?

Climate Summit Makes Progress Despite Trump, But Much More Urgency Is Needed

The Fijian COP23 presidency placed this sea-faring canoe outside of the main plenary hall in Bonn, symbolizing that when it comes to climate change, we are all in the same boat. Photo: By the author.

As the 23rd meeting of the Conference of the Parties (COP23) to the United Nations Framework Convention on Climate Change—or the annual UN climate talks—opened in Bonn, Germany on November 6, the urgency for much greater action on climate change could not have been more clear.  Just two days earlier, Typhoon Damrey barreled into Vietnam, resulting in 69 deaths and nearly $1 billion in damages.  The storm was the worst to hit the southern coastal region of Vietnam in decades, and came on the heels of Hurricanes Harvey, Irma, and Maria, which devastated communities in Texas, Florida, Puerto Rico, and several Caribbean islands; as well as raging forest fires in western North America and Brazil; heatwaves in Europe; and floods in Bangladesh, India, and Nepal.

The week before COP23 started, the United Nations Environment Program released its annual Emissions Gap Report, which found that the global warming emission reduction commitments put forward by countries under the Paris Agreement “cover only approximately one-third of the emissions reductions needed to be on a least cost pathway for the goal of staying well below 2°C.”

The report said that current commitments make a temperature increase of at least 3°C above pre-industrial levels by 2100 very likely, and if this emissions gap is not closed by 2030, it is extremely unlikely that the goal of holding global warming to well below 2°C can still be reached.  The report’s warning was reinforced by analysis released by the Global Carbon Project during the talks, projecting that after three years in which global CO2 emissions have remained flat, they are likely to increase by 2% in 2017.

The UNEP report contains good news as well, outlining practical ways to slash emissions in the agriculture, buildings, energy, forestry, industry and transport sectors, along with actions to control hydrofluorocarbons and other high-potency greenhouse gases.  The report finds that nominal investments in these sectors could help to avoid up to 36 GtCO2e per year by 2030.  Almost two-thirds of this potential is from investment in solar and wind energy, efficient appliances, efficient passenger cars, afforestation and stopping deforestation — actions which have modest or net-negative costs; these savings alone would put the world well on track to hitting the 2°C target.

In the context of these risks and opportunities, the progress made at COP23 was far too modest compared to what is needed.  But negotiators did succeed in laying the groundwork for more substantial achievements down the road, and the fact that countries pushed ahead despite President Trump’s announced intention to withdraw the United States from the Paris Agreement is in itself a welcome accomplishment.

Getting the rules right

A major focus of the negotiations in Bonn was on hammering out the detailed rules (or “implementation guidelines”) for the Paris Agreement, on a range of issues including transparency and reporting, accounting standards for both emissions and finance, the new market mechanisms created in the agreement that would allow reductions achieved in one country to be credited against another country’s emissions reduction commitments, how to raise the ambition of national actions over time, and actions needed to cope with the mounting impacts of climate change.

Countries had set a goal in Paris of resolving these and other implementation issues at the 2018 climate summit in Poland next December, so there was no expectation of final agreements on any of these issues at COP23.  Rather, the objective at COP23 was to narrow the differences amongst countries and to clearly frame the options on the key issues involved, so as to facilitate their resolution next year.

Progress was made across the range of rulebook topics, but it was uneven.  A bright spot was on the sensitive issue of transparency and reporting, where differences were narrowed and a fairly clear set of options was laid out.

By contrast, the negotiations on “features” of the “nationally-determined contributions” that countries are required to put forward under the Paris Agreement, as well as accounting standards for these NDCs and the up-front information requirements to ensure their “clarity, transparency, and understanding,” were much more polarized, and the end result was an unwieldy 179-page list of issues and options.

The most charged discussions were around finance, specifically the requirement in Article 9.5 of the Paris Agreement that every two years developed countries must provide “indicative quantitative and qualitative information” on their future support for developing countries, including, “as available, projected levels of public financial resources to be provided.”  The African Group of countries pushed for more clarity and detail on this projected financial support by developed countries for developing country actions, a move that was strongly opposed by the U.S. and other developed countries.

Developing countries want greater certainty of the financial resources available to them going forward, so they can plan projects accordingly; but developed countries are loath to make multi-year commitments that they can be held accountable for. This issue will be revisited at the intersessional meeting in Bonn next spring, and then brought to ministers at COP24 in Poland in December, 2018.

We left Bonn not with the draft negotiating text on the Paris rules that some had hoped for, but instead with a set of “informal notes” produced by the co-facilitators of each of the working groups, which capture and organize the proposals put forward by countries.  Much work lies ahead to meet the goal of adopting the full Paris rulebook at COP24, and while negotiators can work out some of the technical details in advance, it will clearly be up to ministers to resolve the political differences on the major crunch issues.

Catalyzing higher ambition

The decision adopted in Paris explicitly acknowledged the substantial shortfall in collective ambition that could keep the world from meeting the aggressive temperature limitation goals embodied in the Paris Agreement, and called for a “facilitative dialogue” at COP24 next year to address ways to close this gap.  Working with last year’s Moroccan COP22 presidency, Fiji put forward its vision of how this process should be conducted, renaming it the “Talanoa dialogue.” As Fiji explains, “Talanoa is a traditional approach used in Fiji and the Pacific to engage in an inclusive, participatory and transparent dialogue; the purpose of Talanoa is to share stories, build empathy and trust.”

This will be a year-long process consisting of a preparatory phase starting in early 2018 and a political phase involving ministers at next year’s climate summit in Poland. The dialogue will be structured around three key questions: “Where are we? Where do we want to go? and How do we get there?”  One major input will be the Special Report of the Intergovernmental Panel on Climate Change examining the impacts of global warming of 1.5°C above pre-industrial levels and related global greenhouse gas emission pathways, scheduled for completion next October.  Additional analytical and policy-relevant inputs will be welcomed in the preparatory phase, not just from countries but from NGOs, businesses, research institutions, and other stakeholders as well.

To succeed, this process must do more than reaffirm the ambition gap; it must spur concrete steps to close it.  A central focus will be on the need for countries to signal, by 2020, their intention to raise the ambition of their existing commitments between now and 2030.  But the dialogue should also examine how states, cities, businesses and other “non-state actors” can contribute to closing the ambition gap, and encourage a range of sectoral initiatives on renewable energy, energy efficiency, forestry and agriculture sector solutions, carbon pricing and other areas.

The Talanoa dialogue process will be jointly led by Fiji and Poland, as the current and incoming COP presidencies. Given Poland’s heavy dependence on coal-generated electricity, there are legitimate concerns about that government’s interest in generating the specific outcomes from the dialogue needed to enhance ambition.  It is clearly up to all countries to ensure the dialogue stays on track and produces meaningful results.

Dealing with climate impacts

Even if we are able to close the emissions gap and hold temperature increases well below 2 degrees Celsius, as leaders committed to in Paris, the world is going to suffer increasing climate impacts over the next several decades as a result of the emissions we have already put into the atmosphere.  Developing countries, together with environmental and development NGOs, pushed in Bonn for faster progress on helping vulnerable countries and affected communities cope with these impacts, both through enhanced measures to adapt to current and future impacts and through strategies to deal with the now-unavoidable “loss and damage” they are facing, from “slow-onset” impacts such as sea level rise and desertification as well as from typhoons, hurricanes, floods, and other extreme events.  At COP19 in Poland in 2013, countries established the Warsaw International Mechanism for Loss and Damage (or “WIM”), and explicit provisions on loss and damage were included in the Paris Agreement.

Sadly, not enough was accomplished in Bonn on this front.  Five European countries did pledge a total of $185 million of renewed support for the Adaptation Fund and the Least Developed Countries Fund.  But developed countries blocked a push by vulnerable countries to make the issue of mobilizing the much greater level of financial resources to deal with loss and damage a standing agenda item at future negotiating sessions.  All they would agree to is to hold an “expert dialogue” on this issue at next spring’s subsidiary body meetings in Bonn, which in turn will inform technical analysis on financial resource mobilization for loss and damage activities that is already being undertaken by the WIM.

Expect this issue to continue to be a major topic of debate in the negotiations going forward, including at COP25 in late 2019, where countries have agreed to conduct a full-blown review of the WIM.

The elephant in the room

When President Trump announced in June of this year his intention to withdraw the United States from the Paris Agreement, there was widespread condemnation from other countries, as well as from business and civil society both in the United States and around the world.  Not one country indicated that they intended to follow President Trump out the door; in fact, during the first week of the Bonn climate summit, the only other Paris Agreement holdout, Syria, announced that it intended to join all the other countries of the world in the agreement, rather than be lumped in with the United States as a climate scofflaw.

The U.S. negotiating team in Bonn kept a low profile, hewing largely to past positions on issues like transparency and reporting for developing countries and robust accounting standards.  They were quite tough in the negotiations on climate finance and loss and damage, though, perhaps out of concern that any sign of flexibility risked an unhelpful response from the Tweeter-in-Chief.

White House staff organized a side event on the role of coal, nuclear, and gas technologies as climate solutions, which generated a well-organized and creative protest led by U.S. youth groups.  It was also overshadowed by the launch of the Powering Past Coal Alliance, a coalition of 20 countries led by Canada and the United Kingdom that is committed to phasing out use of coal no later than 2030.

California Governor Jerry Brown, former New York Mayor Michael Bloomberg, and other officials at the Nov. 11th launch of America’s Pledge at the U.S. Climate Action Center in Bonn. Photo: By the author.

But the real energy at the Bonn climate summit came from the We Are Still In initiative of university presidents, mayors, governors, business leaders, and NGOs who showcased their steps to reduce climate pollution and pledged their intention to meet America’s emissions reduction commitments under Paris, regardless of President Trump’s efforts to dismantle federal leadership on climate policy.

Through an intensive schedule of side events, press briefings, and bilateral meetings with ministers and business leaders from other countries, this U.S. subnational delegation went a long way to assuring the rest of the world that President Trump represents a short-term deviation in U.S. policy, not a long-term trend.  Of course, until there is a clear demonstration of bipartisan political support at the federal level for climate action, other countries will understandably continue to harbor concerns about the reliability of the United States as a partner in this endeavor.

What lies ahead

Negotiators will reconvene in Bonn on April 30 for a two-week session of the UNFCCC’s subsidiary bodies, working to make progress across the range of issues to be decided at COP24 in Katowice, Poland next December, and Fiji and Poland will convene several informal ministerial discussions over the course of 2018 focusing on the key political decisions that must be reached at COP24.

There are a number of other events where ministers and even heads of state will be discussing ways to enhance climate action over the next year, including:

  • The One Planet Summit being convened by French President Emmanuel Macron in Paris, with a focus on mobilizing increased public and private sector climate finance.
  • Two more sessions of the Ministerial Meeting on Climate Action (MOCA), a dialogue launched by Canada, China, and the European Union in Montreal in September; the next meeting will be hosted by the EU next spring, followed by a meeting hosted by China next fall.
  • The ninth meeting of the Petersberg Climate Dialogue, a ministerial-level discussion to be co-hosted in mid-2018 by Germany and Poland, as the incoming presidency of the Conference of the Parties.
  • The G7 leaders’ summit, to be hosted by Canada on June 8th and 9th.
  • The Global Climate Action Summit being hosted in San Francisco next September by Gov. Jerry Brown, which will bring together national, state and local political leaders, businesses, scientists, non-profits and others to “showcase the surge of climate action around the world – and make the case that even more needs to be done.”
  • The G20 leaders’ summit, hosted by Argentina and starting just two days before COP 24, on November 30th.  Leaders should build on the Climate and Energy Action Plan adopted at the G20 summit last July under the German presidency, which was agreed to by all G20 countries except for the United States.

All of these events can – and must – contribute to accelerated progress at COP24 in Katowice and beyond in implementing and strengthening the Paris Agreement.  As the UNEP report and other analyses clearly show, we have the solutions we need to the crisis we face. But what we need now is a much greater level of political will.

Which States are Most Energy-Efficient? Here are the Latest Results

Adding insulation to your attic is an effective step to improve the efficiency of your home, save money, and cut carbon emissions.

Autumn makes me think of leaves colored orange and amber and red, of the smell of cinnamon and nutmeg wafting from a range of desserts… and of states vying for top honors in the annual state ranking of energy efficiency policies and progress.

The leaves are mostly done, and the desserts are in my belly. But the latest ranking from the American Council for an Energy-Efficient Economy is out and available, and ready for sampling. It’s always a beautiful sight and a tasty treat.

Energy efficiency – Why and how?

Energy efficiency is already one of the main tools we use for meeting new energy demand. Why it makes sense as a tool is clear, as the new report says:

[Energy efficiency] creates jobs, not only directly for manufacturers and service providers, but also indirectly in other sectors by saving energy and freeing up funds to support the local economy. Efficiency also reduces pollution, strengthens community and grid resilience, promotes equity, and improves health.

The annual scorecard “ranks states on their efficiency policies and programs, not only assessing performance but also documenting best practices and recognizing leadership.” ACEEE does that by looking at a range of metrics that are shaped by each state’s efforts:

  • Utility and public benefits programs and policies
  • Transportation policies
  • Building energy codes and compliance
  • Combined heat and power (CHP) policies
  • State government–led initiatives around energy efficiency
  • Appliance and equipment standards

 

ACEEE state energy efficiency scorecard rankings, 2017

Who’s on top?

The highlighted states include some familiar faces plus a few new ones. The top states were the same in 2017 as in 2016, reflecting the strong focus on efficiency in certain parts of the country:

  • Massachusetts took the top spot for the seventh straight year, and stood alone at the top (after tying with California for 2016 honors). Northeast states also took third (Rhode Island), fourth (Vermont), sixth (Connecticut), and seventh (New York).
  • The West Coast states garnered high marks, too, taking second (California), fifth (Oregon), and seventh (Washington).
  • The Midwest also made a good showing, at ninth (Minnesota) and eleventh (Illinois and Michigan, tied).

ACEEE makes a point of calling out some “most improved” states, too, and this year that brought in states from other parts of the country:

  • Idaho was the most improved, jumping up seven spots to land in the middle of the pack—its best performance, says ACEEE, since 2012—due to investments in “demand-side management”, increased adoption of electric vehicles, and building energy code improvements.
  • Florida gained three spots in part due to its work on energy efficiency for the state’s farmers.
  • Virginia moved up four notches, helped by its work to strengthen building energy codes in the state.

The savings add up. (Source: ACEEE state energy efficiency scorecard)

How do states take it to the next level?

No state got a perfect score, ACEEE points out, so every state has room for improvement. Fortunately, they offer a few tips on how to make that happen:

  • Establish and adequately fund an energy efficiency resource standard (EERS) or similar energy savings target.
  • Adopt policies to encourage and strengthen utility programs designed for low-income customers, and work with utilities and regulators to recognize the nonenergy benefits of such programs.
  • Adopt updated, more stringent building energy codes, improve code compliance, and involve efficiency program administrators in code support.
  • Adopt California tailpipe emission standards and set quantitative targets for reducing VMT [vehicle miles travelled].
  • Treat cost-effective and efficient CHP [combined heat and power] as an energy efficiency resource equivalent to other forms of energy efficiency.
  • Expand state-led efforts—and make them visible.
  • Explore and promote innovative financing mechanisms to leverage private capital and lower the up-front costs of energy efficiency measures.

But we’re making progress, and leading states are demonstrating what a powerful resource energy efficiency is.

And with a federal administration that seems determined to move backward on clean air and water by propping up coal, and backward on climate action, state action on clean energy is more important now than ever.

So congrats to the efficiency leaders among our states, and thanks.

 

Lessons from the Land and Water Songs to Heal

Photo: Samantha Chisholm Hatfield

Recently, I was fortunate to be selected as an HJ Andrews Visiting Scholar, and was able to complete an HJ Andrews Scholar Writing residency, where I had the incredible opportunity to view the forest area through a Traditional Ecological Knowledge lens.

I had scheduled the residency specifically so that I could take my child along, teaching Traditional Knowledge as it has been taught to me, passing along generations of information and skills in areas that had been historically traversed by ancestors. There were times when I doubted my decision, as complaints of spotty wifi access began. That quickly subsided as complaints turned to questions, and I knew I had made the correct decision. Spiritually my child felt it; there was connection again, as I’d hoped.

Photo: Samantha Chisholm Hatfield

My child and I sat at the river’s edge, watching the water roll by. We discussed the water, and the tall trees and the bushes that walked alongside the water’s path. We discussed the tiny bugs skimming around on the water, and the spiders, and the rocks. We joked about how Sasquatch must love this area because of the incredible beauty. Time stopped, and the symphony of wind and water rose around us as we watched branches and flowers dance and sway.

At one point my child broke out in traditional song. To most, this would not seem unusual, but to those who live traditionally, this is spectacular. It was song that came to him, gifted through, and from the waters, about the water and the beauty he found. The water ran clean, and the birds sang freely.

This is who we ARE. As Native People, we are living WITH the land, rather than simply ON it. We engage with the tiniest of tiny, as well as with the largest of large. This is a concept that many cannot fathom. Reciprocity with the land is at the core of where we come from, and has been a basis for our survival as well as our identity. It has been essential that we as Native people continue to nurture the land as it nurtures us. Reciprocity is in traditional information, and is an everyday integrated expectation, that fosters well-being of ourselves and our identification as Natives.

Reciprocity with the land

Photo: Samantha Chisholm Hatfield

Our identity is connected with every tiny droplet. Every tiny speck of dust. Every rock, every tree, every winged, every insect, and four-legged. We are one among many, we do not have dominion over, but rather have congruence with.

It is not vital that we share the same communication language, it is not vital that we appear in the same form. The tiny fly deserves as much respect as the bison, or the person standing next to me. Those of us who work to protect have been given orders to do so, often by our Elders, who are at the forefront of holding our wisdom. Oral histories and Traditional Knowledges hold information and instructions that direct and guide us. There is a belief that we are entrusted to care for the earth, and for the seventh generation to come, so that life, and the earth, will remain just as it is currently, if not better for our future generations.

We are borrowing the resources that we live with, caring for the investment of life that we are blessed with. We are taught to have forward-thinking vision in our actions. We work for all, even for those who are antagonists. We do so, because we have been gifted visions by our ancestors of what Seven Generations means, and what it takes to get there. Vision, of how to care for a world that is quickly losing its grip on the reality of situations that are dominating, destructing, and devaluing knowledge. Vision, of what needs repaired, who needs helped, and what path needs to be walked.

Respecting how much Traditional Knowledges can teach us

Many question the validity of TEK, and are not able to ‘connect the dots’. It is difficult to view a system from an alternative perspective if you have not grown up in it, nor been enculturated to it. It can seem foreign and be discounted as baseless. Western mainstream thinking promotes the “dominion over” ideology: controlling and manipulating that which would challenge or hinder human desires. Reciprocity and gentleness, by contrast, are values taught and held in high esteem in many Native communities.

There is no separation between the environment and ourselves; it is a knowing that what befalls the land, befalls The People.

There are no escape diversions, no malls to buy excuses from, no spas to run to for the weekend.

Our escapes come in the form of clear streams, and old growth towering majestically, in the form of waves crashing on shores and dirt under our feet. We are guided alongside teachings of congregations of the finned, and the winged, the hooved, and the crawlers. Our songs, our prayers, our way of life depends on these aspects, but only when they are connected, and healthy.

Half a book, half a lesson, half a river, half a tree, half a story cannot teach. It cannot sustain culture, it cannot sustain life. Anyone’s.

The integration of knowledge is often viewed as an interloper, incongruent and irrelevant to the daily lives of westernized systems of thought. This could not be further from the truth.

 

Dr. Samantha Chisholm Hatfield is an enrolled member of the Confederated Tribes of Siletz Indians, from the Tututni Band, and is also Cherokee. She earned a doctorate in Environmental Sciences from Oregon State University, focusing on the Traditional Ecological Knowledge (TEK) of Siletz Tribal Members. Dr. Chisholm Hatfield’s specializations include Indigenous TEK, tribal adaptations to climate change, and Native culture issues. She has worked with the Oregon Climate Change Research Institute and completed a post-doctoral research position with the Northwest Climate Science Center. She has spoken at national events such as the First Stewards Symposium, the National Congress of American Indians, and the Northwest Climate Conference, as well as in webinars. She has helped coordinate tribal participation for the Northwest Climate Science Center and Oregon State’s Climate Boot Camp workshops. Her dissertation has been heralded nationally by scholars as a template for TEK research, and remains a staple conversation item for academics and at workshops. She is a Native American Longhouse Advisory Board member at Oregon State University, was selected as an H.J. Andrews Forest Visiting Scholar, is actively learning Tolowa and Korean, and continues her traditional cultural practices. In her spare time she dances traditionally at pow wows, spends time with family, and is the owner of a non-profit organization that teaches the game of lacrosse to disadvantaged youth.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

 

 
