Combined UCS Blogs

Driving to a Stable Climate: The Pathway to Reducing Emissions From Transportation

UCS Blog - The Equation (text only) -

Meeting the Paris climate targets will require deep cuts in US global warming emissions from every sector of the economy, ultimately reaching or even surpassing net zero. Reducing emissions from transportation is critical to this goal: for the first time in decades, the US transportation sector has become a larger source of carbon dioxide emissions than electricity generation. In 2014, the EPA estimated that 31 percent of all global warming emissions came from the transportation sector. Currently, the transportation system—from cars and trucks to airplanes and ships—runs almost exclusively on petroleum. And because we burn petroleum in millions of moving vehicles, there is little opportunity for a technological fix that captures the carbon dioxide produced during combustion. Achieving our 2050 climate targets will therefore mean shifting much of our transportation system from burning petroleum to cleaner, renewable sources of energy with far lower emissions. Fortunately, many of the solutions are already known and being put into use today.


US global warming emissions from transportation in 2014: Passenger cars and trucks and heavy-duty vehicles are responsible for over 80% of US transportation global warming emissions. (data source: US EPA, Inventory of U.S. Greenhouse Gas Emissions and Sinks: 1990-2014)

The future is here already

For some applications like the personal vehicles many of us drive, we know how to begin this transformation: we can replace yesterday’s technology (gasoline and diesel cars) with electric-drive vehicles.  Both battery electric vehicles and hydrogen fuel cell vehicles can replace many of the vehicles we currently drive and have the potential to be virtually emissions free during use. The emissions from these vehicles depend on the energy source used to generate electricity or hydrogen. Already in much of the United States, recharging a battery electric car results in lower global warming emissions than the most efficient gasoline car. And since electricity generation is trending towards lower emissions, recharging vehicles in the future will be even cleaner. Hydrogen can also be made using renewable sources that reduce or eliminate emissions, including solar and wind power.
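A rough sense of this comparison comes from simple arithmetic: an EV's per-mile emissions are grid carbon intensity times electricity use per mile, while a gasoline car's are CO2 per gallon divided by fuel economy. The sketch below uses hypothetical round numbers for grid intensity, vehicle efficiency, and per-gallon CO2; they are illustrative assumptions, not figures from this post.

```python
# Illustrative per-mile CO2 comparison: electric car vs. efficient
# gasoline car. All numbers are hypothetical round figures.

GASOLINE_CO2_G_PER_GALLON = 8_900  # assumed combustion CO2 per gallon

def gasoline_g_per_mile(mpg):
    """CO2 grams per mile for a gasoline car with the given fuel economy."""
    return GASOLINE_CO2_G_PER_GALLON / mpg

def ev_g_per_mile(grid_g_per_kwh, kwh_per_mile):
    """CO2 grams per mile for an EV, given the grid's carbon intensity."""
    return grid_g_per_kwh * kwh_per_mile

# A 50 mpg hybrid vs. an EV using 0.30 kWh/mile on a 450 g/kWh grid:
hybrid = gasoline_g_per_mile(50)   # 178 g/mile
ev = ev_g_per_mile(450, 0.30)      # ~135 g/mile
print(f"hybrid: {hybrid:.0f} g/mile, EV: {ev:.0f} g/mile")

# As the grid gets cleaner, the EV's advantage grows:
print(f"EV on a 200 g/kWh grid: {ev_g_per_mile(200, 0.30):.0f} g/mile")
```

The key point the arithmetic makes is that the EV's footprint falls automatically as electricity generation decarbonizes, while the gasoline car's is fixed by its fuel.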

Since light-duty vehicles produce over 60 percent of all global warming emissions from transportation, converting our personal cars and trucks to electric drive is critical to any deep decarbonization strategy. One influential report, “Pathways to deep decarbonization in the United States,” shows that most light-duty vehicles on the road will need to be at least partially electric by 2050. This would require the majority of new cars sold in 2035 to be electric drive vehicles. That is a significant change from today, but a number of states are already on this path: California and nine other states have adopted a Zero Emission Vehicle policy that requires automakers to sell electric vehicles, with the goal of 15 percent of all new cars being electrified by 2025.

To achieve deep decarbonization of transportation, gasoline-only light-duty vehicles are almost entirely replaced with electric drive vehicles. Figure from “Pathways to deep decarbonization in the United States. The U.S. report of the Deep Decarbonization Pathways Project of the Sustainable Development Solutions Network and the Institute for Sustainable Development and International Relations.”, 2015.


Larger vehicles such as heavy-duty trucks and buses will also need to electrify, and solutions are beginning to be developed. Outside the Oakland UCS office, I often see one of AC Transit’s hydrogen fuel cell buses drive by. Battery options are also becoming available for both buses and delivery trucks; more information can be found in our recent report, “Delivering Opportunity.” For some large vehicles, biofuels will likely also be part of the solution. Fuels like biodiesel and cellulosic ethanol can reduce emissions from transportation if they are produced in a sustainable manner.

Planes, trains, and ships

Aviation, rail, and shipping make up a smaller portion of transportation emissions, though improvements are still important to reaching emissions reduction goals.  Efficiency improvements in aircraft can help, as well as the use of biofuels. Similarly, rail and shipping can also use a combination of biofuels, efficiency, and perhaps hydrogen to reduce energy demand and lower global warming emissions.

Cleaning up combustion is important too

To be consistent with long-term climate plans, we must move transportation away from petroleum to the greatest extent possible. However, in the near and medium term, many of our vehicles will still rely on combustion. For this reason, we cannot ignore efforts to make conventional vehicles more efficient and less polluting. That means making sure current rules on both passenger car and heavy-duty truck efficiency go forward, saving significant emissions and petroleum.

A path to cleaning up transportation

Reducing emissions from transportation will require a wide-scale shift away from petroleum. This change won’t happen overnight and will require new investments in alternatives to using oil, both in vehicle technology and fuel infrastructure. However, the good news is that many of the solutions we’ll need, like battery and fuel cell technology and conventional efficiency improvements, are ready to be put into action.

Public Safety Improvements

UCS Blog - All Things Nuclear (text only) -

Disaster by Design/ Safety by Intent #57

Safety by Intent

Continuing the series initiated with Disaster by Design #47, this commentary describes efforts that yielded public safety improvements. But this commentary approaches the subject from a perspective differing from prior commentaries. While still discussing improvements in public safety, this commentary focuses on safety improvements achieved by the public.


Fig. 1 (click to enlarge) (Source: Union of Concerned Scientists)

Safety Second

More than a decade before I joined UCS, the organization published “Safety Second: A Critical Evaluation of the NRC’s First Decade” in February 1985. (If you cannot find this book in your local library, at Amazon, or on the web, send me an email and I’ll reply with a digital copy.)

Chapter Three of the book, titled “The Public as Adversary,” opened with a quote by NRC Commissioner James K. Asselstine—“It is absolutely amazing, the lengths to which the Commission will go to avoid finding that a party is entitled to a hearing on an issue.” After summarizing several confrontational nuclear plant licensing proceedings, the chapter had a subsection titled “Public Contributions to Safety and Environmental Protection.”

The subsection listed 13 safety improvements at individual nuclear plants achieved by the public’s interventions in licensing proceedings (Fig. 1). Several cases involved the public championing concerns raised by whistleblowers who had voiced these same concerns to owners and/or the NRC without success.

The subsection listed another 13 safety improvements achieved generically across the fleet of operating nuclear plants by the public’s efforts (Fig. 2).


Fig. 2 (click to enlarge) (Source: Union of Concerned Scientists)

Considering that “Safety Second” examined only a single decade (1975-1984), 26 safety improvements achieved by the public is impressive—averaging nearly three improvements annually.

Those outcomes become even more admirable when you consider the situation faced by the public intervenors. Many have regular jobs and tackle the mind-numbing technical jargon, endless acronyms, and baffling legal lexicon on their own time and at their own expense. When they prevail, their unselfish efforts return no compensation other than the satisfaction of making their communities safer and more secure.

Safety Improvements since Safety Second

Shortly after I joined UCS in October 1996, Ray Shadis of the Friends of the Coast invited me to serve on a panel at a meeting they planned in Wiscasset, Maine on November 19, 1996, to discuss the recent report issued by the NRC’s Independent Safety Assessment Team (ISAT). The NRC sent the ISAT to Maine Yankee at the request of then-Governor Angus King. The ISAT only examined four of the dozens of safety systems at the plant and chronicled many serious problems in their 70-plus page report before concluding the plant was operating safely. I accepted Ray’s invitation and went to Maine for what was my first public speaking role representing UCS.

The efforts by Friends of the Coast and several other public interest groups in Maine transformed the ISAT’s report into a To Do list of things to fix by the plant’s owner. It was a long and expensive list—in May 1998, the company announced its Board of Directors voted to permanently close Maine Yankee rather than pay for the safety fixes.

Over the years, I have had the pleasure of working with many citizens and representatives of local public interest groups. They consistently reminded me that the American form of democracy works best when it is not a spectator sport. They got off the couch and into the game, even though the game is seldom played on a level field and it’s often hard to discern the officials from the opposing teams. Despite the daunting challenges, they demonstrate absolutely amazing persistence and resilience.

I wish I could acknowledge all the safety improvements achieved by the public the past two decades. Instead, I will cite a small sampling to illustrate the results achievable with persistence and passion.

Paul Gunter


Fig. 3 Paul Gunter (Source: Beyond Nuclear)

Paul Gunter, Director of the Reactor Oversight Project at Beyond Nuclear, showed the power of one person by single-handedly derailing a tentative agreement reached between the NRC and plant owners. In the wake of the March 1975 fire at the Browns Ferry Nuclear Plant in Alabama, the NRC adopted regulations intended to better protect against fire hazards. The regulations essentially required that owners step through their plants one room at a time, postulating a fire that damaged all equipment and cables inside it. The owners’ evaluations had to show that enough equipment located outside the fire room survived to safely cool the reactor core. If not, owners had to relocate equipment or install additional equipment until that condition was met for every room of the plant.

In the late 1990s, NRC inspectors discovered that most of the nuclear reactors operating in the United States, including those at Browns Ferry, did not meet the fire protection regulations. Instead of using equipment that would not be damaged in a fire, owners took credit for workers racing to the end of burned electrical cables and manually operating equipment damaged by the fire. The regulations permitted such manual actions, but only after they had been formally reviewed and approved by the NRC. Most of the reactors relied on unapproved manual actions, a shortfall that took the agency decades to discover.

Meetings between the NRC staff and industry representatives revealed that hundreds, if not thousands, of exemption requests would have to be submitted to the NRC and approved by the agency for all of the illegal manual actions.

Fearful that it lacked the resources needed to process so many exemption requests, the NRC hatched a plan with industry to rapidly issue a new regulation that would permit manual actions that met certain criteria. Such a regulation would retroactively approve all manual actions satisfying the newly imposed criteria.

Paul attended a public meeting on November 12, 2003, between NRC and industry where the draft regulation was discussed. To say that the criteria in the draft regulation were vague would understate the situation. Basically, the draft language would permit any and all manual actions as long as it was “feasible” they could be successfully performed. It would permit more than a dozen manual actions that had to be completed within 30 minutes, as long as a dry run by someone who had rehearsed the steps many times could simulate taking all of them in 29 minutes and 59 seconds. The draft language contained zero requirements to ensure that every worker who might someday be required to tackle the task was as fit, fast, and rehearsed as that demo worker.

Paul shot down this trial buffoon with this short statement: “It’s feasible that I could go out of this meeting and go out and become a nuclear engineer. I don’t think that that’s likely, but it offers up the same concerns of your choice of words.”

The NRC discarded the loosey-goosey “feasible” criterion almost immediately and developed a definitive criterion that appears in the final regulation. Thanks, Paul!


Fig. 4 George Galatis (Source: TIME Magazine)

George Galatis

The efforts by George Galatis certainly resulted in significant safety improvements at Millstone (CT) and in how the NRC oversees safety. George raised concerns with how spent fuel was being handled at Millstone. When neither the company nor the NRC did anything about the concerns (unless shrugging and ignoring counts), George formally petitioned the NRC to keep the three reactors at Millstone shut down for 60 days—the equivalent of a time-out given to misbehaving children. Millstone’s owner contested the petition, but should have readily accepted it. Once George appeared on the cover of TIME magazine, the reactors were shut down. Unit 1 never restarted. Unit 2 restarted more than three years later. And Unit 3 restarted after more than two years. Thanks, George!

Ann Harris and Curtis Overall

Ann Harris and Curtis Overall worked at the Tennessee Valley Authority’s Watts Bar Nuclear Plant during its construction. While performing their assigned tasks, both found problems they reported to management as required by plant procedures. Both experienced harassment and intimidation—including death threats by phone and mail—for having done their jobs and following procedures. TVA fired both workers, allegedly as part of Reductions in Force and not for having raised safety concerns.

Ann and Curtis filed complaints with the U.S. Department of Labor (DOL) contending that TVA violated the Energy Reorganization Act. The DOL’s Administrative Law Judges ruled in their favor.

Ann and Curtis literally put their jobs, and arguably their lives, on the line for safety. Their efforts resulted in lots of safety improvements at Watts Bar that likely otherwise would not have happened. Thanks, Ann! Thanks, Curtis (posthumously)!

Nuclear Information and Resource Service

The Nuclear Information and Resource Service (NIRS) under the leadership of the late Michael Mariotte demonstrated the might of a small organization teaming with grassroots activists and environmental attorneys when they opposed the proposal by Louisiana Energy Services to build and operate a uranium enrichment plant in Louisiana. NIRS partnered with the Citizens Against Nuclear Trash (CANT) to win one of the nation’s first courtroom verdicts on environmental justice grounds. Thanks, NIRS!

Project on Government Oversight

The 9/11 tragedy raised questions about whether the nation’s nuclear power plants were adequately protected against sabotage attempts. The NRC held many meetings with plant owners about security vulnerabilities and upgrades to lessen them.

These meetings were closed to the public out of the legitimate concern that public discussion of security capabilities and shortcomings could unintentionally aid those planning to do us harm.

The Project on Government Oversight (POGO) crafted a novel and effective way of putting a spotlight on security problems without also handing the bad guys a blueprint for nuclear nightmares. On September 12, 2002, POGO released “Nuclear Power Plant Security: Voices from Inside the Fences.” POGO interviewed security force personnel at more than a dozen nuclear plants and identified common themes: under-staffed, under-trained, under-equipped, and under-paid defenders.

The post-9/11 security regulations adopted by the NRC contained measures on security officer training and qualifications. In addition, a parallel rulemaking process resulted in a final regulation that limited the working hours of security force personnel as protection against impairment due to fatigue. Thanks, POGO!

Galaxy of Public Stars

These summaries illustrate a small sampling of the many times that efforts by the public resulted in nuclear safety improvements. The efforts of the following individuals and public interest groups could just as easily have been summarized (listed alphabetically):

Jessica Azulay of the Alliance for a Green Economy

Anna Aurilio of the Alliance for Nuclear Accountability

Rochelle Becker, David Weisman, and John Geesman of the Alliance for Nuclear Responsibility

Garry Morgan, Gretel Johnston, and Sandy Kurtz of the Bellefonte Efficiency & Sustainability Team

Linda Gunter, Kevin Kamps, and Cindy Folcker of Beyond Nuclear

Lou Zeller of the Blue Ridge Environmental Defense League

Debbie Grinnell and Sandy Gavutis of the C-10

Diane Turco of the Cape Downwinders

Deb Katz of the Citizens Awareness Network

Keith Gunter, Ethyl Rivera, and Jessie Collins of the Citizens’ Resistance at Fermi-Two

Dan Hirsch of the Committee to Bridge the Gap

Paul Blanch of Connecticut

Nancy Burton of the Connecticut Coalition Against Millstone

Michael Keegan of Don’t Waste Michigan

Howard Lerner of the Environmental Law and Policy Center

Maggie and Arnie Gundersen and Carolina Aronson of Fairewinds Associates

Damon Moglen of the Friends of the Earth

Jim Riccio of Greenpeace

Manna Jo Greene of the Hudson River Sloop Clearwater

Arjun Makhijani and the Institute for Energy and Environmental Research

Pine duBois of the Jones River Watershed Association

Dale Bridenbaugh of MHB Technical Associates

Jane Swanson of the Mothers for Peace

Tom Cochran, Geoff Fettus, and Matthew McKinzie of the Natural Resources Defense Council

Clay Turnbull and Ray Shadis of  New England Coalition on Nuclear Pollution

Mike Mulligan of New Hampshire

Mark Leyse of New York

Jim Warren and Mary MacDowell of the North Carolina Waste Awareness and Reduction Network (NC WARN)

Dave Kraft and Linda Lewison of the Nuclear Energy Information Service

Tim Judson, Mary Olson, and Diane D’Arrigo of the Nuclear Information and Resource Service

Glenn Carroll of Nuclear Watch South

Catherine Thomasson and Chuck Johnson of the Physicians for Social Responsibility

Mary Lampert of Pilgrim Watch

Allison Fisher and Tyson Slocum of Public Citizen’s Energy Program

Paul Gallay and Deborah Brancato of Riverkeeper

Tom Clements of the Savannah River Site Watch (and Savannah native)

Seacoast Anti-Pollution League

Susan Corbett of the Sierra Club’s South Carolina Chapter

Sara Barczak of the Southern Alliance for Clean Energy

Don Safer of the Tennessee Environmental Council

Eric Epstein and Scott Portzline of the Three Mile Island Alert

Norm Cohen of Unplug Salem

Vermont Public Interest Research Group

Marilyn Elie of the Westchester Chapter of the Citizens Awareness Network

Many thanks to many people for many safety improvements!

Disaster by Design

Nuclear safety is all about proper balancing.

Emergency core cooling systems are installed to restore the balance between the heat produced by the reactor core and the heat carried away by cooling water. An imbalance for too long results in reactor core overheating as Fermi Unit 1, Three Mile Island and Fukushima remind us.

Other emergency systems are installed to control the balance of neutrons released by atoms splitting in the reactor core. If this control is lost, the reactor power level can soar to disastrous levels, as SL-1 and Chernobyl remind us.

Nuclear safety requires a similar balance between industry and public influence on the NRC. When the public’s thumb gets too heavy on the NRC’s regulatory scale, owners can spend money for measures that do not improve safety. Conversely, when the industry’s thumb tips the scale too much, necessary safety margins can be undercut.

This commentary shows that the public’s efforts have yielded nuclear safety improvements. Past commentaries have chronicled nuclear safety improvements achieved by industry’s efforts and other safety improvements gained through NRC’s efforts. Too much is at stake for all voices not to be heard and all perspectives not to be considered.

—–

UCS’s Disaster by Design/ Safety by Intent series of blog posts is intended to help readers understand how a seemingly unrelated assortment of minor problems can coalesce to cause disaster and how effective defense-in-depth can lessen both the number of pre-existing problems and the chances they team up.

RGGI Review: An Opportunity to Include Environmental Justice in Emissions Market Design

UCS Blog - The Equation (text only) -

Confronting climate change, reducing emissions that warm the planet, and protecting the health of the most vulnerable are increasingly recognized as environmental justice issues. As the blatant disregard for both the drinking water and the welfare of the people of Flint, Michigan, recently showed us, communities on the frontlines of environmental pollution are actively demanding redress of environmental inequities. In Massachusetts and elsewhere in the Northeast, discussions around leading multi-state efforts to combat climate change are an opportunity to bring these issues front and center.

 

CO2 emissions in the RGGI region have decreased impressively since 2009 (Source: The Analysis Group). How these reductions have been distributed among vulnerable communities is also important to understand.

Emissions reductions from burning dirty fossil fuels for electricity production are one front in the struggle to combat environmental injustices where frontline communities are demanding equitable solutions. In the US Northeast, covering the New England states along with New York, Maryland, and Delaware, a market-based mechanism for reducing power sector emissions has been in effect since 2009. The Regional Greenhouse Gas Initiative (RGGI) is an emissions trading bloc that has succeeded in reducing the regional footprint of carbon pollution from electricity generation.

Cutting emissions of heat-trapping gases through programs like RGGI is really important for public health: as emissions of greenhouse gases such as CO2 are reduced, emissions of dangerous co-pollutants like particulate matter (PM) and ozone precursors are expected to follow suit. Reducing co-pollutant emissions is critical because even short-term exposures to these contaminants can trigger asthma and heart attacks, and worsen other cardiovascular and respiratory conditions.

Regional cap-and-trade emissions markets like RGGI are designed to minimize the cost of utilities’ compliance with environmental regulations compared to what are commonly known as command-and-control regulations. These costs are narrowly defined as operational and investment costs for utilities, for example retrofitting plants with emissions control equipment or switching to lower-emissions fuels.
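The cost-minimization logic can be made concrete with a toy numerical example. The abatement costs below are made-up figures for illustration only; the point is simply that under trading, reductions migrate to whichever plant can abate most cheaply.

```python
# Toy example with hypothetical numbers: two plants, each emitting
# 100 tons, and a regional cap requiring a 60-ton total reduction.
abatement_cost = {"plant_A": 20.0, "plant_B": 50.0}  # $ per ton abated

# Command-and-control: each plant must cut 30 tons, whatever it costs.
uniform_cost = 30 * abatement_cost["plant_A"] + 30 * abatement_cost["plant_B"]

# Cap-and-trade: reductions migrate to the cheapest abater, so the
# low-cost plant does all 60 tons of cutting and sells its surplus
# allowances to the high-cost plant.
trading_cost = 60 * abatement_cost["plant_A"]

print(f"uniform cuts: ${uniform_cost:,.0f}")   # $2,100
print(f"with trading: ${trading_cost:,.0f}")   # $1,200
# Same total reduction at lower aggregate cost, but the cuts are no
# longer evenly distributed across plant locations, which is the
# localized-impact concern environmental justice advocates raise.
```

The same arithmetic that makes trading cheaper in the aggregate also explains why the geographic distribution of reductions can be uneven.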

Some environmental justice advocates are concerned that these macro-economic cost valuations don’t consider the localized impacts of emissions trading markets. In the course of compliance, for example, older coal-burning plants can fire more often because it’s cheaper to do so, while utilities may still offset those emissions with renewables and comply with their aggregate emissions reduction obligations. Because many of those coal power plants are located in communities where mostly low-income people of color live, those communities will be more exposed to localized emissions of PM and ozone precursors than communities farther from the plants.

The economic valuation of the social cost of carbon (along with the pollutants co-produced as carbon is emitted) also gives credence to environmental justice advocates’ claim that cost valuations as purely operational or capital and maintenance costs ignore the public health costs associated with power plant emissions.
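The social-cost-of-carbon point amounts to simple arithmetic: monetized damages are emissions times a dollar value per ton. The plant size and dollar figure below are hypothetical illustrations, not values from this post or any official estimate.

```python
# Back-of-the-envelope social cost of carbon, illustrating the damages
# that a purely operational cost accounting never sees.
# Both inputs are assumed, illustrative values.

def social_cost(annual_tons_co2, scc_per_ton):
    """Monetized annual climate damages from a plant's CO2 emissions."""
    return annual_tons_co2 * scc_per_ton

# A plant emitting 3 million tons of CO2 a year, valued at an assumed
# $40 per ton social cost of carbon:
damages = social_cost(3_000_000, 40)
print(f"${damages:,.0f} per year in climate damages")  # $120,000,000 per year
```

Even a modest per-ton value quickly yields damages far larger than typical retrofit or fuel-switching costs, which is the advocates' point.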

Public engagement towards more equitable carbon and co-pollutant reductions

Environmental justice and carbon market advocates are starting to find common ground on ways to reduce emissions that prioritize equity. The RGGI program, under review at the moment, provides a great opportunity for stakeholders of all persuasions to get together and explore how to improve a system that, at least in the aggregate, has delivered emissions reductions.

In Massachusetts (a RGGI participant), the Department of Environmental Protection (DEP) and the Department of Energy Resources (DOER) are sponsoring hearings over the next couple of weeks that will help shape the future of RGGI.

Engaging in this public process is critical to carbon reductions and equity because, in the words of frontline community advocates, the current RGGI program review will “determine whether coal, oil and gas-fired power plants will continue to devastate Massachusetts communities or if the state will meet its climate targets.”

In these hearings, state and RGGI officials will describe RGGI and the revenues from the program, while community members will testify about the burden of fossil fuels and other sources of pollution in their communities.

Engagement between carbon policy experts and community members to understand each other’s perspectives is essential to work together towards more equitable emissions reductions outcomes. I encourage all those with a stake in our communities’ health and well-being to get involved and make their voices heard.

Massachusetts meetings information:

 

ExxonMobil and Its Terrible, No Good, Very Bad Week

UCS Blog - The Equation (text only) -

Last week brought stunning news about ExxonMobil’s financial position. First came a warning that the company may be in an “irreversible decline.”

Then the company announced lousy third quarter financial results, sending its stock into a tailspin.

And finally—and perhaps most significantly—ExxonMobil admitted that nearly one-fifth of its oil and gas reserves may no longer be profitable to produce.

Most of the discussion around these developments has focused on low oil prices, which are certainly disrupting the oil business, but I look at ExxonMobil’s future through a very different lens: its climate-related policies and actions. And it’s becoming increasingly clear that business as usual—the unabated extraction and burning of planet-warming fossil fuels—is a risky and dangerous path, not only for the planet, but for ExxonMobil’s own financial future.

I was not surprised to hear that ExxonMobil is falling short of expectations in this area. As the world moves to address climate change, the company must respond to evolving regulations and market forces. And yet our recent analysis found that ExxonMobil earned “poor” scores for failing to fully disclose climate risks or plan for a world free from carbon pollution.

Failing to Plan

As the Paris Climate Agreement takes effect, nations have agreed to a long-term goal of holding “the increase in the global average temperature to well below 2°C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5°C above pre-industrial levels, recognizing that this would significantly reduce the risks and impacts of climate change.”

Today, ExxonMobil issued a statement recognizing the Paris Climate Agreement as “an important step forward” and asserting that the company “supports the work of the Paris signatories, acknowledges the ambitious goals of this agreement and believes [it] has a constructive role to play in developing solutions.”

Yet ExxonMobil and other fossil fuel companies continue to maintain business models that plan for emissions—and resulting temperature increases—that are far in excess of these international climate goals.

These assumptions are dangerous and must be challenged. Given the world’s commitment to address climate change, it’s imperative that fossil fuel companies now explain how they will change their business models to adapt to a world with fewer carbon emissions, taking into account emissions that result both from their operations and from the use of their products.

This year, ExxonMobil faced a shareholder proposal calling on the company to increase capital distributions to shareholders in light of the climate change-related risks of stranded carbon assets (such as oil and gas reserves that become unprofitable to produce).

In opposing the proposal, ExxonMobil said, “The Board is confident that the Company’s robust planning and investment processes adequately contemplate and address climate change related risks. . . Our Outlook [for Energy] by no means represents a ‘business as usual’ case and it includes a significant reduction in projected energy use and greenhouse gas (GHG) emissions due to energy efficiency initiatives. . . The Company also stress tests its oil and natural gas capital investment opportunities. . .”

The company responded similarly to a separate resolution asking ExxonMobil to report on how its business will be affected by worldwide climate policies, citing “robust planning and investment processes” that “ensur[e] the viability of its assets.” ExxonMobil’s board recommended that shareholders vote against the resolution, which had been filed by the New York State Comptroller and Church Commissioners for England. Despite the company’s opposition, the resolution received the highest vote ever (38%) for a climate change proposal at ExxonMobil.

In opposing both shareholder proposals, ExxonMobil touted its 2014 report “Energy and Carbon – Managing the Risks,” which it claimed was an accurate description of how the company integrates climate change risks into planning processes and investment evaluation. The company also assured shareholders “that producing our existing hydrocarbon resources is essential to meeting growing global energy demand.”

What a difference a few months makes!

Carbon-Intensive Reserves

ExxonMobil has significant investments in Canada’s oil sands, where the company is believed to be among the lowest-cost producers. These holdings constitute the vast majority of the company’s reserves that may now be unprofitable to produce, according to a Wall Street Journal report. (InsideClimate News reported last month that oil sands account for 35% of ExxonMobil’s liquid reserves, up from 17% a decade ago.)

In a recent blog post, my colleague Jeremy Martin argued that ExxonMobil and other fossil energy companies need to get smart about which oils to extract in a carbon-constrained world. In particular, they should avoid the dirtiest sources—the tar sands being the best known example.

As Jeremy noted, “In theory, technology may be able to mitigate the additional emissions from these fossil fuel resources, particularly if carbon emissions from the extraction and refining process can be cost effectively captured and safely sequestered. But at the present time, this technology is not commercially available, and its technical and economic prospects are unclear. Until the cost effectiveness of these strategies is clear, these fossil resources are clearly disadvantaged.”

In The Climate Accountability Scorecard, one of the metrics evaluated whether companies have a commitment and mechanism to measure and reduce the carbon intensity of their supply chains. Specifically, we looked for a public commitment not to invest in higher-carbon fuel sources, such as tar sands; or a public commitment to measure and reduce carbon emissions in their own operations.

ExxonMobil has done neither, and scored “poor” on this metric.

“Irreversible Decline”

ExxonMobil’s revenue has dropped 45 percent over the past five years, and the company’s stock performance has trailed the S&P 500 for 10 quarters in a row, according to two key metrics cited in Red Flags on ExxonMobil.

Report author Tom Sanzillo, a former New York State deputy comptroller, notes that “Institutional investors face issues not only related to lower shareholder payouts but also involving ExxonMobil’s corporate philosophy and its long-term strategy. Urgent questions raised now by investors require frank and honest answers from the company. ExxonMobil is under considerable financial stress.”

Meanwhile, the Securities and Exchange Commission (SEC) and the New York Attorney General are investigating ExxonMobil’s accounting practices—specifically, whether the company is adequately accounting for the drop in oil prices and the prospect of stronger climate change regulations.

The associated subpoena from the New York Attorney General confirms that his investigation focuses on financial fraud. Last week, the New York Supreme Court ordered ExxonMobil to comply with the subpoena, overruling the company’s claim of “accountant-client privilege.”

Since 2010, the SEC has asked companies to report on material, regulatory, physical, and indirect risks and opportunities related to climate change. In The Climate Accountability Scorecard, UCS examined disclosures of climate-related risks by ExxonMobil and other major fossil fuel companies from January 2015 through May 2016. Our findings highlighted a lack of transparency—something that might be of concern to analysts and investors puzzling over the sudden downturn in ExxonMobil’s fortunes.

The company scored “poor” in the area of “fully disclosing climate risks.” Specifically, ExxonMobil:

  • generally mentions risk associated with current or proposed laws relating to climate change, but does not cite specific laws or regulations
  • acknowledges physical risks it faces and includes some discussion of climate change as a contributor to those risks, but provides few or no details about the nature of those risks, their magnitude, or how they may impact the company
  • identifies competition from renewable energy, changing consumer preferences, and changing technology as risks that it faces, but provides limited analysis of their potential financial impacts
  • provides no disclosure of corporate governance on climate issues

As low oil prices slow the pace of investments in assets like the oil sands, ExxonMobil has a responsibility to support—rather than fight or delay—climate policies that discourage development of these high-carbon reserves.

And shareholders and the public have an opportunity—and a responsibility—to hold the company’s feet to the fire as part of this effort.

Photo: Shutterstock

4 Reasons to Vote NO on Florida’s (Anti-) Solar Amendment 1

UCS Blog - The Equation (text only) -

Floridians are making a lot of important choices next week. One that’s not getting quite the same level of attention, but is important anyway, is about a ballot initiative on solar. Anything having to do with solar must be a great thing, right? That’s just what the proponents want us to think. Here are four reasons why Florida voters should reject this anti-solar “solar” proposal:


Pretty straightforward (Source: Floridians for Solar Choice)

  1. Amendment 1 pretends to be supportive of solar and consumers, but would actually harm it/them/us. Amendment 1 purports to give Florida residents the right to own or lease solar for personal use. But—and there are lots of “buts”—that’s a right they already have, the narrow definition of “lease” would actually make things harder for would-be customers, and it would give local governments the power to severely limit solar adoption, based on some undefined notion of “subsidy”. In short, Amendment 1 adds nothing and takes away plenty.
  2. The pro-amendment money has all come from the utilities and fossil fuel interests—not people. The sources for the money pushing for this ballot (and there’s a lot of it) are really telling. As the Energy and Policy Institute reports, out of more than $26 million contributed toward getting this thing passed, only $305 (that is, $0.000305 million) had come from individual donors (maybe including lots of employees of the pro-amendment organizations). That means 99.9988% from the likes of FPL and Duke Energy, along with the Koch brothers and ExxonMobil. Utilities, at least, aren’t always on the wrong side, but in this case, with Florida being #3 in the country for solar potential but way down at #14 in terms of installed solar, you can bet that they haven’t been helpful in the push to harness the sun. On the opposing side, by contrast, the NO-on-Amendment-1 list is really pretty impressive, and remarkably diverse.


    Follow the money to see who’s only pretending to be pro-solar (Credit: Energy and Policy Institute)

  3. Amendment 1 proponents know it’s deceitful. A leaked recording of someone from the pro-amendment side showed him admitting that this is basically a hoodwinking based on solar’s good name, a “political jiu-jitsu.” That is, they couched this as a yes-for-solar even though it’s really not. And after the leak, the pro-amendment folks tried to cover their tracks: “Once caught,” writes Sarah Gilliam of the Southern Alliance for Clean Energy, “the sham solar group scrubbed their website and social media channels of any connection to their former ally.” Alas for them, we’re smarter than that (and the internet tends to hold on to things), so the connections are clear.
  4. Amendment 1 hurt Florida’s chances for a real solar amendment. Amendment 1 derailed another proposal that would likely have removed major barriers to solar growth and dramatically increased solar’s attractiveness—and that had support from groups across the political spectrum. Amendment 1 was aimed at muddying the waters to keep the other one from getting the necessary signatures to get on the ballot, and it worked.
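
For the record, the donor split cited in point 2 above checks out arithmetically. Here is a quick sanity check, using the figures as reported (the $26 million total is approximate; the Energy and Policy Institute reported “more than $26 million”):

```python
# Sanity check of the Amendment 1 donor split quoted above.
# Figures as reported: ~$26 million total, $305 from individual donors.
total_dollars = 26_000_000
individual_dollars = 305

individual_pct = 100 * individual_dollars / total_dollars
non_individual_pct = 100 - individual_pct

print(f"Individual donors:     {individual_pct:.4f}%")      # ~0.0012%
print(f"Utility/fossil money:  {non_individual_pct:.4f}%")  # ~99.9988%
```

Rounded to four decimal places, the non-individual share comes out to the 99.9988% figure cited above.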

Given all the problems with Amendment 1’s language, it probably shouldn’t even have made it this far. The Florida Supreme Court actually almost killed it in March, allowing it to go forward by only a one-vote margin (4-3). In that decision, the minority was eloquent—and scathing—in its dissent (starting on p. 23), with Justice Barbara Pariente writing for the dissenters:

Let the pro-solar energy consumers beware. Masquerading as a pro-solar energy initiative, this proposed constitutional amendment… actually seeks to constitutionalize the status quo… The ballot title is affirmatively misleading by its focus on “Solar Energy Choice,” when no real choice exists for those who favor expansion of solar energy… This ballot initiative is the proverbial “wolf in sheep’s clothing”… [whose] real purpose… [is] to place a critical restriction on [consumer’s] rights…

Solar is counting on you, Florida

Since the court allowed the initiative to go forward, though, it’s up to you now, Florida voters. The polls are headed in the right direction, with Amendment 1 losing support. But polls aren’t votes.

And votes are voices. As Justice Pariente said in her dissent, “Clearly, this is an amendment geared to ensure nothing changes with respect to the use of solar energy in Florida—it is not a ‘pro-solar’ amendment.”

So don’t just beware, pro-solar energy consumers in Florida. Be engaged. This is your chance to strike a blow for solar in the Sunshine State. Next Tuesday, vote NO on Amendment 1.


Dude, Where’s My Car (Charging Station)? How Public Charging Is a Centerpiece of a U.S. Department of Transportation Initiative

UCS Blog - The Equation (text only) -

There are nearly 15,000 public charging stations for electric vehicles in the U.S., but there hasn’t been a great way to navigate to those spots without an app or internet access – until now.

Included in a package of electric vehicle-related initiatives from the White House is the designation of 25,000 miles of U.S. highway as “sign-ready,” meaning they are ready to get uniform signs for electric vehicle charging next to the existing signs for gas, food, and lodging. For now, the placement of the charging signs is limited to highway segments that have charging stations (existing or planned) at least every 50 miles. But this limited placement is not insignificant: 25,000 miles of highway across 35 states already qualify, and with additional investments in public charging coming from states and utilities, tens of thousands more highway miles will likely be getting charging signs in the near future as well.

Highways across the country are “sign-ready,” meaning that they will have uniform signage for EV charging stations. See the interactive map at: http://www.fhwa.dot.gov/environment/alternative_fuel_corridors/maps/

I admit, signs for electric vehicle charging are not as exciting as breaking a 108-year championship drought, but this seemingly simple initiative could significantly impact the electric vehicle market.

First, the physical signs for electric vehicle charging could help alleviate range anxiety. Most electric vehicle drivers likely have an app for navigating to charging stations, but perhaps some are part of my T-Mobile family, which means ¼ the cell coverage for ¾ the price. So a charging locator app isn’t necessarily foolproof. If you’re driving along a highway with charging signs, however, no phone service is no problem. Charging signs will be a physical backstop to relying on cell phones or apps, and could give electric vehicle drivers additional peace of mind when embarking on a trip that is longer than a single charge can cover.

Second, signs for public charging can instill confidence in non-electric vehicle drivers and raise public awareness of the existing network of 15,000 electric vehicle charging stations. Just seeing signs for electric vehicle charging every 50 miles along heavily trafficked highways could be the little nudge needed to choose an electric vehicle. Although most electric vehicle charging is done at home, many consumers want to know they can charge on-the-go, and both perceived and actual access to public charging is an important consideration when thinking about whether an electric vehicle can fit your needs.

Lastly, this announcement sends a clear signal to the next Administration that continuing federal initiatives to help electric vehicles gain a better foothold in the U.S. vehicle market should be a priority. Electric vehicles are not just a foil against climate change. They also cut oil use, are cheaper to fuel and maintain, and simply offer a better driving experience. For more on why electric vehicles are a smart solution, check out our web content here.

A New Presidency, A New Opportunity for Science

UCS Blog - The Equation (text only) -

This post was originally published on the Science Node.

Throughout its history, the US has benefited by applying science to public policy making. As national challenges become more complex, we rely on the federal government’s use of science to keep us safe and healthy. Science informs the safeguards and standards that protect us—from infectious disease to environmental pollution, from new drug approvals to consumer and worker safety.

The next president has a chance to strengthen the long-standing role science has served in our democracy. I detail how in our newly released recommendations for the next administration.


The next president has the opportunity to strengthen federal science for the public good and make scientific integrity part of their legacy. Photo: iStock

The next president will inherit a nation where many Americans have lost faith in government. Evidence of undue political influence over science-based policy decisions, along with survey results from federal scientists, suggests that concerns about governmental scientific integrity persist.

But this doubt is also an opportunity. The next president has a chance to create an efficient, effective administration, one built solidly on evidence. The next president has a chance to build public trust in government and to strengthen policies and practices based on science. The next president has a chance to leave a legacy that includes a government with a strong commitment to scientific integrity and an American public with a restored faith in their decision makers.

Today, I am sharing a detailed list of actions the next president can take to reinforce the role of science in federal decision-making:

1. Creating a culture of scientific integrity

The next president can build a culture of scientific integrity throughout the government. This means strong policies and even stronger practices at federal agencies.

Following President Obama’s scientific integrity directive, 23 federal agencies and departments developed scientific integrity policies and designated scientific integrity officials to oversee their implementation.

Yet surveys of federal scientists and journalists show that problems persist. Politics has derailed what by statute should be science-based environmental and public health decisions. Some agency scientific integrity policies are weakly written, while others have not been fully implemented. Further, some government scientists and journalists report an increase in barriers to the free flow of scientific information.

The next president has a chance to curb political interference in science. They can work toward strong policies, consistent practices across agencies, and a firm commitment by government leadership to investing in scientific integrity standards. The next administration should make clear that scientific integrity will be a priority, partner with agencies to bolster policies and practices that support a proper role for science, and affirm that scientists who report losses of scientific integrity are protected from retaliation.

2. Promoting independent science

The processes by which science informs policy are vulnerable to political tampering. As a notice of proposed rulemaking becomes a final agency rule, too often inappropriate interference occurs. Public policies change shape as they make their way through the checks and balances of federal decision-making, of course. But the science informing those decisions should not be altered for political purposes.

The next administration should work to ensure that science-based policies remain free from undue influence throughout the regulatory and policy-making process. The president can work to make sure that all of the scientific advice their administration receives — in the White House, from agency scientists, and through the federal advisory committee system — is accessible, robust, independent, and credible.

The next president has a chance to create the most effective administration possible by implementing the advice of technical experts. Access to scientific information is critical for the best possible outcomes to policy decisions.

3. Increasing government transparency

Public faith in government decisions and the ability of science to inform decision-making are threatened by decisions made behind closed doors. The next president has a chance to improve the transparency of their administration’s actions—an excellent strategy for building public trust.

Recent steps have helped promote openness, including the FOIA Improvement Act and release of White House visitor logs. More can be done.

The public also needs greater access to federal science. Opening it to public scrutiny is an important, inexpensive means of revealing and ending political interference in evidence-based policy decisions.

Greater access can be achieved through better disclosure of regulatory decision-making, wider use of information technology, and more communication channels for agency scientists and researchers to share their expertise.

An open government is the best safeguard against corruption and abuse of power and is a necessary ingredient of democracy.

4. Enhancing public participation

Finally, the next president has a chance to create the most participatory democracy the nation has ever seen. The recent, unprecedented interest in elections can be transformed into an active citizenry that participates in federal decision-making.

The US was founded on the conviction that an informed citizenry, armed with evidence and reason, can make wise decisions that promote public health, safety, and well-being. Yet today, many citizens are excluded from the democratic process by outdated information collection methods and unnecessary institutional barriers.

The next administration should capitalize on technology and innovation to make federal processes for gathering public input more diverse, inclusive, and participatory. In this increasingly noisy information landscape, it is more important now than ever for governments, scientists, and citizens to engage together in our democratic processes to ensure that our policies are informed by the best available science.

For the future of our democracy, I call on the next president to make science a priority.

First Offshore Wind in the Western Hemisphere, Right Off Our Shores. What Does it Mean?

UCS Blog - The Equation (text only) -

Last week I took my son and friends to behold a brand new energy source that has sprung up just off the coast: the first offshore wind turbines in the United States—actually, the first anywhere in the Americas. This is a moment worth savoring, and definitely worth sharing with the next generation.


The next generation: Offshore wind, Block Island, and our children at the dawn of a new era (Credit: A. Kommareddi)

Offshore wind power has been a long time in coming. The first US project proposal came 15 years ago. Europe now has 25 years of offshore wind experience under its belt. The project we visited was proposed in 2007.

This is a technology, though, that is worth the wait. Offshore wind resources are powerful—and plentiful. They can be found close to major cities and other places all up and down the coast, where we need the power. And they’re more often available at times that match when we most need energy, like on summer afternoons.

Ocean power, Ocean State

Offshore wind energy of a different sort was palpable at the industry’s annual conference, held in Rhode Island, that I attended later in the week. The conference was full of Europeans with boatloads (and decades) of experience, entrepreneurs and advocates who are champing at the bit to make much more happen in the US, and politicians and other decision makers who are working to create the right conditions and remove stumbling blocks for the technology.

The Ocean State just happens to also be the site of that very first Western Hemisphere project. The project, off the coast of Rhode Island’s Block Island, consists of five wind turbines adding up to 30 megawatts.

That’s modest, at a time when offshore wind farms elsewhere in the world include dozens or even hundreds of turbines. But those turbines will produce far more electricity than the people of Block Island alone can use. So part of the project is connecting the island to the mainland and allowing those beautiful electrons to flow into the state’s and region’s electricity grid. That means the project is also offering bill savings for islanders who have heretofore been completely dependent on local generators fueled with imported diesel.


Grace and power, and it’s ours at last (Credit: N. Bolgen)

Beyond the Ocean State

Even more importantly, the pioneering project’s implications stretch far beyond its megawatts and electrons.

At last week’s offshore wind conference, Rhode Island Governor Gina Raimondo expressed her deep belief in the “job-creating abilities” of new industries and innovators, like offshore wind and its business proponents. The developer of the Block Island wind farm suggested that we’re “finally at the start of something much, much bigger.” The president of the American Wind Energy Association said that this is “the dawn of a new era” in American history.

And, indeed, the path for much more offshore wind in the US is clearer than ever.

Governments are acting, too. Massachusetts’s new energy law will drive the development of offshore wind capacity equaling more than 50 Block Island projects over the next 15 years. New York sees offshore wind as a really important piece of how it’s going to make good on its new commitment to get 50% of its electricity from renewable energy by 2030.

Generation beyond our generation

All of that—the present, the future, the promise—made my son and me glad to get to visit Block Island last week.

As our party completed the two-hours-by-car-one-hour-by-ferry-and-then-bike trek to the island’s southeastern bluffs and stood looking out at the five graceful towers rising from the Atlantic just three miles away, I thought about how watershed-y this moment was. When I first laid my hands on a solar panel, many years ago, the solar industry was underway around the world; small, yes, but present in niche applications. When I first got involved with land-based wind power, a decade ago, it was nowhere near where it is now, but already a force in the US power sector.

But for offshore wind, and for us, this one project represents the difference between no offshore wind power in the Americas, and yes offshore wind power. An infinite bump-up, ratio-wise, from 0 megawatts to something much greater than zero. One small step for New England (maybe), but a giant leap for all Americans (definitely).

This is certainly only the beginning of our offshore wind work as a society. As one offshore wind expert put it, “If 2016 is the year US offshore wind arrived, 2017 will determine if it thrives.” We’ll have to keep pushing to remove barriers, to drop costs, to create jobs and protect wildlife, to make offshore wind a real and vibrant piece of our mix of electricity options.

But for now, this is a moment worth relishing. It’s not often that you get to be present at a First like this, at the birth of a whole new way of transforming a major sector of our economy, and to get to take the next generation with you. It’s a trip worth making.


Beauty and the beach (spot the wind turbines) (Credit: J. Rogers)

John Rogers

“Unstoppable” Destabilization of West Antarctic Ice Sheet: Threshold May Have Been Crossed

UCS Blog - The Equation (text only) -

Losing all the ice shelves of Antarctica would be like losing each flying buttress supporting a gothic building: collapse is the inevitable result. The question is how fast that collapse proceeds for an ice sheet that would, as Richard Alley told Congress in February 2007, slowly spread outwards and flatten like pancake batter just plopped on a griddle. Nearly a decade later, the latest science indicates a critical threshold may have already been crossed. Glaciologist Eric Rignot described this threshold: the retreat of ice in this part of Antarctica, draining into the Amundsen Sea, could be “unstoppable.” Many scientists think this is a key region whose loss can lead to the disintegration of the vast stores of marine ice in the West Antarctic Ice Sheet (Figure 1). The latest study, by Khazendar, Rignot, and others, adds to the mounting evidence that the threshold for irreversible disintegration has been crossed.

View under the West Antarctic Ice Sheet - NASA

Figure 1. One panel depicts glaciers of the Amundsen Sea sector of the West Antarctic Ice Sheet. The floating ice shelves and adjacent tributary glaciers of the West Antarctic Ice Sheet have flow lines indicating areas of faster flow toward the sea. The second panel depicts the same region with the bathymetry revealed by airborne radar surveys. Brown indicates bathymetry below current sea level and green indicates topography above current sea level. Deep brown regions are areas that would likely not be able to stop the flow of the ice as it becomes unhinged from the bottom bathymetry, allowing seawater to flow beneath the ice. Labels added to the juxtaposition of the two original figures from NASA http://bit.ly/2f1jjFR

New evidence, published on October 25, 2016 by Khazendar and colleagues in Nature Communications, suggests that the buttressing effect of the ice shelves of the Amundsen Sea Embayment of the West Antarctic Ice Sheet may be in jeopardy. The most likely culprit is warm ocean water melting the underside of the ice shelves, which float over the shallow sea and are attached to the ice sheet on the landward side. This can become a “runaway” situation: as each ice shelf thins, it separates from the bottom bathymetry that previously helped keep the warm seawater away. More of the seawater can then flow further underneath the ice shelf, which in turn leads to rapid shrinking of ice shelf volume.

How do we know this?

There are two telltale signs that this is occurring:

(1) There has been a shift in the position of the ice shelf and bottom bathymetry contact point, known as the grounding line.

(2) The ice shelves have thinned.

Evidence mounts regarding the first sign. NASA’s airborne Operation IceBridge used radar to penetrate through the ice sheet and reveal the grounding zones of the Crosson and Dotson ice shelves, which buttress the Smith, Pope, and Kohler glaciers of the West Antarctic Ice Sheet. The grounding lines retreated 40 km (~25 miles) since 1996 at Smith Glacier and 11 km (~7 miles) since 2011 at Pope Glacier. The same study recorded a 2 km (~1 mile) readvance of the Kohler Glacier grounding line since 2011, after a prior study logged dramatic retreat between 1992 and 2011. Over that same time period, two major glaciers nearby also retreated significantly: the center of the Pine Island Glacier retreated 31 km (~19 miles) and the core of the Thwaites Glacier retreated 14 km (>8 miles).

As for the second telltale sign, a few definitions are in order. Ice freeboard is the elevation of the ice above local sea level, and ice draft thickness is the thickness of the ice below local sea level. The latest study calculates that between 2002 and 2009, 300 to 490 meters of draft thickness was removed from beneath the Smith Glacier grounding zone. Concurrent laser altimetry measurements of the floating freeboard surface showed a lowering of 30 to 60 meters over the same period. This rapid ice shelf thinning was surprising over such a short time. Recently, the National Science Foundation (NSF) and the British Natural Environment Research Council released an urgent and massive call for proposals to study the shelf and glacier region of the West Antarctic Ice Sheet undergoing the most rapid change. I call this the “no surprises” investment in societally relevant research, with near-term and long-term implications for coastlines around the world. The good news is that NASA scientists have already started the eighth Antarctic Ice Change Airborne Survey.
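
Those two sets of numbers (300 to 490 meters of draft loss versus 30 to 60 meters of freeboard lowering) are roughly consistent with simple hydrostatic balance for floating ice, under which only about a tenth of a shelf’s thickness rides above the waterline. A back-of-the-envelope sketch, using typical textbook density values rather than figures from the study itself:

```python
# Archimedes' principle for a freely floating shelf:
#   rho_ice * thickness = rho_seawater * draft
# so a change in draft relates to a change in freeboard by
#   draft_change / freeboard_change = rho_ice / (rho_sw - rho_ice)
RHO_ICE = 917.0   # kg/m^3, typical glacier ice (assumed value)
RHO_SW = 1028.0   # kg/m^3, typical seawater (assumed value)

draft_per_freeboard = RHO_ICE / (RHO_SW - RHO_ICE)  # ~8.3

# Reported freeboard lowering at Smith Glacier, 2002-2009: 30 to 60 m
for freeboard_drop_m in (30, 60):
    implied_draft_loss_m = freeboard_drop_m * draft_per_freeboard
    print(f"{freeboard_drop_m} m freeboard drop -> "
          f"~{implied_draft_loss_m:.0f} m draft loss")
```

The implied draft loss of roughly 250 to 500 meters brackets the 300 to 490 meters the study reports, which is what we would expect if the thinning shelf stayed close to hydrostatic equilibrium.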

What does this mean for sea level rise?

The most sophisticated ice sheet models to date suggest that once West Antarctic Ice Sheet destabilization begins, the initial contributions to global sea levels come at rates we can likely adapt to, followed by a jump to major rates of sea level rise. The area of rapid change, the Amundsen Sea sector, gets the most attention: it has the potential to release ice volume equivalent to around 1.2 meters (~4 feet) of sea level rise.

The Amundsen Sea region is the key to unleashing the deep innards of the West Antarctic Ice Sheet. The Thwaites Glacier in this region could contribute less than 0.25 millimeters per year (mm/yr) over this century according to one ice model. Given that current sea level rise rates are around 3 mm/yr, that would represent around 8 percent of the current sea level rise from just this one region of Antarctica. The estimate from that ice model study is just shy of the 10% contribution to global sea level rise that NSF just announced is already observed coming from the region. The same model projects that the Thwaites contribution would likely jump to a rate of 1 mm/yr starting anywhere between 200 and 900 years from now; the earlier onset comes from the model run using the highest melt rate assumption, which matched the rate of observed losses in the region between 1996 and 2013. A different ice model investigated the entire West Antarctic Ice Sheet. This research suggests that local destabilization in the Amundsen sector can ultimately lead to complete disintegration of the marine ice of West Antarctica, contributing around 3 meters (nearly 10 feet) to global sea level rise over centuries to millennia (Figure 2).
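
The “around 8 percent” figure follows directly from the two rates quoted; a few lines confirm it:

```python
# Thwaites Glacier's modeled contribution as a share of the current
# global sea level rise rate (both rates as quoted above).
thwaites_mm_per_yr = 0.25    # upper-bound model estimate for this century
current_slr_mm_per_yr = 3.0  # approximate observed global rate

share_pct = 100 * thwaites_mm_per_yr / current_slr_mm_per_yr
print(f"Thwaites share of current sea level rise: ~{share_pct:.0f}%")  # ~8%
```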

The main reason this can occur is that much of the West Antarctic Ice Sheet has bathymetry below sea level (Figure 1), and once the grounding lines shift past bathymetric “sticking points” the disintegration accelerates as the ice sheet flattens and spreads like pancake batter.  This is where the analogy breaks down, since batter cooks and hardens into a solid pancake, whereas ice melts and flows into the sea. For coastal communities, it is imperative that future research sheds light on what factors could potentially slow down or speed up the pace of sea level rise contributions from the West Antarctic Ice Sheet this century.

Feldmann and Levermann 2015 PNAS Figure 3

Figure 2. West Antarctic Ice Sheet contributions to sea level rise as calculated by Feldmann and Levermann (2015) using an ice model under different durations of perturbation, applying observed melt rates in the Amundsen Sea sector of the West Antarctic Ice Sheet. Figure source: http://www.pnas.org/content/112/46/14191.abstract


  Map images (Figure 1) by NASA; labels added by Brenda Ekwurzel. Figure 2 source: PNAS.

Plate of the Union—a 2016 Campaign for a Better Food and Farming System

UCS Blog - The Equation (text only) -

I don’t need to tell you that 2016 hasn’t exactly been the presidential campaign year my colleagues and I imagined when we launched our “Plate of the Union” initiative last fall. The national conversation has taken a few detours (ahem) that have made it challenging to maintain a focus on issues that really matter to American families, like what’s for dinner and how it gets there.

Still, we’ve made some progress highlighting the problems of our food and agriculture system and the ways that ill-conceived and uncoordinated public policies exacerbate them. And I’m confident that—whoever moves into the White House next January 20—they will have heard from a wide range of constituents about the urgent need for food system reform, and the benefits to be gained by taking it on.

Where we’ve been and where we’re going

Last week, Plate of the Union wrapped up its fall Battleground State Food Truck Tour at its final stop in North Carolina. More on that in a moment, but first, a look back over the past year:

  • UCS, Food Policy Action, and the HEAL Food Alliance launched Plate of the Union in October 2015, releasing the results of a national poll showing that American voters on both sides of the aisle care deeply about the state of our food system.
  • In December, UCS launched this campaign video explaining the dysfunctional American food system in both English and Spanish and calling on political leaders to provide healthy, sustainable, and affordable food for all. It won second place in the DoGooder National Awards for nonprofit videos, and has accrued more than 300,000 views on social media.
  • As the primary season got underway with the Iowa caucuses last winter, UCS’s Ricardo Salvador published a letter to the editor in the Des Moines Register calling on presidential candidates to take up food policy reform. Ricardo and UCS Fellow Mark Bittman visited the state, meeting with allies and appearing on local TV and radio. The Register later published an editorial titled “Why don’t candidates talk about food?”
  • In July, Plate of the Union unveiled a food (policy) truck at the Republican National Convention in Cleveland and the Democratic National Convention in Philadelphia, where we talked to delegates, elected leaders, and reporters about the campaign. The New York Times even tagged along in Cleveland.
  • This fall we took the truck on the road, zig-zagging through key battleground states, hearing from farmers, chefs, scientists, students, and community food leaders about how the current food system threatens public health, the economy, and the environment in their communities. Concerns about water pollution, inequitable access to healthy and affordable foods, and wages for food industry workers – these and many more resonated with voters.
    • At the tour’s official launch in Stonyfield Farm in Londonderry, New Hampshire, my colleague, UCS board chair and Dartmouth professor of environmental studies Anne Kapuscinski discussed (and published in an op-ed) how a systems approach to our food system – examining its ecological, social and economic domains – helps us identify the challenges we face, but also the opportunities for change, starting with presidential leadership on holistic food policy reform.
    • In Des Moines, we heard from Matt Liebman of Iowa State University, who talked about the need for greater investments in research and incentives for farmers who adopt more sustainable farming practices, which can curb water pollution and save hundreds of billions of dollars in taxpayer money in the process.
    • At the truck’s final stop in Durham, North Carolina, more than 170 community residents gathered for a food and farm policy-themed local candidate forum. Joined by local chefs, more than 20 food and farming organizations, and other community leaders, local political candidates talked about the need to reform food and farm policies.
  • By the end of our food truck tour, Plate of the Union had collected more than 110,000 signatures on a petition calling on the next president to lead on food system reform. We delivered that petition to the Trump and Clinton campaign offices in Iowa a few weeks ago.

Now, as the election season comes to a close, we’re shifting our attention to communicating with each of the campaigns’ transition teams, and preparing to deliver recommendations to the president-elect later this month. Stay tuned for more about that.

In the meantime, here are some images from the Plate of the Union campaign in 2016:

Plate of the Union field director Sean Carroll speaks with participants at Philly Feast at the Democratic National Convention.

The Plate of the Union food policy truck at the Republican National Convention in Cleveland.

Plate of the Union’s Larry Robinson points to the campaign’s recommendations for the next president.

The Plate of the Union team at the RNC in Cleveland.

Ohio State University students hear from Plate of the Union at a student-organized “Presidential Picnic” in September.

The truck makes a stop in Washington, DC.

Left to right: Ricardo Salvador (UCS), Sean Carroll (HEAL Food Alliance), Rep. Randy Hultgren (R-IL), and Claire Benjamin DiMattina (Food Policy Action).

Rep. Rosa DeLauro (D-CT) stops by the truck.

The Plate of the Union team delivers signed petitions to Clinton and Trump campaign offices in Des Moines, Iowa.

 

Fuel Economy Reaches Highest Level Ever as Automakers Continue to Beat EPA Regulations

UCS Blog - The Equation (text only) -

EPA released its annual reports on the fuel economy of new vehicles and how well automakers are complying with regulations—and yet again, new vehicles sold are more efficient than they’ve ever been, while automakers continue to exceed the federal standards.

Federal fuel efficiency standards have helped put the most efficient vehicles ever into consumers’ hands, but Fiat-Chrysler CEO Sergio Marchionne is still pushing EPA to weaken or delay implementation of fuel economy standards…and he is not alone.

These reports aren’t really news—this is the same thing we have observed year after year since these standards went into place, and it should come as no surprise. Despite all their griping, manufacturers are doing well, exceeding the standards even while selling near-record volumes of vehicles, including a growing share of the SUVs that are among their most profitable models.

What is different about this year, however, is that we are in the middle of a mid-term evaluation of these standards, which automakers are trying to weaken even as some manufacturers reduce the efficient options they offer consumers. It’s clear from these reports that the standards are working as intended and that manufacturers have plenty of technologies left at their disposal to continue meeting future standards.

SUVs are not just more popular than ever—they’re more efficient

2015 saw a surge in SUV market share, but it was also accompanied by significant improvements in efficiency, buoyed by a bevy of new, smaller crossovers. This is helping to keep the “trucks” that automakers love to sell ahead of the standards. In fact, SUVs today are nearing the efficiency levels that cars achieved 10-15 years ago, a feat that would not have occurred without these regulations.

It isn’t just SUVs that are improving, of course—nearly every class of vehicle saw its highest achieved fuel economy ever.  By pushing manufacturers to deploy efficient technologies across all vehicles, these regulations are helping to ensure more efficient consumer choices in every vehicle class.
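One technical wrinkle in interpreting fleet-wide numbers like these: fuel economy figures are averaged harmonically (weighted by sales), because fuel consumed per mile, not mpg itself, is what adds up across a fleet. A minimal sketch, using made-up sales figures rather than EPA data:

```python
def fleet_average_mpg(fleet):
    """Sales-weighted harmonic-mean fuel economy.

    `fleet` is a list of (units_sold, mpg) pairs. The average is taken
    over fuel consumed per mile (gallons per mile), which adds linearly
    across vehicles, rather than over mpg directly.
    """
    total_units = sum(units for units, _ in fleet)
    total_gallons_per_mile = sum(units / mpg for units, mpg in fleet)
    return total_units / total_gallons_per_mile

# Hypothetical fleet: equal sales of a 20-mpg SUV and a 40-mpg car
avg = fleet_average_mpg([(1000, 20.0), (1000, 40.0)])
print(f"{avg:.1f} mpg")  # harmonic mean: 26.7, not the arithmetic 30.0
```

The gap between the harmonic and arithmetic averages is why selling one very efficient model does less for a fleet average than modest improvements across many high-volume models.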

A variety of technology pathways are helping to achieve standards

A common gripe we hear about regulations is that manufacturers will be forced to adopt technology X, Y, or Z. But these standards were designed to be flexible, giving each manufacturer choices it can tailor to its own customers and vehicle profiles. That flexibility improves the availability of efficient options for consumers while maintaining wide-ranging options in size, make, luxury, performance, and more. This is possible because there is a plethora of technology choices for manufacturers that are nowhere near being fully deployed.

Manufacturers have a broad array of technologies at their disposal to continue to use conventional, gasoline-powered vehicles to meet future standards. From gasoline direct-injection (GDI) to advanced transmissions, there are a number of technology pathways for manufacturers, all of which can lead to further reductions in fuel use—the six highlighted here are merely a subset that omits aerodynamic improvements, weight reduction, Atkinson-cycle engines, variable-compression-ratio engines, and much more.

Case in point: Mazda is among the best-performing automakers and continues to outpace the standards by deploying efficient direct-injection engines across the board, but it still has plenty of room to grow when it comes to transmission efficiency and stop-start technology. Nissan, on the other hand, has exploited tremendous improvements in continuously variable transmissions to keep its cars and trucks well ahead of the standards, but it has yet to significantly deploy direct injection or turbocharging in its engines.

In addition to the technologies highlighted by EPA in the Trends report, the technical assessment report released this summer noted a number of additional options: lightweight materials to reduce vehicle weight, advanced valve controls that run the engine on more thermodynamically efficient cycles (such as the Atkinson and Miller cycles), and reductions in “road load” through better aerodynamics and lower-rolling-resistance tires. Beyond conventional technologies, the more than 500,000 electric vehicles sold to date are also helping manufacturers exceed the standards. While EVs are not necessary to meet the 2025 targets, they are certainly helping us meet our oil and climate goals and will become an ever bigger part of the new vehicle market as we aim for a more sustainable transportation future.

Working as intended—why slow down?

Customers are buying the most efficient cars and trucks ever in near-record volumes—that is exactly the type of progress we need to accelerate to meet our oil reduction and climate targets, and the light-duty vehicle standards remain the best opportunity to make this happen.

With manufacturers overachieving in each and every year of the standards to date and plenty of technology opportunities on the horizon, it’s tough to imagine why the agencies would give in to industry pressure to weaken the standards. Reports like this one, chock-full of good news for the auto industry and evidence of the regulations’ positive impacts, will continue to provide the ammunition to strengthen these targets out to 2025 and beyond.

The Paris Climate Agreement Enters into Force: Here’s What Happens Next in Marrakech

UCS Blog - The Equation (text only) -

On November 4, 2016, the Paris Climate Agreement will go into force. This comes only a month after the agreement crossed its entry-into-force threshold: at least 55 countries, responsible for at least 55 percent of total global heat-trapping emissions, formally joining the agreement.

This rapid entry into force is essentially unprecedented for a complex global agreement of this kind, and it comes just ahead of the next international climate meeting in Marrakech, Morocco, which will be held November 7-18.

So now what’s next for continuing the momentum on global climate action?

From Paris…

The historic Paris Agreement was a signal triumph of global diplomacy, demonstrating our ability to rise above narrow national interests and commit to a shared vision of future good.

Paris was a success because we got:

  • Agreement among 190+ countries. It wasn’t a foregone conclusion though; getting there required a lot of advance groundwork and skillful diplomacy.
  • An ambitious long term temperature and decarbonization goal that sends a strong signal to the global economy about which way we are headed. Countries have committed to the aim of “holding the increase in the global average temperature to well below 2°C above pre-industrial levels and pursuing efforts to limit the temperature increase to 1.5°C above pre-industrial levels.”
    To achieve this temperature goal, the long-term goal is to “achieve a balance between anthropogenic emissions by sources and removals by sinks of greenhouse gases in the second half of this century, on the basis of equity, and in the context of sustainable development and efforts to eradicate poverty.” (In other words, a goal to reach net zero global heat-trapping emissions by the latter half of the century, with developed countries taking the lead so that equity considerations for developing countries are respected).
  • A process to raise the ambition of country climate actions over time, recognizing that the current commitments (the so-called NDCs) are not nearly enough. This will happen via a facilitative dialog in 2018, and global stock-taking every five years thereafter.
  • A call for an Intergovernmental Panel on Climate Change (IPCC) special report in 2018 that will highlight the impacts of global warming if the global average temperature rises to 1.5°C above pre-industrial levels (also compared with impacts at 2°C). The report will also highlight the opportunities and challenges of global emission pathways that limit the temperature increase to this level. The report’s release will help frame the urgency of, and opportunity to make, deep cuts in emissions.
  • Commitments (although lacking in sufficient detail) about helping developing countries make the clean energy transition and cope with the impacts of climate change. This is an area where further work, including financial assistance, is clearly required.

Paris gave us hope. Now comes the hard work of implementing and improving upon the Paris Agreement, living into the promises nations made there.

…To Marrakech

The Marrakech climate meeting, COP22, will be all about catalyzing action to deliver on the Paris Agreement. Some might consider this a “boring” COP relative to the headline-grabbing sensation that was Paris. But this is where the rubber meets the road.

Alden Meyer, director of strategy and policy for the Union of Concerned Scientists, looks ahead from Paris in 2015.

Here’s what we hope to see:

  • Progress on developing rules, timelines, and processes to help achieve the goals of the agreement.
  • Further elaboration of how developed countries intend to help developing countries switch to low-carbon development pathways and build climate resilience, including firm financial commitments from developed countries and deadlines for meeting them.
  • A clearer understanding of how the 2018 facilitative dialog, alongside the IPCC special report, will help increase the ambition of the NDCs over time.
  • Strong signals from countries about the domestic policies they are implementing or intend to implement to achieve their global commitments.

Reasons for optimism

There are lots of reasons to be optimistic about climate progress these days. Just to cite a few:

At the same time, there’s a lot of work ahead to accelerate and deepen these trends. We also need to invest in a just, inclusive clean energy transition with benefits flowing to all communities.

Building on progress at home and abroad

All countries, including the US, have to implement ambitious domestic policies to live up to their global climate commitments. Current US policies—including the Clean Power Plan and vehicle efficiency standards—are critical to driving down our carbon emissions. We must do more to reach our current NDC goal of a 26-28% reduction below 2005 levels by 2025. This includes action to set methane standards for the oil and gas industry; increase efficiency measures; achieve further emissions cuts in the power and transportation sectors; and build up carbon storage in our forests, soils and grasslands.

Ahead of 2020, we should also put forward a strong 2030 emissions reduction goal and develop policies consistent with that.

The Obama administration’s engagement on global climate policy—alongside key nations like China, India, Mexico, the European Union countries, and Canada—helped secure the Paris Agreement and other bilateral agreements. The US also joined the ‘High Ambition Coalition,’ an effort started by the small island, African, and Caribbean nations to ensure that the Paris Agreement had a strong temperature limit.

The need for that kind of nimble, creative, hard work is more urgent now as we seek to raise ambition and deepen the equity dimensions of the climate agreement. Our actions can help build trust—or conversely, tear it down, depending on our choices.

All eyes on the US

Of course, the nation and the world are waiting to hear who the next US president will be. Regardless of the administration, we need continued strong leadership on global and domestic climate action from the president and from Congress. Mounting climate impacts show we don’t have any time to waste, and we know that a clean energy transition will bring significant health, economic, and climate benefits.

The bottom line: we need our next president and Congress to act based on science—the science on climate change is clear, and it calls for urgent action.

Oh, and do join the social media thunderclap to celebrate the Paris Agreement’s entry into force on November 4.

An Upcoming Missile Launch by North Korea?

UCS Blog - All Things Nuclear (text only) -

Press reports are saying that North Korea is likely to try another test launch in the next few days of a new missile it is developing. But there is some controversy about which missile that may be.

Based on reports from two recent failed tests, most people assume the upcoming test will be of the intermediate-range Musudan missile, which North Korea has tested either six or eight times this year. But others see evidence that suggests it could be the mysterious long-range KN-08 missile.

What’s the evidence, and what are the implications?

The Musudan missile

This missile, called the Hwasong-10 in North Korea but known as the Musudan in the West (after the region where it was first seen), appears to have a range of about 3,000 km. If it became operational, it would have the longest range of any system North Korea has tested as a ballistic missile (rather than as a satellite launcher).

The Musudan has a very poor test record. Starting in mid-April of this year, North Korea launched six tests from sites on its east coast (near Wonsan) into the Sea of Japan. The first five failed, all but one exploding quickly after launch. The sixth test, conducted on June 21, was successful (Fig. 1). It followed a trajectory that went much higher than normal so that it splashed down at a range of about 400 km. This nonstandard trajectory kept it from overflying Japan, which lies only about 1,000 km from the launch site. Analyzing the trajectory shows that it could reach a maximum range of 3,000 km (1,860 miles) if flown on a standard trajectory.

This maximum range is interesting since it is shorter than the 3,400 km to Guam, which is thought to be the intended target of the Musudan. It is also shorter than the 4,000 km range frequently reported for the missile, but it is consistent with my estimates. That suggests North Korea has run into design limitations that are keeping the range shorter than it would like.
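For readers curious how a lofted test translates into a standard-trajectory range estimate, here is a deliberately crude flat-earth, vacuum ballistic sketch. This is my illustration, not the author’s actual method: real analyses account for Earth’s curvature, drag, and the powered boost phase, and the roughly 1,400 km apogee used below is an assumed figure drawn from press reports, not from this post.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def max_range_from_lofted(apogee_m, range_m):
    """Estimate a missile's maximum range from one lofted test shot,
    using a flat-earth vacuum model (no drag, instantaneous boost)."""
    # Vertical burnout speed needed to coast up to the observed apogee
    vy = math.sqrt(2 * G * apogee_m)
    # Total ballistic flight time, up and back down
    t = 2 * vy / G
    # Horizontal speed implied by the observed splashdown range
    vx = range_m / t
    v = math.hypot(vx, vy)  # total burnout speed
    # Maximum range is achieved at a 45-degree loft angle
    return v ** 2 / G

# Assumed apogee ~1,400 km (press-reported, not from this post);
# observed splashdown range 400 km
est = max_range_from_lofted(1.4e6, 400e3)
print(f"estimated max range: {est / 1e3:.0f} km")
```

With those assumptions the sketch returns roughly 2,800 km, in the same ballpark as the 3,000 km estimate; neglecting Earth’s curvature biases a flat-earth estimate somewhat low at these distances.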

Fig. 1 The flight path of the successful June 21 Musudan test, which was launched from Wonsan and traveled on a lofted trajectory to a range of 400 km. (Source: Google Earth)

The October tests

Press stories in October reported two additional tests—both of which failed—that US intelligence said were also Musudan tests. These two tests were notable since they were launched from a site on North Korea’s west coast (Kusong), unlike the previous tests.

Why switch the launch site? One obvious motivation is that from this site the missile could be launched to its full range without overflying other countries (Fig. 2). This is essentially the same flight path North Korea has used for its satellite launches. So following the successful June test on a lofted trajectory, it might make sense that North Korea would want to follow with a test on a standard trajectory. Since the Musudan is carried on a mobile launcher, switching launch sites in principle should be straightforward.

Fig. 2 The path a 3,000 km range test could follow from Kusong. (Source: Google Earth)

For these reasons, the general assumption has been that the two October tests were Musudans, and that the two failures indicate a continuing problem with the missile. Moreover, the assumption has been that the upcoming test would be another Musudan launch attempt.

The KN-08 missile

However, Jeffrey Lewis and his colleagues at the Monterey Institute raise the possibility that the two recent test failures were not of the Musudan missile, but of a longer range missile that may be in development—the KN-08, called the Hwasong-13 in North Korea.

The KN-08 has been seen in parades for several years. It has been described as a mobile long-range missile, but there remains a lot of controversy about whether such a missile is really being developed and what its capability might be if it is.

Technical analysis of the missile has shown that the design seen in parades makes more sense if North Korea were able to go beyond the Scud-level propulsion technology it has used in its previous missiles and use more advanced propellants. That more advanced technology has now been seen in the Musudan tests, which opens up the possibility that KN-08 development is real, and may have reached the point of flight testing.

Studying satellite images of the launch site for the two October tests, Jeffrey found that the burn marks on the ground appeared to be larger for these tests than for the previous Musudan tests at Wonsan—suggesting the launch of a larger missile. He also points out that the US military has misidentified several other missiles in recent tests, so the claim that the October tests were Musudans may not be airtight.

While there are uncertainties that make this suggestive at best, it raises an interesting possibility that should be considered when analyzing the next launch.

Implications

As I noted above, if the two October test failures were Musudans, it would suggest serious ongoing problems with the Musudan program, despite the successful test in June. Since North Korea does not have the experience with this new propulsion technology that it does with its older Scud technology, that would not be surprising.

Even if the October tests were not Musudan failures, North Korea would still have no sense of the reliability of the missile based on previous tests. The fact that the last test worked may suggest they fixed something, but a one-in-six test record gives very little information about the chances for success of the next launch.
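That intuition can be made quantitative. Treating the launches as independent trials with an unknown per-launch success probability and a flat prior (an illustrative model of mine, not analysis from the post), one success in six attempts leaves the posterior extremely spread out:

```python
# Posterior over a per-launch success probability p after 1 success
# in 6 tries, assuming independent trials and a uniform prior.
N = 100_000
grid = [(i + 0.5) / N for i in range(N)]
# Unnormalized posterior: binomial likelihood p^1 * (1-p)^5 times a flat prior
weights = [p * (1 - p) ** 5 for p in grid]
total = sum(weights)

mean = sum(p * w for p, w in zip(grid, weights)) / total

def quantile(q):
    """Posterior quantile via the cumulative distribution on the grid."""
    acc = 0.0
    for p, w in zip(grid, weights):
        acc += w / total
        if acc >= q:
            return p

low, high = quantile(0.05), quantile(0.95)
print(f"posterior mean {mean:.2f}, 90% interval [{low:.2f}, {high:.2f}]")
```

The posterior mean comes out near 0.25, but the 90% credible interval spans roughly 0.05 to 0.52, which is the precise sense in which a one-in-six record says very little about the chances for the next launch.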

Conducting three quick launches (Oct. 15, 20, and the upcoming one) despite the failures is an odd way to develop a missile. Typically you would want to analyze each failure to identify and fix the underlying problem before launching again. It may be that North Korean missiles don’t send back the detailed telemetry about their internal workings that missiles developed elsewhere typically do. Without that information, North Korea’s approach may simply be to keep launching until it gets things to work.

It’s also possible, of course, that Pyongyang would like to make a splash during the late days of the US election campaign, thinking it would increase the visibility of a successful test. That would set a testing schedule that was not driven by a step-by-step development methodology. If that was North Korea’s motivation, it might want to launch a missile it had some reason to believe would succeed, which would suggest another Musudan test rather than a KN-08. On the other hand, if North Korea was going for maximum shock value—which a KN-08 launch would deliver—it might decide to try a Hail-Mary approach.

If this does turn out to be a KN-08 test, the two October failures suggest it is getting off to a slow start. But it would mean that a real development program is underway and that North Korea is focused on making it work.

Some details

The first four Musudan tests are believed to have been launched from the Kodo peninsula north of Wonsan, on the country’s east coast. The next two, including the successful June test, were launched from facilities at the Wonsan Kalma Airport. The two October tests, and the upcoming test, are from the Kusong Panghyon Airport near North Korea’s west coast.

North Korea conducted a ground test of the Musudan engine in April, and the analysis of that test is what leads us to believe Pyongyang has shifted to more advanced propellants than used in its Nodong engines.

Meeting the Transportation Demands of the Future: It’s All About Options

UCS Blog - The Equation (text only) -

Like most teenagers growing up in suburban Chicago, I couldn’t wait to turn 16 and finally get my driver’s license. The ability to go wherever I wanted, the freedom of not having to ask my parents for a ride, and just the thrill itself of driving were all things I looked forward to. However, I also loved taking advantage of Chicago’s public transportation whenever I could. I’m a big supporter of cities having convenient public transportation options; I feel this way despite the fact that I’m now an engineer for one of the Big Three automakers in Detroit.  

A Tale of Two Cities

I moved to the Detroit area about 3 years ago for a job at Fiat Chrysler Automobiles. I knew that there would be some significant differences between Detroit and my native Chicagoland, but the one that would stand out the most is the state of public transportation in each city. Chicago has an extensive public transportation network, utilizing a mix of buses, subways (the “L”, as we know it in Chicago), and commuter rail that serve both the city and suburbs. All these various modes of transport are managed by a multi-county Regional Transit Authority that coordinates schedules and ticket sales.

The Detroit People Mover is one of the public transit systems in Detroit. Photo credit: Mikerussell – Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=2180481

Metro Detroit, on the other hand, has five separate agencies operating public transportation, and they currently don’t coordinate with each other very well. For example, if a Detroit resident needs to go somewhere in the suburbs, he would have to take a DDOT bus to the edge of the city, then transfer to a SMART bus. Because of the lack of coordination between the agencies, there are many gaps in the network. Even if there is a route that could theoretically get you to your destination, there is no guarantee that a seamless transfer is possible. Because of this, virtually everyone drives everywhere.

I suppose it shouldn’t be a surprise that public transportation is lacking here. Detroit’s economy is dominated by the auto industry, so anything that dissuades people from buying cars isn’t going to get much support. I myself now work in that industry, so automobile sales matter to me as well. However, I believe that having viable public transportation options is critical for the success of Detroit and many other cities. It is possible to have a successful auto industry while still offering alternatives to people that can’t (or don’t want to) drive all the time.

What’s at Stake

According to the World Health Organization, more than half of the world’s population now lives in urban areas. As metropolitan areas continue to grow, in both population and physical size, transportation needs will grow as well. This will have a significant impact on our resources and infrastructure, not to mention the effect that increased fuel consumption might have on our environment.

Furthermore, our transportation policies can have a big effect on social equity and everyday quality of life. Reliable access to things like jobs, schools, and grocery stores is essential for a successful community, but some communities have been left out. Take, for example, the story of James Robertson, a Detroit resident who walked about 21 miles each day to get to his job because he could not afford to replace his car and the bus options were very limited. Even in cities with good public transit such as Chicago, neighborhoods with large minority populations sometimes have more limited options. As a person of color, I don’t want to see people like me miss out on good opportunities simply because there is no way to get to them. As an engineer, I know it is possible to find solutions to these challenges.

Building a Brighter Transportation Future

Despite all these concerns, there is hope for the future. Ride-hailing and ride-sharing services such as Uber, Lyft, and Zipcar have exploded onto the scene in recent years. Several cities have established light rail and Bus Rapid Transit lines. Auto makers are continuously seeking improvements in fuel efficiency, and investing in hybrid and electric vehicles. There is also a strong push for autonomous vehicles, which could be a game-changer for those who are physically unable to drive, either through private ownership or a ride-sharing system.

Public policy is another area where change is happening. Here in Southeast Michigan, we will be voting on a property tax increase to fund a Regional Master Transit Plan that will bring improved transit options to Detroit and its suburbs. The plan would create several new bus lines, including bus rapid transit, re-establish commuter rail service in the region, and continue funding the brand-new streetcar in downtown Detroit scheduled to open next year. I personally plan to vote in favor of the proposal, because I believe private vehicle ownership doesn’t have to be the only option. Making it easier and more efficient for people to get places leads to better economic opportunities, stronger communities, and a better environment for all.

Bio: Jonathan Tyler is an engineer at Fiat Chrysler Automobiles, working to identify and address manufacturing-related quality issues. He is an active member of the National Society of Black Engineers at both the local and regional level.
