Combined UCS Blogs

SB 489 is the Clean Energy Catalyst New Mexico Needs

UCS Blog - The Equation (text only) -

BLM New Mexico/Flickr

In New Mexico, out of the crucible of power-sector transformation, economic vision, and climate imperative comes SB 489, the Energy Transition Act, a bold proposal to set a predominantly coal-fired state down a clean energy path.

Critically, SB 489 balances the urgent hunger for what could be—clear skies, bright futures, good jobs that are built to last—with the inescapable reality of that which is—more than half a century of dependence on coal—by ambitiously committing the state to a forward course while simultaneously reckoning with its past.

With the arrival of a new governor, New Mexico’s clean energy potential has swiftly snapped into focus after long seeming just out of reach. But vision is one thing, reality is another, and it takes a plan to navigate the liminal space.

SB 489 offers that plan.

It’s big and it’s bold, setting a power-sector target of 100 percent carbon-free electricity by midcentury, but it’s also careful and considered, looking out for the jobs and economies that were, and shaping for the better the jobs and economies that will be.

SB 489 is supported by environmental, community, labor, and conservation groups, and has the full backing of the governor. It is the right plan, at the right time, for a state on the precipice of change.

Coal

As recently as two years ago, PNM, the state’s largest utility, was fighting to keep coal-fired San Juan Generating Station (SJGS) running through 2053. Now PNM is planning to close SJGS by 2022, and leave coal entirely by 2031.

The fact is, SJGS costs more to keep running than building new resources to take its place. PNM estimates that were it to keep SJGS running for another 20 years, it could cost its customers tens to hundreds of millions of dollars more.

But an exit from coal is not as straightforward as simply snuffing out the stacks. The legacy of coal runs deep. Investments have been made, careers have been built, economies have been centered—and the transition at hand upends all of that.

The departure from coal demands real leadership: officials willing to wrestle with hard truths rather than duck reality and pretend change won’t come. And that takes leadership now—before plants and mines have closed, before that opportunity has passed.

That takes supporting a proposal like SB 489.

SB 489 begins with the recognition that coal plants are closing and then determines how best to act. It does this by tackling two major fronts: first, how best to protect ratepayers while retiring coal, and second, how best to address the needs of workers and communities at risk of being left behind.

Moving Coal Off the Ledger

As it currently stands, PNM, a regulated utility, earns a profit of approximately $16 million a year on $320 million in outstanding investments at SJGS, paid by utility customers. When these investments were approved, the intention was that the plant would operate well into the future, and thus those costs could be recovered over a long period of time. Now, even with the plant abandoned, those debts must still be paid.

Traditionally, the utility would go to the Public Regulation Commission (PRC) and seek cost recovery, which would allow it to continue to earn a return on the approved investment. This would likely result in a lengthy legal battle over just how much should be borne by ratepayers, and how much by shareholders.

SB 489 provides an alternative to that by adding a new tool to the PRC’s regulatory toolbox: securitization.

Securitization works like refinancing a loan: by gaining access to AAA-rated bonds, PNM would be able to secure much lower-cost financing to cover outstanding plant, worker, and facility debts. In the process, the utility avoids taking a write-off but also forgoes earning returns.

Customers could stand to save as much as 40 percent through this form of plant closure, which frees up capital to advance workforce transition and economic development efforts. Of course, that “could” is crucial: baked into SB 489 is a requirement that the PRC approve such a proposal only if it guarantees ratepayers a certain amount of savings—and the bill provides funding for the commission to hire an expert to verify that those savings materialize.
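A rough back-of-the-envelope sketch shows why lower-cost financing translates into ratepayer savings. The $320 million balance and roughly $16 million annual return come from the figures above; the 20-year horizon and the 3 percent AAA bond rate are assumptions for illustration only, so these numbers are indicative, not the bill’s actual projections:

```python
def total_payments(principal, annual_rate, years):
    """Total paid over the life of a level-payment amortizing obligation."""
    payment = principal * annual_rate / (1 - (1 + annual_rate) ** -years)
    return payment * years

outstanding = 320e6            # outstanding SJGS investment (from the article)
utility_return = 16e6 / 320e6  # ~5% effective annual return earned by PNM
bond_rate = 0.03               # assumed AAA securitization bond rate (hypothetical)
horizon = 20                   # assumed recovery period, in years

traditional = total_payments(outstanding, utility_return, horizon)
securitized = total_payments(outstanding, bond_rate, horizon)
print(f"traditional recovery: ${traditional / 1e6:.0f}M")
print(f"securitized recovery: ${securitized / 1e6:.0f}M")
print(f"customer savings:     ${(traditional - securitized) / 1e6:.0f}M")
```

Shrinking the financing rate shrinks every annual payment, which is where the guaranteed ratepayer savings the PRC must verify would come from.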

At the end of the day, this approach embraces compromise as a means of moving forward; it looks to unite state, customer, and utility interests under single cover, working to navigate a viable path out of the deep and tangled morass.

Supporting Coal Workers and Coal Communities

Concurrently, SB 489 works to support the transition ahead for those whom the move away from coal threatens to leave behind.

SB 489 dispenses with the myth that the only way to shift from coal is to forsake jobs and local economies. That false choice, which dishonestly perpetuates hope for a commodity teetering on the brink of collapse, has done nothing to ready those at risk. It has done nothing to tackle severance, retraining, reclamation, or economic development. It has done nothing to tackle reality.

SB 489 does.

First, the proposal would set aside $20 million for severance and job training for employees losing their jobs. Importantly, this covers not just plant workers, but mine workers, too; the mine serving SJGS only serves SJGS, and its owner is presently in bankruptcy—these workers need significant and dedicated support. SB 489 would put approximately $15 million more into an Energy Transition Displaced Worker Assistance Fund, to be administered by the state’s Workforce Solutions Department, to further assist in workforce training. In addition, SB 489 requires that starting in 2020, a growing share of electricity generation projects host apprenticeships, thereby driving the development of a skilled workforce trained for good jobs that are built to last.

Second, SB 489 would set aside $30 million for plant decommissioning and mine reclamation costs. This is not a limit; it simply secures the first $30 million via the securitization mechanism. Careful and thorough decommissioning and reclamation are essential for enabling the area to move beyond its coal-fired past. At the same time, these activities have the potential to create good jobs, with overlapping skillsets, for many years to come. Thoughtful planning for how to leverage this critical undertaking can further benefit the local economy.

Finally, SB 489 relieves the community of being backed into the untenable position of fighting for something they know won’t last simply because it’s the only option they’ve got.

SB 489 approaches this in two ways. First, it allocates over $5 million to an Energy Transition Economic Development Assistance Fund, to be administered by the state’s Economic Development Department in concert with community input, to foster economic development in the area. Second, by leveraging the area’s existing energy infrastructure, SB 489 directly addresses the expected erosion of the local tax base by directing replacement power to be located in the affected school district. This action could be significant; a recent analysis found that building a 450 MW solar plant in the area could replace all lost property tax revenue, as well as generate thousands of construction jobs and tens of millions more in additional state and local taxes.

Renewables

Facilitating the transition away from coal is only part of the story; it does not address what comes in to take coal’s place. And that affects everything that comes next for the state, because as goes coal today, so goes natural gas tomorrow.

New Mexico cannot allow for its shift from coal to become a shift to gas. The risks of natural gas overreliance are significant, and largely borne on the backs of ratepayers. What’s more, a portfolio dominated by renewables has been repeatedly shown to be the most cost-effective option for the state.

And that makes SB 489’s concomitant strengthening of the state’s Renewable Portfolio Standard (RPS) so critically important. By building from existing policy, SB 489 steadily strengthens renewable resource requirements from 20 percent by 2020 to 50 percent by 2030 and 80 percent by 2040 for large utilities, and 80 percent by 2050 for co-ops.

But SB 489 doesn’t stop there. It further commits to a power sector 100 percent carbon-free come 2045 for the utilities, and 2050 for the co-ops.
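The schedule above can be sketched as a simple step function. Treating each “X percent by year Y” milestone as binding from year Y onward is my reading for illustration; the bill’s actual compliance mechanics differ in detail:

```python
# SB 489 milestones as summarized in this post. Earlier entries are renewable
# (RPS) targets; the final entry is the broader 100% carbon-free requirement.
UTILITY_MILESTONES = [(2020, 0.20), (2030, 0.50), (2040, 0.80), (2045, 1.00)]
COOP_MILESTONES = [(2050, 1.00)]  # 80% renewable and 100% carbon-free both land in 2050

def required_share(milestones, year):
    """Minimum clean energy share in effect in `year` (latest milestone passed)."""
    share = 0.0
    for deadline, target in milestones:
        if year >= deadline:
            share = target
    return share

for year in (2025, 2035, 2041, 2046):
    print(year, f"{required_share(UTILITY_MILESTONES, year):.0%}")
```

The step-function shape is the point: every resource decision a utility makes can be checked against the minimum share in force when the resource would come online.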

That means every investment decision made from here on out will be evaluated in the context of this carbon-free energy course. It makes clear where the state is headed, and requires utilities to fall in line.

This is a major achievement, made all the more remarkable by the fact that it’s supported by utilities and co-ops alike.

SB 489 as energy transition guide

SB 489 offers a clear and convincing roadmap to navigating the transition ahead.

Critically, it begins not by leaping to where the state is going, but by reckoning with where it’s been. It is considerate of those the transition away from coal risks leaving behind, and works to ensure they’re ready to be part of what’s to come.

And with SB 489, the state’s future looks bright.

SB 489 boldly commits New Mexico to a rapid power transition, positioning the state as a renewable energy leader and signaling to forward-looking companies that this is a place to invest.

SB 489 doesn’t solve it all. The transition to a clean energy economy will take efforts large and small, ground-up solutions alongside top-down guidance and everything in-between. But SB 489 is a powerful place to start, and with major long-lasting investment decisions looming on the horizon, now is the time to begin.

Renewable Energy in Latin America and the Caribbean: A Great Wealth That Shines Ever Brighter


The 2019 Grammy ceremony opened with an energizing performance by Camila Cabello (Cuban American), Ricky Martin (Puerto Rican), and J Balvin (Colombian). It was thrilling to see Latinos open one of the music industry’s most important events for the first time. While the performance reminded me of the incredible musical wealth of artists from Latin America and the Caribbean (LAC), the region also boasts impressive natural resources, such as sun and wind, for generating energy.

Camila Cabello, Ricky Martin, and J Balvin opening the 2019 Grammy ceremony

But what targets are driving this wealth to be tapped? Here’s the story.

Vision: clear targets guiding countries’ course

The prices of solar and wind energy have dropped considerably, and their adoption has grown almost exponentially across Latin America and the Caribbean.

Installed Capacity Trend in Central America and the Caribbean

Installed Capacity Trend in South America

In addition to falling costs, a vital component of the transition from fossil fuels to clean energy sources is government leadership. The targets governments set for reducing climate-change emissions and integrating clean energy are therefore vital to defining the public policies that support meeting those targets.

Across Latin America and the Caribbean, governments have set ambitious targets for adopting renewable energy, reducing climate-change emissions, and even abolishing the use of fossil fuels. For example:

  • Costa Rica announced in May 2018 that it will abolish the use of fossil fuels by 2021. Moreover, for four consecutive years the country has exceeded 98% renewable generation in its electric system.
  • Mexico was already meeting 24% of its electricity needs with clean generation as of June 2018, and has set a target of generating 50% of its electricity from clean energy by 2050.
  • Jamaica already meets more than 18% of its energy needs with renewable energy, and aims to generate 50% of its electricity from renewables by 2030.
  • Colombia seeks to have at least 500 megawatts (MW) of non-conventional renewable energy by 2022, or around 10% of its energy mix. It currently has 50 MW.
  • Chile has the highest solar radiation in the world and was ranked the most attractive country for renewable energy investment by ClimateScope in 2018. By the end of that year it was already producing nearly 20% of its energy from non-conventional renewable sources, reaching the target set for 2025 almost seven years early.
  • Uruguay, thanks to its energy transformation, went from just 1% of its energy generated by solar and wind in 2013 to 32% in 2017.

Additionally, measures to guarantee energy security, resilience, and independence are being studied in various countries of the region. In the case of Puerto Rico:

Renewable energy in Latin America and the Caribbean: a dance for everyone

Latin America and the Caribbean are joining the renewable energy dance with ever more strength and enthusiasm, and that is cause for celebration. Transparency, a just transition, and community participation will be key to making this dance so contagious and energizing that everyone can take part.


Chaco Canyon at Risk: Interior Nominee Bernhardt Wants to Drill on Lands Sacred to Tribes


Fajada Butte, Chaco Canyon National Historical park, New Mexico. Photo: Adam Markham.

The push to open our fragile public lands to more drilling is well and truly on, and it’s clear that David Bernhardt, President Trump’s choice to become the new Secretary of the Interior, is pulling all the strings.

Throughout the government shutdown in January, former oil lobbyist Bernhardt stayed on the job as Acting Secretary, working hard to push forward plans for oil drilling, including in the Arctic National Wildlife Refuge, and ensuring that the administration’s goal of “energy dominance” through opening new areas to fossil fuel extraction remained on track.

During the shutdown, 800 employees of the Bureau of Land Management (BLM) were authorized to stay at work to process oil and gas drilling leases. Meanwhile, 85% of the remaining staff at the Department of the Interior (DOI) were furloughed, cutting off Native American healthcare programs, shuttering vital climate science research, and leaving national parks like Joshua Tree and Virginia’s Civil War battlefields unprotected against vandalism and looting.

Drilling threatens lands near New Mexico’s Chaco Canyon UNESCO site

What’s happening under Bernhardt’s watch in the remote Greater Chaco Area of northwestern New Mexico illustrates in microcosm why he is perhaps the worst possible choice for the job as top steward of our public lands.

Chaco Canyon and thousands of Indigenous peoples’ sacred places and archaeological sites in the surrounding Greater Chaco Region are at risk from an unprecedented drive to frack and drill for oil and gas. The recent announcement (and then hurried withdrawal) of oil and gas lease sales within the 10-mile informal buffer zone for Chaco Culture National Historical Park shows Bernhardt’s intent, and that the land nearest to the park is not safe from oil and gas drilling.

Existing drilling wells close to Chaco Canyon, and the proposed 10 mile protection zone (in blue). Map courtesy of WildEarth Guardians.

Chaco Culture National Historical Park is centered on Chaco Canyon, which from around 850 C.E. to 1250 C.E. was the center of one of the most remarkable pre-Columbian cultures in the Americas. Chaco Canyon was among the first national monuments created by Theodore Roosevelt under the Antiquities Act in 1907. And in 1987, together with Aztec Ruins National Monument and five smaller “outlier” archaeological sites in the region, it was named a UNESCO World Heritage site.

The Chaco culture evolved and spread in the region and its people left thousands of pueblos, shrines, burial sites, cliff-stairs, track-ways, and ancient roads. Eventually there were more than 200 outlier communities, many connected to Chaco Canyon by roads. All modern pueblo peoples trace their ancestry to Chaco Canyon, and tribes including the Navajo and Hopi claim cultural affiliation with the ancient Puebloans and Chacoans. Most of the Chaco region today is traditionally Navajo land.

An extraordinary archaeological landscape at risk

It’s a rough drive into Chaco Canyon. On the northern access road, the last 13 miles are on a pot-holed and dusty washboard road that can become impassable when it rains. The first thing you see on your left as you turn off NM 550 towards the park is a big fracking well, but as you get closer to Chaco, the landscape is flat and expansive, the desert scrub vegetation is sparse, and grazing cattle and horses are few and far between. The nearest town to Chaco Canyon is 60 miles away and there is no visitor accommodation, merely a campground under a mesa frequented by coyotes and rattlesnakes. It’s an International Dark Sky Park and you’d be crazy not to stumble out of your tent at night into the cool, high-desert air and marvel at the jewel-box-bright stars of the Milky Way spilling through the black-velvet night sky.

Ancient Chacoans were closely connected to seasonal and astronomical cycles, and as you stand on the mesa gazing at the night sky, you can’t help but be captivated by thoughts of how these ancient peoples connected with the same awesome spectacle. Today light pollution, associated with methane flaring from drilling sites creeping ever closer to the park, is a real threat to these extraordinary dark sky views.

Chaco’s is a harsh environment: bone-chillingly cold in winter, dry and sometimes searingly hot in summer, with an average of not much more than 9 inches of precipitation annually. But in these forbidding surroundings a remarkable and enduring culture formed and grew. Before Chaco, ancient pueblo people created hunting camps or small villages that lasted a few years, or at most a decade or two. But in Chaco Canyon a culture developed that put down roots and created extraordinary architecture and a complex trade network.

Part of Pueblo Bonito, Chaco Canyon. Photo: Adam Markham.

There are a dozen monumental, multi-story sandstone “great houses” in Chaco Canyon, and the remains of some are in remarkably good condition. Great houses contained store-rooms, granaries, offices, accommodations, circular ceremonial rooms called Kivas, and some probably had military barracks and aviaries for keeping or breeding rare birds. The most famous is Pueblo Bonito, which probably had at least 650 rooms. The great houses seem to have been occupied by an elite class, while the vast majority of ancient Puebloans lived in much simpler buildings.

Archaeological evidence shows that the Chacoans participated in extensive trade networks involving copper, ceramics, turquoise, obsidian, and chocolate throughout the Southwest and into Mesoamerica. From around 900 C.E., they were trading turquoise for scarlet macaws that originated in southern Mexico.

A cultural landscape under assault from oil and gas drilling

The Greater Chaco Region is now under unprecedented assault by the oil and gas industry, with the enthusiastic support of the Trump administration and Acting Interior Secretary Bernhardt. According to WildEarth Guardians, there are already more than 20,000 oil and gas wells in the region, and the drilling is encroaching ever closer to Chaco Canyon. In early February 2019, BLM announced plans to sell more oil and gas leases at a March 28 sale, a number of which fell within a 10-mile radius of the park. Then, a few days later, BLM announced that it was withdrawing the lease sales for sites within 10 miles of Chaco Canyon.

This is a welcome development, but it is unlikely to be the last time that BLM tries to push drilling closer to the park. Archaeologist Paul Reed of the non-profit cultural resources advocacy group Archaeology Southwest says, “I think this is probably a temporary victory, and the parcels will come up again in a future lease sale…I encourage folks to contact BLM to protest the March 28 lease sale, even with the near Chaco parcels removed.” And according to the Society for American Archaeology, land parcels that are still up for lease outside the informal 10-mile buffer zone, but within the Greater Chaco cultural landscape, also contain important Chacoan remains. The US non-profit advisory body for World Heritage, US/ICOMOS (US Committee of the International Council on Monuments and Sites), has also protested the expansion of lease sales in the Chaco landscape.

Energy development on a Chaco Canyon access road. Photo: Adam Markham

Tribes and archaeologists want a drilling moratorium

Representatives of tribes, archaeologists, environmental advocates, and heritage experts are angry because planning for the new lease sales appears to have continued unimpeded during the recent government shutdown even though the Farmington Resource Management Plan and Environmental Impact Assessment (EIA) have not been completed.

Oil and gas leasing in the area continues despite calls by the National Congress of American Indians (NCAI), the Navajo Nation, and the All Pueblo Council of Governors (APCG) for a moratorium on drilling in the whole Greater Chaco Region, pending initiation and completion by BLM and the Bureau of Indian Affairs (BIA) of an ethnographic study of cultural landscapes in the region. The study has not been initiated and new well openings continue apace. According to the NCAI, more than 400 new fracking wells have been approved in the region since 2013, and approximately 90% of federal lands in the oil- and gas-rich San Juan Basin, of which Chaco Canyon is the geographical center, have already been leased for drilling.

For the protected ruins inside the park and associated protected areas, the primary impact of the expanded oil and gas drilling is air, noise, and light pollution. But outside the park boundaries, the concrete drilling pads, massive rigs, pump jacks, and dense network of oil industry roads are damaging a huge sacred and cultural landscape left by the Chacoans, about which we know very little. The burden of increased water and air pollution falls largely on Navajo communities who have little say in the leasing or management of BLM lands.

Oil and gas land grab should disqualify Bernhardt

In May 2018, Senators Tom Udall and Martin Heinrich introduced legislation to ban drilling and fracking on federal lands within 10 miles of the boundaries of the Chaco Culture park. The Chaco Cultural Area Protection Act is also supported by the APCG and the Navajo Nation. New Mexico Congresswoman Deb Haaland, the newly elected Chair of the House Subcommittee on National Parks, Forests & Public Lands, and a tribal citizen of Laguna Pueblo, dubbed the latest drilling leases proposed (and then quickly withdrawn) by BLM a “land grab”, lamenting the lack of consultation with tribes.

David Bernhardt’s DOI is waving aside and ignoring the protests of tribes, Indigenous organizations, environmental groups, archaeologists, and New Mexico’s congressional representatives. Bernhardt, with his history of lobbying for drilling and mining interests, and his tangled thicket of conflicts of interest, seems not even slightly committed to the stewardship of public lands for the benefit of future generations, but only to the short-term benefits of the oil and gas industries. For this reason alone, he is not qualified to be confirmed as Secretary of the Interior.


What to look for in Governor Pritzker’s Budget Address


Photo: Charles Edward Miller/Flickr

On Wednesday Governor J.B. Pritzker will give his first budget address as Illinois’s 43rd Governor. This is a key opportunity for him to address the financial benefits of renewable energy and a pathway for Illinois to achieve 100% carbon-free electricity.

Last month Pritzker joined the U.S. Climate Alliance, a bipartisan coalition of governors committed to reducing greenhouse gas emissions consistent with the goals of the Paris Agreement.

In his Budget Address, Governor Pritzker should take the next step by laying out a plan to achieve his climate commitments. The governor would do well to reference recent findings and recommendations from the Powering Illinois’ Future Committee and the Illinois Commerce Commission’s NextGrid study. It’s vital that his energy platform chart an equitable path forward for the state. Here’s what we hope to see included.

Reference to powering Illinois’ future

Earlier this month Governor Pritzker released a report from the Powering Illinois’ Future Committee, co-chaired by Jen Walling, Executive Director of the Illinois Environmental Council.

The report outlines an equity-focused framework that builds upon the Future Energy Jobs Act (FEJA). It recommends that the state commit to 100 percent renewable energy while ensuring a just transition for all communities. Illinois can use renewable energy and energy efficiency to advance economic development, improve public health, and create good-paying jobs. The Committee also recommended improving the health and safety of the state through equitable and responsible capital investments, as well as catalyzing carbon-free energy expansion. Key recommendations from the Committee include:

  • Ensure housing stock in Illinois is ready for energy efficiency upgrades, prioritizing older housing stock in low-income communities.
  • Create clean energy empowerment zones so that rural areas, transitioning communities, and communities of color share in the economic and environmental benefits of Illinois’ shift to a clean energy economy. Through these zones, provide grants to facilitate locally-designed, community-directed clean energy initiatives.
  • Expand electric vehicle charging infrastructure by providing incentives for the conversion of public transit and school buses, and offer special rates to school districts that adopt EV buses.
  • Integrate R&D efforts with business creation and compete for federal and private sector energy storage investments in Illinois. Incentivize projects at retired or soon-to-be-retired coal plants to spur economic development in those transitioning communities.
  • Support shovel-ready solar projects for schools and state-owned properties. Implement the Solar for All program by initiating an additional 100 projects at publicly-owned properties in low-income communities.

These recommendations should be the cornerstone of the Pritzker Administration’s policy platform and they should be included in his capital plan.

Findings from the NextGrid study

Last year the Illinois Commerce Commission (ICC) launched NextGrid, the Illinois Utility of the Future Study. The study was a collaboration between key stakeholders to create a shared base of information on electric utility industry issues and opportunities around grid modernization. It was managed by the University of Illinois and consisted of seven working groups. UCS participated in two of the working groups, Regulatory and Environmental Policy Issues and Ratemaking.

While still in draft form, the report includes three specific recommendations the Pritzker Administration should act on: building out electric vehicle charging infrastructure, deploying energy storage resources to enable further integration of renewable energy, and proactively protecting consumer data privacy. It is a missed opportunity that the NextGrid process did not produce a roadmap or larger set of policy recommendations for Illinois, but the report does identify areas where the state can move forward on policy changes to further the goals of a reliable, affordable, and carbon-free energy grid.

The Draft Final Report does recognize that there is “very broad interest in active participation to mitigate climate change impacts in every possible way”. Participants shared the goal to make the grid greener through the continued integration of renewable energy resources to reduce emissions, and the desire to pursue sustainable ways to meet the state’s energy needs.

There is an urgent need to respond to climate change by decarbonizing the electric grid, and to seize the many environmental and economic opportunities offered by advancing clean energy. The consensus points discussed throughout the process and laid out in the final draft report should be utilized by the Pritzker Administration.

A bold agenda for Illinois is needed now

There is no time to waste. According to the Fourth National Climate Assessment, projected changes in precipitation, coupled with rising extreme temperatures before mid-century, will reduce Midwest agricultural productivity to 1980s levels absent major technological advances. At the same time, we must swiftly and sharply reduce our global warming emissions to avoid even worse impacts. Illinois deserves a healthy economy and environment where all communities can thrive.

The Pritzker Administration has the capacity to innovate around carbon-free energy sources while creating jobs and protecting the health of all Illinoisans. Governor Pritzker’s leadership on renewable energy and energy efficiency is crucial. During his campaign he stated that Illinois deserves clean air, clean water, and a safe environment where all communities can thrive. He said he stands on the side of science and believes climate change is real. Now is the time to put these campaign promises into action.


SCOTUS Will Decide Citizenship Question in April


On Friday, the Supreme Court acted with unusual speed to agree to decide whether the Trump administration can add a citizenship question to the 2020 Decennial Census. The hearing is now set for the second week of April. Earlier in the week, leaders from both political parties in the House of Representatives urged the Court to decide the case before the end of its current term, even though they disagree over what the Court should say.

At issue is the legality of the citizenship question. In January, Judge Jesse M. Furman of the United States District Court in Manhattan decided that Commerce Secretary Wilbur Ross broke “a veritable smorgasbord” of federal rules when he announced that the question would be added last year. Citing the cherry-picking of information and previous misleading statements given under oath, the judge rejected the Trump administration’s rationale for adding the question.

That rationale has centered on the claim that previous Censuses have included a citizenship question, and that the addition of the question is necessary for the effective enforcement of the Voting Rights Act by the Department of Justice’s Civil Rights Division. Plaintiffs counter that there has not been a citizenship question on the Census since the Voting Rights Act was passed in 1965, and they challenge the administration’s rationale, based largely on the reaction of the scientific community.

Census experts, from all of the recent past Census directors to numerous scientists and scientific organizations that use Census data, along with the nation’s premier civil rights and voting rights groups, have all argued that the addition of such a question threatens the scientific integrity of the entire Census.

Pilot studies and previous analyses revealed that immigrants and other ethnic, racial, and religious groups would be more difficult to reach with the addition of the question. Undercounts of any population threaten the integrity of the data collected: data used to apportion seats in the House of Representatives, to allocate $800 billion in federal funds across more than 300 programs, and to support private sector market segmentation, consumer analysis, employment figures, and a massive amount of other Census-dependent economic data.

Political scientists and voting rights scholars argue that the distortive effects of such an undercount undermine the rationale for the question, as voting rights litigation cases depend on accurately estimating the number of voting age individuals in specific geographies. They point out that the current methods of estimation, which use anonymous questions asked of populations sampled by the American Community and Current Population Surveys, provide more accurate estimates of the populations in question.

The Trump administration maintains that “[o]ur government is legally entitled to include a citizenship question on the census and people in the United States have a legal obligation to answer.” Dale Ho, director of the A.C.L.U.’s Voting Rights Project, countered that the January ruling was “a forceful rebuke of the Trump administration’s attempt to weaponize the census for an attack on immigrant communities.”

The 14th Amendment to the Constitution provides that representatives in the House “shall be apportioned among the several States … according to their respective Numbers,” with the “Numbers” determined by “counting the whole number of persons in each State.”

Several advocates at the margins of immigration policy and constitutional theory, including the Heritage Foundation’s Hans von Spakovsky, have argued that the term “persons” in the 14th Amendment should be re-interpreted to exclude individuals who are in the U.S. illegally, without the permission of the federal government. Von Spakovsky served on President Trump’s disgraced “Voter Fraud” commission.

The Court should affirm the lower court’s ruling. A corrupted census has untold consequences for the country. And a politicized census is a power grab by entrenched interests. Congress should conduct oversight to ensure that the census is conducted fairly and without bias, and take steps to ensure that questions are added to the census only after undergoing rigorous statistical testing. We must be vigilant about the integrity of the Census, which is, after all, the story of America.

Roses Are Red, Violets Are Blue, Wheeler Claims Action on PFAS, How Much Is True?

UCS Blog - The Equation (text only) -

Photo: Angelia Hardy/CC BY-SA 2.0 (Flickr)

At a press conference in Philadelphia this morning, EPA Acting Administrator Andrew Wheeler announced the long-awaited action plan for the class of toxic chemicals known as PFAS. During an ABC News interview last night, Wheeler called the chemicals “a very important threat.” Indeed, the scientific literature points to associations between PFAS exposure and kidney and testicular cancers, thyroid disease, ulcerative colitis, increased cholesterol, and hypertension in pregnant women. So what is EPA planning to do, and is it enough to protect you and your family from the harmful effects of PFAS?

Source: EPA

The lowdown on the plan

The plan is a regurgitation of the steps that former EPA Administrator Scott Pruitt laid out last year, with no added urgency to meet the outcomes. The report is littered with vague language that gives EPA plenty of wiggle room when it comes to actually taking action.

First, the action plan claims it is “moving forward” quickly with the Maximum Contaminant Level (MCL) process under the Safe Drinking Water Act and will have a regulatory determination by the end of the year. This determination could mean no regulation at all. If an MCL is created for PFOA and PFOS, two of the most widely studied PFAS, every water provider would have to test for these chemicals regularly, be responsible for filtering out the chemicals to meet the safe level, and let consumers know how much of the chemical is in their water.

As the drinking water of millions of Americans has been contaminated with PFAS chemicals, it is absolutely necessary that EPA meaningfully move forward to set enforceable limits of PFOA and PFOS in water, rather than merely paying lip service. This process should of course incorporate the expertise of staff scientists at the agency and be informed by the best available science. But, as EPA hasn’t set a new MCL in over 20 years, it’s difficult for me to muster any confidence that this demonstrably anti-science administration will actually take serious action on this public health crisis.

Next, the plan indicates that EPA will work to classify two of the most well-known PFAS chemicals, PFOA and PFOS, as hazardous substances under the Superfund statute, CERCLA. This is a necessary step in the right direction, but as enforcement actions have reached historically low levels at EPA, the agency must make sure that the Office of Land and Emergency Management has the staff and resources available to remediate PFAS contamination as quickly as possible and to hold the companies responsible for these releases accountable for paying for the cleanup. The highest level of scrutiny on industry is unlikely, as Peter Wright, the nominee to run this office, will enter with hundreds of sites belonging to his former employer DowDuPont—formerly one of the major producers of PFOA. While he has recused himself from involvement with these sites, it is unclear whether agency ethics officials will hold him to his commitment.

EPA’s action plan claims it will add more PFAS to the Unregulated Contaminant Monitoring Rule (UCMR) program so that more monitoring can be done across the country. The plan gives no indication which PFAS will be added or whether EPA will continue to use its voluntary guidance of 70 ppt as the threshold, which is arguably not protective enough. This will surely help us understand the scope of the problem, but unless the UCMR includes all PFAS in the class, we still won’t have a real grasp of the current situation and the cumulative exposure. And while knowing is half the battle, what EPA plans to do with that knowledge to help communities is even more critical. Communities that have already been dealing with water contamination, like many families on or near military installations, should be included in health studies and medical monitoring programs.

The plan also includes vague language about how the EPA will “explore data availability” for listing PFAS chemicals on the Toxics Release Inventory (TRI). If EPA does indeed make this change, it would give us a better idea of where these chemicals are produced and released into the environment by making that data publicly available. The plan also includes a strategy for expanding research efforts, including developing new analytical methods and tools to “close the gap on the science as quickly as possible,” which would be very useful. Lastly, the plan includes improvements to communicating PFAS risks, including a toolbox that states and local communities can use to educate the public. Whether or not these toolkits will actually be a useful resource to states and local communities will depend on how well the EPA engages these groups at all stages of development.

What the plan fails to do

The plan doesn’t propose regulating PFAS as a class of chemicals, but rather focuses on the two most prevalent and well-studied of the group: PFOA and PFOS. This means that there are still thousands of other chemicals in that same class entering our waterways without any repercussions. Indeed, if EPA were monitoring for all these chemicals rather than just those two, it is almost certain that far more drinking water systems would show contamination well above the current health advisory levels.

The word “action” is used a whopping 186 times in EPA’s plan, yet the words “accountable” and “responsible parties” are found only 3 and 8 times, respectively. The plan doesn’t mention how EPA will ensure that the responsible polluters, most notably 3M, DuPont, and Chemours (a DuPont spinoff), will be held accountable for decades of knowingly poisoning communities downstream of their facilities. Also absent from the action plan is how EPA will work with the Department of Defense to address contamination at many military sites, where cleanup is only just beginning.

While the plan goes into the areas of research that EPA wants to explore, it doesn’t mention how it will help to fill the research gaps needed on the thousands of other PFAS that it has itself allowed on the market without proving safety. This research is necessary in order to regulate these chemicals properly. Additionally, PFAS-containing firefighting foams, one of the biggest sources of PFAS contamination, are now being disposed of at military sites by incineration, the safety of which has not been thoroughly tested. And putting these foams or byproducts in landfills so that they can leach back into the groundwater near communities who are already bearing a chemical burden isn’t a feasible solution either. EPA needs to dedicate funding to research into safe methods of disposal so that getting rid of these chemicals doesn’t turn out to be even more dangerous than keeping them around.

Talking about action to clean up PFAS is one thing, but EPA must accompany its actions with a more genuine commitment to move away from the use of PFAS as a class. The health effects of these chemicals have been known for decades, yet the federal agencies have sat on the science and allowed thousands of these chemicals to enter the market, the waterways, and our bloodstreams. It’s high time that more action is taken on the front end to prove that these chemicals are safe, so that individuals living downstream of manufacturers and on or near military bases are not the ones who bear the burden of proving their harm. Wheeler could learn a thing or two about actual action from the states that have taken the lead on protecting their residents from PFAS contamination, thanks to the pressure applied by the many community groups across the country.

The man with the plan

After revealing the plan at today’s press conference, Wheeler told the crowd of concerned members of the public and reporters that “Americans count on the EPA every time they turn on their faucet,” and that the agency was “stepping up to provide the leadership the public needs and deserves.” He’s right. We do count on the EPA, but sadly, this administration’s anti-science track record means that our trust in the agency has plummeted. And this action plan doesn’t give me any more confidence in its commitment to keep us safe.

Being that today is Valentine’s Day, I can’t help but think about trust and relationships. Relationship experts usually say that a good way to learn about someone is to look at the company they keep. It’s hard to have confidence that the same man who lobbied on behalf of a coal company for years as it advocated for fewer regulations could be genuinely serious about more strictly regulating a whole class of profitable chemicals. Wheeler also just appointed science advisors whose opinions lie outside the mainstream, including one woman who has actually testified on behalf of 3M in defense of PFAS.

Another piece of advice that relationship experts espouse is that you should never go into a relationship expecting that a person will change: they are who they are. I have no reason to believe that Acting Administrator Wheeler will prove them wrong and go from coal industry lobbyist to champion of public health, which is why senators should think long and hard about their votes when it comes time to confirm him as Administrator.

This wishy-washy plan demands strong congressional oversight. Take meaningful action today by contacting your members of Congress to urge them to join the recently created PFAS task force (and scientists: you can use your expertise to encourage congressional oversight here).

10 Things the Department of Defense Needs to Include in Their New Climate Change Report

UCS Blog - The Equation (text only) -

The USS Ashland, followed by the USS Green Bay, prepare for replenishment with the USS Wasp, not shown, in the Philippine Sea, Jan. 21, 2019. Photo: U.S. Department of Defense

After a dearth of action on climate change and a record year of extreme events in 2017, the inclusion of climate change policies within the annual legislation Congress considers to outline its defense spending priorities (the National Defense Authorization Act) for fiscal year 2018 was welcome progress. House and Senate leaders pushed to include language that mandated that the Department of Defense (DoD) incorporate climate change in their facility planning (see more on what this section of the bill does here and here) as well as issue a report on the impacts of climate change on military installations. Unfortunately, what DoD produced fell far short of what was mandated.

With such a report, Congress was aiming to understand the most at-risk installations and the types and costs of mitigation measures that can help to ensure mission readiness. The Center for Climate and Security provides an excellent briefer on why these analyses are needed.

On January 10, 2019, a month late, the Pentagon released the report. Congressional leaders (see statements by Senator Reed and Representatives Langevin and Smith) were rightly disappointed that the report failed to do due diligence in answering vital questions mandated by law, such as ranking the ten most vulnerable installations by service branch and estimating the costs of mitigation measures. Experts Mark Nevitt at University of Pennsylvania Law School and John Conger at the Center for Climate and Security also weighed in on the inadequacy of the report. It was therefore welcome news that Representatives Langevin, Smith and Garamendi called for a version 2.0 to be completed by April 1, 2019.

With Congress demanding a thorough revision, what should DoD focus on for v2.0? After doing my own assessment of how DoD’s original report matched up with the legislative mandate, here are the top ten priorities the Department of Defense must include in their revised report to Congress. DoD must:

#1  List the 10 most vulnerable military installations within each armed service branch

In the first report, DoD focused on 79 military installations considered priorities for mission assurance and relied on a binary approach, noting only the “presence or not of current and potential vulnerabilities” at those installations over the next 20 years. In the revised report, DoD must, as mandated, rank national and international installations by each climate-change-related impact for each of the armed service branches.

#2  Include consideration of the Marine Corps Branch

We know that the Marine Corps Base Camp Lejeune was hit hard by Hurricane Florence, sustaining $3.6 billion in damages. Additionally, our analysis found that Parris Island (a Marine Corps Recruit Depot) is highly exposed to storm surge and will be more so in the future. In fact, DoD’s own 2018 climate-related risks report found that:

“[U.S. Marine Corps] sites have experienced impacts from flooding, winds, extreme temperatures, drought, and wildfires,” and that “the impact categories receiving the highest frequency of occurrences are training areas/ranges/facilities and HVAC systems from extreme temperatures.”

With so much documented vulnerability to climate impacts, the omission of the Marine Corps is glaring in the current version of the report and must be addressed going forward.

#3  Include international sites in the vulnerability assessment

In the initial assessment, DoD included just two installations in U.S. territories, both in Guam: Naval Base Guam and Andersen Air Force Base. That makes sense given that Guam was hit by two major typhoons, Mangkhut and Yutu, in 2018. However, it is not clear how or why other international sites were not addressed, given that both the report and the legislation mention examples of their vulnerabilities, such as the Marshall Islands radar installation.

In addition, the Government Accountability Office (GAO) found that the U.S. military’s nearly 600 overseas sites are critical to maintaining the readiness of military forces and represent a massive investment. In fact, GAO estimated the cost to replace these sites in 2014 to be roughly $158 billion. Moreover, DoD’s own reports include an assessment of sea level rise at over 1,700 coastal installations worldwide and of sea level rise on atoll installations. Finally, we know that many of these international installations are in need of better planning and implementation of measures to address extreme weather impacts.

#4  Address extreme weather including extreme heat, cold, rainfall and hurricane events

A quote by Lieutenant General Norman Seip, USAF (Ret) in The Hill opinion piece “Our military bases are not ready for climate change” on Nov. 2, 2018. He served in the Air Force for 35 years. He currently serves on the Board of Directors of the American Security Project, which published an analysis of how climate change impacts military infrastructure.

Scientists have found strong evidence that climate change increases the frequency and intensity of events like extreme heat and extreme rainfall from hurricanes. In the original report, DoD was generally responsive to the letter of the law by assessing the risk of recurrent flooding, drought, desertification, wildfires, and thawing permafrost. However, DoD ought to have included additional risks, including extreme heat, cold, and rainfall events, given the evidence of the damages these events cause and are projected to cause in the future, and given that Congress directed DoD to include “any other categories the Secretary determines necessary.” As a start, DoD can utilize its own climate-related risk vulnerability assessment from 2018, which provides a qualitative assessment of extreme weather and climate change related risks at installations.

#5  Assess all types of flooding individually

DoD must assess sea level rise, storm surge and inland flooding risks independently while also including an assessment of cascading impacts of multiple flood events simultaneously. Otherwise, addressing different types of flooding mixes and masks the risks and impacts of rising seas, recurrent high tide flooding, inland flooding and storm surge.

#6  Include the climate-risk vulnerability methodology

In the revised report, DoD must include its methodology for assessing climate-risk vulnerability. DoD must also ensure the analyses are based on the latest science on climate change and extreme weather, such as the major new reports including the Fourth National Climate Assessment (NCA4) and the groundbreaking Intergovernmental Panel on Climate Change (IPCC) 1.5°C report. In addition, DoD has a wealth of excellent climate change related reports that it ought to pull from as well (e.g., DoD’s Regional Sea Level Rise Scenario report, among many others).

#7  Include a robust list of mitigation measures for all risks and vulnerable installations

Version 2.0 must address the types of mitigation measures needed for each of the climate change related risks at the most vulnerable military installations. To do this, DoD can start by utilizing the Naval Facilities Engineering Command’s “NAVFAC Installation Adaptation & Resilience Climate Change Planning Handbook,” which provides workbooks meant to help installation planners both analyze and develop actions to address climate change related challenges. For example, the NAVFAC handbook includes mitigation measures and estimated costs and benefits for measures such as seawalls, flood gates, restoration of natural areas, and installation of oyster reefs. Ideally, the report could also provide an overview of the types of measures that could be taken to address the interconnectedness of the natural, built, and social systems that are vulnerable to cascading impacts.

Airmen from the 821st Contingency Response Group set up a tent city at Tyndall Air Force Base, Florida, Oct. 12, 2018. The contingency response team deployed to assess damage and establish conditions for the re-initiation of airflow, bringing much needed equipment, supplies and personnel for the rebuilding of the base in the aftermath of Hurricane Michael. AMC equipment and personnel stand by across the nation to provide even more support upon request (U.S. Air Force photo by Tech. Sgt. Liliana Moreno)

#8  Address the costs of different types of mitigation measures

Given the estimated cost of damages from Hurricane Florence on Marine Corps’ Camp Lejeune ($3.6 billion) and from Hurricane Michael on Tyndall Air Force Base ($5 billion), and given the adaptation needs particularly for overseas installations, it’s no wonder that Congress asked the Pentagon to estimate the costs of making installations more resilient. DoD can utilize the cost assessment tool in Appendix G of the NAVFAC adaptation handbook to help define the scope and costs of different mitigation measures.

A recent example from work done by Dewberry for Virginia Beach underscores why cost estimates are so important in planning and budgeting. Dewberry found that measures to adapt to sea level rise could cost from $1.71 billion to $3.79 billion over several decades, but the annualized economic losses of doing nothing could reach $50 million in 2050. A tremendous amount of time and effort went into this report, and it is exactly the type of analysis that is so helpful for planners and decisionmakers.

#9  Assess the growing needs for humanitarian relief and disaster assistance

DoD must include an estimate of the increase in the frequency of humanitarian assistance and disaster relief missions and the theater campaign plans among other aspects that the legislation requests. A recent study confirms that climate change is fueling violent conflicts, migration and refugees. The recent 2019 Worldwide Threat Assessment of the US Intelligence Community again makes the case that climate change is a national security threat, and a threat multiplier when it comes to humanitarian disasters.

#10  Provide a robust overview of mitigation efforts already underway and address costs

The report makes clear that climate change is “a cross-cutting consideration for our planning and decision-making processes and not a one-off, separate program.” In version 2.0, DoD has the opportunity to provide more depth to the overview of mitigation measures currently in place and what might be necessary to ensure mission resilience. An estimation of the costs of these mitigation measures, at least at some level, ought to also be addressed.

Marines with Marine Corps Base Camp Lejeune help push a car out of a flooded area during Hurricane Florence, on Marine Corps Base Camp Lejeune, Sept. 15, 2018. Hurricane Florence impacted MCB Camp Lejeune and Marine Corps Air Station New River with periods of strong winds, heavy rains, flooding of urban and low lying areas, flash floods and coastal storm surges. (U.S. Marine Corps photo by Lance Cpl. Isaiah Gomez)

A revised report is essential to provide vital guidance to Congress on military readiness needs

Climate change will continue to strain the nation’s resources, resilience and our military’s mission readiness. The scale of the budget needed to mitigate each of the climate risks for each of the most vulnerable installations will be immense, but the cost of doing nothing will be much greater. The good news is, studies indicate that federal investments in mitigation measures are a wise use of taxpayer dollars and can save $6 in future disaster costs for every $1 spent on hazard mitigation.

DoD now has the chance for a “do-over” of their climate-change risks report to fill in the many incomplete and inadequate responses. This vital revision, version 2.0, will be an important contribution to Congress, helping members formulate the policies and budgets needed to ensure our military’s installations, infrastructure, and communities across all armed service branches are climate ready.

Our military and intelligence leaders are calling for action and solutions on climate change, and recently members of Congress have done the same, calling for ambitious and urgent measures. The time for action is now.


The Journal of Science Policy and Governance (JSPG): Engaging early career researchers in science policy

UCS Blog - The Equation (text only) -

The Journal of Science Policy and Governance (JSPG) was established nearly ten years ago by a small cadre of students and science policy leaders who sought to create an open access, interdisciplinary, peer-reviewed platform for early career researchers (ECRs) of all disciplines to publish well-developed policy assessments addressing the widest range of science, technology and innovation policy topics worldwide.

Today, JSPG is a non-profit organization that has produced 15 volumes addressing a myriad of policy topics including health, the environment, space, energy, technology, STEM education, and defense, as well as science communications and diplomacy. Publication in JSPG is, for many authors, their first experience writing on science policy issues in a format that is accessible to policy stakeholders, and fewer still have published science policy pieces in a peer-reviewed format before. JSPG volumes are published on an accelerated timeline (to stay current with ongoing debate) and range from succinct op-eds to comprehensive policy assessments to rigorous technology assessments.

Our JSPG team has the great fortune of working with an outstanding line-up of authors, an editorial board and staff composed of ECRs and policy professionals, and a distinguished advisory board and governing board of senior science policy thinkers and doers who share our belief that ECRs can and should hone their policy research and writing skills and engage in policy debates. We also do our best to honor JSPG leadership. Our most recent issue was dedicated to the late Homer A. Neal of the University of Michigan, who served on JSPG’s advisory board until his passing in 2018 and was a long-time supporter of JSPG and the involvement of ECRs in policy.

More than a research journal: Maximizing impact through partnerships

What sets JSPG apart from other peer-reviewed publications is the outreach and engagement we undertake with our partners and collaborators, enabling us to reach more ECRs and achieve a greater depth of impact for published work.

NSPN-JSPG competition announced at NSPN Symposium in New York City. Photo credit: JSPG

Some examples of our accomplishments include:

UCS and JSPG: Engaging ECRs in science policy

Both JSPG and UCS seek to empower and encourage the meaningful inclusion of ECRs in science policy research, writing, and debate.

Young people comprise an important constituency in most countries and bring fresh perspectives and an infectious level of energy to their policy engagement. Whether ECRs seek to transition into policy-oriented careers or engage in policy in industry or academia, we believe they can play an important role in strengthening policy around the world.

At a time when political leaders are facing increasingly complex science and technology policy challenges, ranging from CRISPR to climate change, the need to equip the next generation of research professionals with policy engagement skills has grown. Despite this reality, few ECRs are encouraged to engage in policy during their academic training, and even fewer are offered training opportunities to sharpen their policy engagement skills.

JSPG’s mission and our own personal missions are focused on addressing this challenge. In recognition of our long-standing partnership with UCS, we are very pleased to announce a 4-part blog series illustrating how JSPG has helped equip ECRs with science policy research and writing expertise. In these posts, we will explore the professional journeys of current and past JSPG editors, authors, and staff, many of whom are also members of the UCS Science Network.

This is the first post in the series. In the three subsequent blog posts, our team will:

  1. Connect the journal to international science diplomacy and policy debate
  2. Illustrate the impact of JSPG on the career trajectories of past editors
  3. Provide a perspective on how policy writing skills translate into science communication

We can’t do this alone. Let’s work together

JSPG is actively seeking partners and collaborators. Let’s team up. Please consider joining JSPG’s mailing list and connect with us on Facebook, Twitter and LinkedIn. Together, we’ll strengthen the ability of ECRs to substantively share their ideas on cutting-edge science, technology, and innovation policy.

 

Adriana Bankston is the Director of Communications & Outreach at the Journal of Science Policy and Governance. Adriana manages communication and public relations, social media and marketing efforts on behalf of the journal. Adriana’s personal mission is to improve the biomedical research enterprise by empowering ECRs to advocate for change. Adriana is a Policy & Advocacy Fellow at The Society for Neuroscience, and a Policy Activist at the non-profit Future of Research. Adriana obtained her Ph.D. in Biochemistry, Cell and Developmental Biology from Emory University. Find her on Twitter at @AdrianaBankston.

Shalin R. Jyotishi is the Chief Executive Officer of the Journal of Science Policy and Governance. His background intersects innovation policy and economic development. He has held positions at the American Academy of Arts and Sciences, the American Association for the Advancement of Science, the Association of Public and Land-grant Universities, and the University of Michigan. He is a University Innovation Fellow of the Hasso Plattner Institute of Design at Stanford University, a Global Shaper of the World Economic Forum, a member of the United Nations Major Group for Children and Youth, and an editor of the Journal of Economic Development in Higher Education. Find him on Twitter at @ShalinJyotishi.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

US Global Warming Emissions under President Trump: 3 Striking Findings from New EPA Report

UCS Blog - The Equation (text only) -

Photo: USCapitol/Flickr

The EPA just released a draft of its latest annual report on US greenhouse gas emissions. The report documents the amount of heat-trapping gases—carbon dioxide, methane, and more—that the US has released into Earth’s atmosphere. This year’s edition includes data through 2017 and is notable because it is the first to cover global warming emissions from the Trump era.

To spare you the trouble of digging through all 667 pages of the full report, here are a few key findings that run the gamut from good to bad to downright ugly.

1. The good: Total US emissions decreased slightly in 2017

Good news everyone! US greenhouse gas (GHG) emissions decreased 0.3 percent in 2017 compared to 2016. This decline was due largely to the continued shift from coal to natural gas, an increased use of renewable energy, and a year of milder weather that helped cut emissions from the electric power sector by 4 percent in 2017.

Emissions on a per capita and per GDP basis also fell, though population and GDP rose in 2017, offsetting some of these gains. Overall, emissions in 2017 were only 1.6 percent higher than 1990-level emissions, down from being 15.7 percent higher than 1990 in 2007.
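The 1990-baseline figures above also imply an average pace of decline over the past decade. Here is a minimal back-of-the-envelope sketch, using only the two percentages quoted in this paragraph (15.7 percent above 1990 levels in 2007, 1.6 percent above in 2017):

```python
# US emissions relative to the 1990 baseline, per the EPA inventory figures above
level_2007 = 1.157  # 15.7% above 1990 levels
level_2017 = 1.016  # 1.6% above 1990 levels

# Compound average annual change over the intervening 10 years
avg_annual = (level_2017 / level_2007) ** (1 / 10) - 1
print(f"Average annual change, 2007-2017: {avg_annual:.1%}")  # about -1.3% per year
```

In other words, even the decade of progress since 2007 works out to only a bit over one percent per year on average, context worth keeping in mind for the next two findings.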

Source: EPA Inventory of U.S. Greenhouse Gas Emissions and Sinks 1990-2017

2. The bad: Global warming emissions increased in every sector other than electric power generation

The power sector might be getting cleaner, but no other sector measured by EPA demonstrated any similar progress in decreasing emissions. Transportation, the largest single source of emissions in the US, saw emissions rise by 0.8 percent in 2017 while the industrial, residential, and commercial sectors all emitted at least 1 percent more (see Table ES-2 from the report below).

Moreover, the overall 0.3 percent decrease represented a slower rate of decline than had occurred in 2015 and 2016, when emissions dropped by 2 percent compared to each previous year. This slower rate of emissions decline means that it will be more difficult for the US to keep emissions down as population growth and an improving economy are forecast to push energy demand sharply higher over the next several decades.

Source: EPA Inventory of U.S. Greenhouse Gas Emissions and Sinks 1990-2017

3. The ugly: A small year-over-year decline in emissions is nowhere near what is needed to avoid catastrophic climate change

A 0.3 percent decrease in emissions isn’t going to cut it if we’re going to avoid truly catastrophic impacts from climate change, including more deadly heat events, extreme storms and precipitation, property-consuming sea level rise, and reduced crop yields, among other things.

To reduce the worst of these impacts—and limit global warming to 1.5°C—the Intergovernmental Panel on Climate Change (IPCC) found that global net CO2 emissions need to drop 45 percent by 2030 compared to 2010 levels, and reach ‘net-zero’ global emissions by 2050—a far cry from the 0.3 percent reduction seen in the US in 2017. (Net-zero means that any global warming emissions are offset by sinks that take carbon out of the atmosphere, like tropical forests or oceans, or by geoengineering efforts that can cool Earth’s temperature but remain largely unproven and untested.)

As the IPCC puts it, all pathways to limit the global temperature increase to 1.5°C would require “rapid and far-reaching transitions in energy, land, urban and infrastructure and industrial systems.” I wouldn’t call a 0.3 percent decrease in GHG emissions either rapid or far reaching and, given President Trump’s environmental and energy agenda of decreased regulation and more pollution, hope for sweeping action from the executive branch isn’t on the immediate horizon.

There’s still time, however. The pathways the IPCC has identified to meet the 1.5°C target don’t really start kicking into gear until 2022, with the major reductions needed by 2030 (see chart below). For the US, this means that we need to get down to 3,807.63 million metric tons of carbon dioxide equivalent emissions in 2030—a reduction of more than 41 percent compared to 2010 levels. Continued small reductions like 2017’s 0.3-percent drop just won’t get us there.

Source: Global Warming of 1.5°C IPCC Summary for Policy Makers
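As a back-of-the-envelope check, the two figures above imply both a 2010 baseline and the pace of cuts required. This sketch uses only the numbers quoted in this post (the 3,807.63 MMT target and the 41 percent reduction); the implied baseline and the required annual rate are illustrative arithmetic, not values taken directly from the EPA or IPCC reports.

```python
# Rough arithmetic check of the figures cited above. Both inputs are the
# numbers quoted in the text, not independent data.
target_2030 = 3807.63          # million metric tons CO2e, US 2030 target
cut_vs_2010 = 0.41             # "more than 41 percent" reduction vs 2010

# The 2030 target and the percent cut together imply a 2010 baseline.
implied_2010 = target_2030 / (1 - cut_vs_2010)

# Constant annual decline needed over the 20 years from 2010 to 2030.
annual_rate = (target_2030 / implied_2010) ** (1 / 20) - 1

print(f"Implied 2010 baseline: {implied_2010:,.0f} MMT CO2e")
print(f"Required average decline: {annual_rate:.1%} per year")
```

Run as written, this works out to an average decline of roughly 2.6 percent per year, nearly nine times faster than 2017's 0.3 percent drop.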

The Trump administration is moving in the wrong direction on climate change

While the urgency of acting on climate change is more obvious than ever, the Trump administration is instead doing everything it can to delay action and slow progress on cutting emissions. President Trump and his appointees at the EPA, Department of the Interior, and other agencies are pushing for more and more investment in fossil fuel extraction while simultaneously rolling back many of the policies that have helped us bring down emissions.

The administration has announced its intention to leave the Paris climate deal, undermining global cooperation on this vital issue. It’s going after rules that limit power plant emissions and methane leaks, two major contributors to climate change. And it’s trying to freeze improvements in vehicle efficiency, one of the biggest and most significant climate policies on the books.

Things could always be worse of course, and any emissions decline is better than none, but the Trump administration is clearly not setting the country on a path to prevent the worst climate scenarios from becoming reality in the years and decades ahead.

Climate Change and Groundwater: Incorporating Climate Realities and Uncertainties into California’s Groundwater Planning


Photo: Craig Ulrich/Berkeley Lab

This blog is coauthored by UCS’s Geeta G. Persad and Tara Moran from Stanford University’s Water in the West.

Dr. Tara Moran is the Program Lead for Sustainable Groundwater at Stanford University’s Water in the West

Climate change is fundamentally transforming the way we manage water in the Western U.S. The recent Fourth California Climate Change Assessment lays out the many pressures facing water managers in California in detail. One key take-away of that Assessment is that past climate conditions will not be a good proxy for the state’s water future, and smarter strategies are needed to manage California’s water. Just one-third of the snowpack that the state has historically relied on as a natural water reservoir is projected to remain by 2100; hotter temperatures will dry out soils faster and earlier in the year; and the atmospheric rivers that already cause intense flooding in the state will likely carry more moisture as the atmosphere warms. All of this will require a transformation in the way that we, here in California and elsewhere, plan for our water future.

Luckily, the science available to us today creates opportunities for water managers and others to plan for changing climatic conditions. California has traditionally relied on historical data to make inferences about future water supply and flood planning. However, climate change is altering our physical system so dramatically that historical data can no longer be used to make accurate predictions about the amount of snowfall, the duration, intensity, and frequency of droughts or floods, and many other climate conditions that affect our water resources. New data released by Cal-Adapt as part of the Fourth California Climate Change Assessment (Figure 1) provides higher spatial resolution climate projections to better estimate future climate change and extreme events. Incorporating these data, and data like it, into water management planning is likely to significantly improve public agencies’ ability to plan for changes in water supply and demand, drought, and floods. Making sure that these data are incorporated, interpreted, and used according to the best available science, though, will be vital to ensuring that water managers get the most benefits from preparing for climate change.

Figure 1| Data from Cal-Adapt shows reductions in June streamflow in the Stanislaus River projected through the 21st century by climate models. Based on data from Cal-Adapt.org.

In this blog, we’re going to walk through some best practices for climate planning in the context of managing one of California’s most important water systems: its groundwater. This builds off of our 2017 whitepaper, “Navigating a Flood of Information: Evaluating and Integrating Climate Science into Groundwater Planning in California”, published by the Union of Concerned Scientists and Stanford’s Water in the West program. With the deadline just under a year away for submission of the first round of groundwater management plans under California’s landmark Sustainable Groundwater Management Act (SGMA), many water managers are grappling with how to incorporate climate science into groundwater management planning.

Climate change in California’s Sustainable Groundwater Management Act

Governor Jerry Brown signing the Sustainable Groundwater Management Act into law in September 2014.

SGMA, passed by the California legislature in 2014, provides a key opportunity for incorporating climate change into water planning. Not only does groundwater serve as a “buffer” during dry times, groundwater aquifers can store huge volumes of water during floods, which is increasingly important as California experiences more extreme events. SGMA provides a statewide framework that incentivizes the flexible management of groundwater basins, in part because it recognizes the impacts that climate change will have on water management in California, and requires water managers to incorporate these impacts.

Climate change is incorporated into SGMA both explicitly and implicitly. Explicitly, SGMA requires the local agencies managing groundwater, known as Groundwater Sustainability Agencies (GSAs), to incorporate quantitative climate change assessments into projected water budgets using a numerical groundwater and surface water model or an “equally effective method, tool, or analytical model” (23 California Code of Regulations (CCR) §§ 345.18(c)(3) and 345.18(e)). Climate change is also implicitly present in the definition of groundwater sustainability in the legislation, which requires groundwater management to be sustainable over a 50-year planning and implementation horizon (California Water Code (CWC) § 10727.2(c)). Over this 50-year window, climate change will continue to transform the conditions under which GSAs are operating. Consequently, GSAs that are unable to incorporate climate change analyses into their planning decisions and adapt their decisions through time are likely to find themselves underprepared.

These legislative requirements create an opportunity for GSAs to understand and incorporate climate change into their Groundwater Sustainability Plans (GSPs) in a way that ensures that their basin is resilient to the increasing water instability of the future. This process is a challenging one, though. An analysis of the 18 Alternative Plans submitted to the Department of Water Resources under SGMA highlights some of the challenges of incorporating climate change into the planning process. Eleven plans incorporate climate change quantitatively. However, the majority of these (six) focus on a single aspect of climate change (Figure 2). For example, multiple plans focus on sea level rise, but don’t consider other climatic variables, like changing water supply, water demand, and hydrology, as required under SGMA. Five plans make no mention of climate change at all.

Figure 2| Summary of climate change analyses in Alternative Plans for groundwater sustainability submitted to the California Department of Water Resources under the 2014 Sustainable Groundwater Management Act.

These preliminary numbers indicate that agencies managing groundwater do not perceive the risks of climate change equally nor do they take a consistent approach to integrating climate information into the planning process. Additionally, this analysis of Alternative Plans highlights the disconnect between the climate analysis that is being done and the actual management actions that will help protect groundwater resources against climate risk. GSAs recognize that climate change is an issue for sustainable management in their respective basins, but are struggling to identify how to incorporate it meaningfully into their planning process.

Helping agencies to integrate climate change into their Groundwater Sustainability Plans

One thing has become clear during SGMA implementation – each groundwater basin subject to SGMA has a different approach to sustainability, informed by local conditions, data, resources, and a host of other factors. Thus, each GSA will achieve groundwater sustainability through implementation of a unique set of projects and management actions. For example, many basins are currently developing additional recharge capacity and investigating conservation programs to bolster supply and limit overall demand, respectively. Tailoring climate analysis to individual management actions will ensure that the hard work and investment of planning these actions will deliver the desired benefits in the long run.

In our 2017 whitepaper, we laid out an evaluation framework designed to help GSAs do this. The framework walks GSAs through four main steps (Figure 3):

  1. Determine management objective(s). The first step an agency should take is to decide their management objectives – for example, initiatives like conservation targets, developing new water sources, or investing in increased recharge. This information will inform the type of climate analysis necessary to support the objectives.
  2. Evaluate existing climate change information. The second step is to understand the available climate data. The framework provides information to help agencies understand the components of climate data, assess whether the available data is appropriate for their management objectives, and identify gaps that may require new data or analysis.
  3. Select appropriate climate change information. The third step is to connect an agency’s management objectives with the information gathered about the available climate data. Connecting management objectives with climate data allows management agencies to select the data and analysis appropriate for their application. For example, agencies will want to consider whether the available climate data and proposed analysis matches the level of risk tolerance that is appropriate for a particular application and whether they capture the relevant impacts of climate change.
  4. Stress test management objectives. This final step ensures that agencies understand the climate conditions that might cause their management objectives to fail, so that they understand the likelihood and potential warning signs of failure and can plan accordingly.

Figure 3| A framework for integrating climate change analysis into groundwater management planning (adapted from Christian-Smith et al. 2017).
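To make the four steps above concrete, here is a minimal, hypothetical sketch in Python. Every name in it (ManagementObjective, ClimateDataset, select_datasets, covers_extreme) is invented for illustration; no GSA tool or SGMA regulation prescribes this structure.

```python
# A toy sketch of the four-step framework. All class and function names
# here are hypothetical, invented purely to illustrate the workflow.
from dataclasses import dataclass, field

@dataclass
class ManagementObjective:
    name: str
    required_variables: set   # climate variables the objective depends on (step 1)
    risk_tolerance: str       # e.g. "low" for costly, inflexible projects

@dataclass
class ClimateDataset:
    name: str
    variables: set            # what the dataset provides (step 2)
    scenarios: set            # emissions scenarios it spans

def select_datasets(objective, datasets):
    """Step 3: keep only datasets that cover the objective's variables."""
    return [d for d in datasets if objective.required_variables <= d.variables]

def covers_extreme(dataset, extreme_scenario="RCP8.5"):
    """Step 4 (partial): can this dataset support a stress test under an
    extreme emissions scenario?"""
    return extreme_scenario in dataset.scenarios

# An inflexible, high-cost objective: a recharge basin driven by flood events.
recharge = ManagementObjective(
    "recharge basin", {"max_daily_rainfall", "runoff"}, risk_tolerance="low")
datasets = [
    ClimateDataset("coarse annual averages", {"annual_rainfall"}, {"RCP4.5"}),
    ClimateDataset("downscaled daily extremes",
                   {"max_daily_rainfall", "runoff"}, {"RCP4.5", "RCP8.5"}),
]

usable = select_datasets(recharge, datasets)
print([d.name for d in usable])        # only the daily-extremes dataset qualifies
print(covers_extreme(usable[0]))       # an extreme-scenario stress test is possible
```

The point of the sketch is the filtering logic: an annual-average dataset is screened out for an objective that depends on daily extremes, which mirrors the recharge-basin example discussed below.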

Tailoring climate change information to the management objective

The core question in any climate change analysis should be, “Is the analysis being conducted commensurate with the risk, cost, lifespan, etc. of the water management activity or project being undertaken?” Management strategies that can be readily adapted with minimal overhead costs may not require extensive climate change analysis. However, high-risk, high-cost infrastructure projects that cannot be readily adapted to changing climatic conditions likely warrant a thorough climate change analysis. Going through the thought process laid out above will allow water managers to ensure that the climate analysis they conduct is appropriate to the water management actions being undertaken in their basin.

To demonstrate why agencies need to tailor their use of climate change information to the management objective at hand, we provide two examples of management actions that a GSA might pursue to achieve groundwater sustainability: developing a conservation target and building a recharge basin.

A management action like a conservation target – for example, a basin aiming to reduce water consumption by 25% over the next 20 years – is fairly flexible. The effectiveness of the target is most likely to be affected by annual-average basin conditions, rather than particular types of short-term extreme climate events. The target can also be adapted as conditions change. For this type of project, median climate changes to the annual average water budget will likely provide a sufficient picture of how climate change is likely to affect the effectiveness of the target. Even for a flexible management action like a conservation target, it may be useful for an agency to “stress test” how its planned target would need to be modified in the face of an extreme climate change scenario. Doing so would provide an agency with sufficient information to communicate to stakeholders and to develop contingency plans to adapt to extreme conditions before they occur.

The delivery point for the Coachella Valley Water District’s Thomas E. Level Groundwater Replenishment Facility.

A management action like building a new recharge basin, meanwhile, requires extensive planning, permitting, labor and capital costs. Once decisions about project conveyance, location, and capacity have been made, they can be difficult to modify without substantial additional costs. Recharge facilities are also impacted by different aspects of climate change than are conservation targets or other management actions. The effectiveness of a recharge basin, relying on diverted, excess floodwaters to recharge a groundwater aquifer, will depend on the frequency, location, and volume of extreme floodwater events. Understanding how extreme floodwater events are likely to change over a project’s 50-year lifespan may significantly alter a project’s design specifications or overall value. This assessment requires information on how climate change will affect maximum daily rainfall, runoff rates, and other climatic variables, rather than annual average rainfall. The relative inflexibility of built projects like recharge basins also makes it even more important to assess how the project would behave in the face of the most extreme types of these events under the most extreme climate change conditions to avoid costly failures or obsolescence. 

What should agencies know about climate change data?

As the examples above show, tailoring climate change information to the management objective is crucial for getting the benefits of smart climate planning. But this requires an understanding of how that information is generated and what it contains, as we lay out in detail in our whitepaper.

Most of the information and data about future climate conditions that basins are likely to face comes from the climate projections generated by global climate models, which are essentially numerical representations of the physics of the climate system. To produce future climate projections, global climate models use scenarios of what future climate-warming emissions could be. Because we don’t know what choices society will make that could impact greenhouse gas emissions, a range of possible future emissions scenarios are used. Some of these scenarios generate more severe climate change and some of them generate less severe climate change.

The future climate projection data that comes out of the model will have a certain spatial resolution – the grids of climate model outputs are analogous to pixels composing an image – as well as a characteristic time frequency (e.g. hourly, annually, or monthly) at which the future climate conditions are provided.

When the data comes out of the global climate model, its gridding is at a much larger scale (e.g. 12 grid boxes for the whole state of California) than what a basin-wide or even state-wide hydrological model would need. Translation between these different grid sizes is done through a process called downscaling (i.e. scaling down the grid of the global climate model to be compatible with the grid of the hydrological model). Downscaling can be done in different ways, each of which introduces different biases into the projected climate conditions that will eventually serve as inputs for hydrological models.

As GSAs evaluate and select their climate information for a particular application, here are some of the questions they might ask:

  • What is the spatial resolution of the climate data? Is it sufficient for my application or objective? Does it tell me enough about my basin to make informed decisions?
  • What downscaling technique has been used? Does it retain the type of climate impacts from the original climate model projection that matter for my management objective?
  • What climate variables, such as precipitation or temperature, are included? Do they capture the aspects of climate change that matter for my management objective?
  • What future emissions scenarios are used to generate the climate data? Do they span the range of potential climate change severity that matches the risk tolerance appropriate for my use?
  • How are uncertainties captured? Am I sampling an adequate range of climate models to capture the ways that climate change might evolve in response to a given future emissions scenario?
  • How will the climate data be incorporated into hydrologic models? Is the hydrologic model designed to incorporate all of the aspects of the climate data that I care about?

Asking these questions, although they may require grappling with new concepts, will help water managers to get the greatest benefits from their climate planning. Our whitepaper provides additional background on interpreting and selecting climate data and on best practices for incorporating climate change into sustainable groundwater management.

Getting the most out of climate change planning

Understanding and incorporating climate science into water planning can help water managers make the most of changing conditions. Doing so, however, will require water managers and climate scientists to work together to understand water management objectives and to tailor climate data to those objectives. Before investing in costly infrastructure projects and management actions, GSAs should think carefully about how the information gained from their climate analysis will influence their management decisions. Additionally, climate change analysis should be commensurate with a management action’s risk, cost, and lifespan. As climate change progresses and our understanding of the science and risks increases, building flexibility and iteration into the planning process will also provide benefits. At the end of the day, climate change planning needs to be incorporated not only into water budgets but also into individual management actions to ensure their resilience and long-term viability. Doing so will help ensure that water management agencies are developing actions and projects that will continue to provide vital water resources to Californians in our changing climate future.


Trump’s Tariffs May Have Cost the Solar Industry Thousands of Jobs Last Year


Source: The Solar Foundation

The latest solar jobs census has just come out, and the news is… mixed. Here’s what the survey found, why the numbers are that way, and how we get the whole country on track.

Solar job numbers fall

The National Solar Jobs Census from The Solar Foundation is the non-profit’s annual review of “the size and scope of employment in the US solar energy industry… [and] the most comprehensive and rigorous analysis of solar labor market trends in the United States.” As such, it’s a really important tool for assessing our job-creation progress in what had been, until the last two years, a key growth area within our energy sector.

For 2018, the numbers are a mixed bag. Overall numbers for people employed in solar in 2018 stood at around 242,000—down 8,000, or more than 3%, from the year before.

At the level of individual states, there have been winners and losers. California’s drop in solar employment alone (down 9,600) could account for the whole overall drop, though with 77,000 solar workers it’s still by far the biggest state for solar jobs (and #3 in solar jobs per capita). Massachusetts, which had been #2 in overall solar jobs, also lost—with 1,300 fewer people employed in solar—and dropped to #3 by that overall-solar-jobs metric.

In all, 21 states lost ground, including 4 of the top 5 solar states by installed solar capacity (California, plus North Carolina, Arizona, and New Jersey).

On the plus side, that still leaves 29 states gaining solar jobs. Those include Florida (up 1,800, to take the #2 spot for total jobs), Illinois (up 1,300), and Texas and New York (each up 700+). And the 2018 total solar jobs figure is still 16% higher than the 2015 number.

But the annual drop—the second in a row—is concerning. So the next logical question is: Why are we losing solar jobs at all?

Why solar jobs have fallen

Many of the reasons for the recent drop in solar workers are pretty clear, actually, and start at the top:

  • Trump solar taxes – “Much of the decline,” says the new report, “is attributable to uncertainty over the outcome of the Section 201 trade case on solar modules and cells.” This is that solar tariff issue that was brewing for much of 2017 and finally settled in 2018, with substantial taxes on virtually all imported solar goods. Those tariffs, and particularly that uncertainty surrounding how high those taxes would be, and when they’d hit, were pretty disruptive. And, by slowing down progress with a key technology (and proven job creator), were a blow to any notions about US “energy dominance.”
  • State policies – At a more local level, states were certainly another piece of the pain, with uncertainty being the enemy of investment. California was figuring out what new high bars to set for itself in the climate and energy spaces. Massachusetts continued to wrestle with coming up with a worthy successor to its solar policies that had been so successful at building the local industry over the previous decade.
  • 2016 – Another noteworthy factor is the industry’s own big push in 2016, when it looked like the sizeable federal investment tax credit (ITC) was set to expire; that led to a full court press by industry and customers, and a banner year. That pull-forward helps explain 2017’s dip (as the market recalibrated a bit), but not necessarily 2018’s; with a supportive policy environment from the top on down, this past year could have been much stronger.

One thing that doesn’t seem to be a reason for a drop in solar jobs is competition from coal. Given our president’s purported focus on jobs, and his (unhealthy) obsession with fossil fuels, you might well imagine that the job losses in solar have been made up in coal mining. The facts, though, say otherwise: The 1,800 new coal mining jobs from January 2018 to last month stacked up to less than a quarter of the 8,000 jobs lost in solar.

(Those increases also brought coal mining up to an estimated 52,700 jobs, which, astute readers will notice, even if Pres. Trump doesn’t, means that there are still 4.6 people employed in solar for every 1 in coal mining.)
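That solar-to-coal ratio is easy to verify from the two figures quoted above (both are the approximate totals cited in this post, not independent data):

```python
# Jobs figures as quoted above (approximate).
solar_jobs = 242_000          # US solar workforce, 2018
coal_mining_jobs = 52_700     # estimated US coal mining jobs

ratio = solar_jobs / coal_mining_jobs
print(f"{ratio:.1f} solar workers per coal mining worker")
```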

Credit: Dennis Schroeder/US Department of Energy, via Flickr

The way forward

So, how about 2019 and beyond? There are reasons to think some of the job numbers might right themselves. The strong performers in 2018 look set to continue to perform well:

  • Florida (the Sunshine State) has finally discovered solar power, as regulators finally began to let homeowners take advantage of solar leases, a financial tool which has been a powerful driver elsewhere.
  • Illinois has been buoyed by the solar provisions in its important 2016 Future Energy Jobs Act.
  • Texas is finding that large-scale solar’s really low prices are a good complement to its nation-leading wind fleet.
  • Nevada is looking good after fixing a bad decision about net-metering a couple of years ago.

Meanwhile, erstwhile leaders look to be getting back on track. California has a new requirement to put solar on virtually every new home, and a goal to get 100% of its electricity from carbon-free sources. Massachusetts finally has its new SMART solar incentive in place, removing the policy uncertainty for at least a little while.

But this is also time for national leadership on clean energy policy. If the White House doesn’t feel compelled to provide it, at least many in Congress do—with Exhibit A being the Green New Deal. UCS was “excited and heartened” to see the GND resolution introduced last week, with its “bold, ambitious vision for how to address climate change in a principled, equitable and science-based manner.”

From where I sit, it seems pretty clear that any bold vision for our climate future has got to include a large role for solar power, with all its potential for cutting carbon, improving public health, enhancing resilience, empowering communities, and, yes, creating jobs.

Getting us squarely on the path toward climate sanity will involve some tough choices. Solar isn’t one of them.

Meanwhile, solar panel installer is the fastest-growing job in 8 states, according to Bureau of Labor Statistics numbers examined by Yahoo Finance. There are a lot of reasons to think that number of states should be a whole lot higher.

Check out The Solar Foundation’s infographic for more facts and fun.

People are the Purpose of Science


Nakala was only four months old, a chubby and cherubic baby, when I saw her that summer for a routine check up at our pediatric clinic in Flint. Her mom Grace told me she was going to stop breastfeeding. That didn’t surprise me. I tried to convince her otherwise, but for young moms in Flint, it’s often regarded as a complicated hassle — and hard to do when you’re struggling to hold down a job.

Grace planned to mix powdered formula with tap water for Nakala. She asked me pointedly if the Flint water was okay for that.

“Sure,” I said without hesitation. “Don’t waste your money on bottled water.”

It was August, 2015. I’d heard some news reports of citizen protests about the drinking water; it had become background noise. As a pediatrician, all I had to go on, besides the fact that it was twenty-first century America, was what the other experts, the scientists who worked for the state, had to say.

And they were adamant. Oh yes. They were cocksure and confident — repeatedly issuing statements that there was no room for doubt. The tap water in Flint was fine. The mayor had even gone on TV, turned on a Flint faucet — and drank some.

What is science for?

Scientists like to talk about what they are “solving for” in their work. In classrooms all over the world, students are told that the purpose of science is “explaining and predicting our world.”

Is that enough?

I don’t think so. Not after what I discovered that summer in 2015. Explaining the world isn’t enough. Predicting isn’t either. In my book about the Flint water crisis, What the Eyes Don’t See, I share the story of how the most egregious present-day example of science denial unfolded — and how the government scientists knew that a powerful neurotoxin, lead, was present in the Flint water for months, but did nothing. Instead, they hoped their expertise and titles would shield their lies from being exposed. In Flint, ignoring science led to the poisoning of an entire city’s water system.

It pains me that so many of the people who should have been looking out for the children of Flint, but who failed them instead, were scientists: doctors, epidemiologists and engineers. I know it seems an exaggeration to compare what happened in Michigan to something as terrible as the Nazi doctors who participated in the Holocaust, the scientists involved in the Tuskegee syphilis study, or the military psychiatrists who participated in torture.

But is it all that different?

Our children cannot afford to have science and scientists shut their eyes, look away, and stay silent to injustices.

Solving for human progress

In What the Eyes Don’t See, I reflect on the work of some groundbreaking scientists — primarily the big troublemakers of public health, Alice Hamilton and John Snow. Rather than going along with consensus or standing on the sidelines, they were passionately involved in their communities. Their work wasn’t about abstract scientific discovery alone. It was about people and community, working in partnership.

That is what science should be about — and what scientists should be solving for. It isn’t just an academic exercise for the ivory tower, to rack up publications, grants, and offers of tenure. Sure, being able to increase our understanding of the world around us is essential. And making better predictions is crucial. Without question, scientific advances are a foundation of modern civilization and the economy. But as twentieth-century history illustrates so well, scientific advances aren’t limited to wonders such as antibiotics. They also include such evils as nuclear weapons. The consequences of advances in science, and the application of technology, cannot be divorced from scientific discovery.

Discoveries alone aren’t enough. Science should be solving for human progress. The promise of science is how people and communities – and the environment – benefit from scientific inquiry and innovation.

Simply put, the purpose of science must be to do good. And a logical extension is that the paramount mission of all scientists is to be charged with doing good. No matter what articles of faith obstruct the path. No matter how far we have to step from the comfort of classrooms, hospitals, laboratories, and campuses. Scientists must be constructive participants in the communities that we are privileged to serve, and do this in a spirit of humble partnership, walking and working together, shoulder to shoulder.

The point is people

The face of Nakala kept returning to me in the months that passed — throughout the stressful and contentious remainder of 2015, throughout the lawsuits, charges and trials that unfolded after the Flint Water Crisis was exposed.

Nakala’s face still comes to me now, three years later, whenever I’m asked if the tap water in Flint is finally okay to drink.

Speaking science to power should be part of the mission of the doctor, the researcher, the academic — all scientists everywhere. Disrupting the status quo for disruption’s sake alone is not enough. We should be elevating human life and protecting the environment.

Scientific education, be it in medical, engineering, natural or physical science, often misses this point. Graduate schools and scientific organizations tend to educate, and only educate, and wait for others to blow the whistle in the name of public health and the environment.

The point of a science education — any science education — should be about people. And not en masse, as a statistic, but person to person. It should be about benefiting lives, doing good, improving outcomes. There should be more training in communications, public speaking, and policy-making so scientists can be better communicators and advocates for our discoveries and their benefits.

Inclusion and diversity are a critical part of all this, not just as restorative justice, but as a means to connect to the higher purpose of science, which is to benefit more people and more places. The recipients of scientific advances have to extend beyond the rich and white. And the injustices associated with industry and technology must not fall disproportionately on the poor and brown.

Science was an integral part of what happened in Flint – and is still happening. Ignoring science was a cause of the water crisis. Embracing science was how the fight to reveal the lies and cover-ups was won. And now, it is leading the city to solutions. The emerging science of child development and brain plasticity is helping to build resilience in kids like Nakala, to buffer the impact of the crisis and create a playbook of hope for children everywhere.

My hope is that Flint will serve as a lesson of the consequences of science denial, and also of the incredible power that science and scientists hold – in beakers and at the bedside – to be catalysts for good.

Mona Hanna-Attisha is a pediatrician, scientist, and professor in Flint, Michigan. She is the founder and director of the Pediatric Public Health Initiative. She is author of the 2018 New York Times 100 Notable Book and NPR’s Science Friday Best Science Book of 2018, “What the Eyes Don’t See: A Story of Crisis, Resistance and Hope in an American City.”

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

EPA Head Lies about Fuel Economy Fines in Push for Weaker Car Standards

UCS Blog - The Equation (text only) -

Fiat Chrysler spinning itself in circles as it chooses to pay fines and buy credits from competitors instead of investing in efficiency for the long-term. Photo courtesy of FCA

In an interview with Bloomberg Media on February 4th, EPA Acting Administrator Andrew Wheeler stated that manufacturers have paid $77 million in fines for not complying “with the current Obama numbers,” going on to say that “it’s incorrect to say that the automobile manufacturer can comply with the Obama numbers. We want a more realistic number.”

In waging this war on “the Obama numbers”, Andrew Wheeler is waging a war on facts in order to increase pollution from passenger cars and trucks and force consumers to pay more at the pump, lining the pockets of the oil industry with whom he has met repeatedly in his short tenure at EPA.

Fiat-Chrysler is paying a fine…for repeating history

One part of the story is correct: Fiat Chrysler is paying $77 million in fines as a result of the inefficiency of its 2016 model year fleet. However, this fine is not because the company is out of compliance with the standards set under the Obama administration (in fact, it is in compliance). Instead, Fiat Chrysler is paying a fine for violating a Congressional law meant to prevent the very actions which set in motion the bailout of Chrysler!

In the 2000s, Chrysler and other domestic manufacturers had invested heavily in SUV and light truck production in the United States, essentially ignoring investment in passenger cars. When oil prices rose, they were completely unprepared for the market shift away from these big gas guzzlers and towards the more efficient passenger cars made by their competitors. The result of this negligence was massive layoffs of domestic workers.

In 2005, General Motors announced the closure of 12 manufacturing plants, resulting in the loss of 30,000 jobs across North America. In 2006, Ford announced eliminations of up to 30,000 jobs and 14 factories. In 2007, Chrysler announced cuts to 13,000 jobs in North America and at least partial closures of 4 plants. This massive economic catastrophe was the result of a business strategy that ignored the possibility of a changing market and the inherent fluctuations of a volatile oil market, putting short-term profits over smarter, longer-term investments.

Congress says enough is enough

The long-reaching impact of these lay-offs is apparent in Michigan today, even as many of these jobs have returned. And in 2007, Congress sought to put an end to the detrimental behavior that cost the public so much.

In the 2007 Energy Independence and Security Act (EISA), Congress set a mandatory limit for a manufacturer’s domestically produced passenger car fleet—no longer would a manufacturer be allowed to ignore investment in a robust portfolio of efficient vehicles produced in North America. In order to make sure the bailouts, layoffs, and economic turmoil brought about by shortsighted investment strategies would not be repeated, the law requires that every manufacturer’s domestically produced passenger car fleet achieve an average fuel economy no more than 8 percent worse than the average car sold in the United States.
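The 8 percent floor is simple arithmetic. Here is a minimal sketch, with hypothetical mpg numbers; this is an illustration only, since the actual CAFE calculation uses harmonically averaged fleet fuel economy over each model year, not a simple average:

```python
# Illustrative sketch of the EISA domestic passenger-car floor.
# Simplification: real CAFE math uses harmonic averaging, and the
# numbers below are hypothetical, not actual standards.

def domestic_minimum_mpg(industry_average_mpg: float) -> float:
    """A manufacturer's domestic car fleet may average no more than
    8 percent worse than the average car sold in the United States."""
    return 0.92 * industry_average_mpg

def complies(domestic_fleet_mpg: float, industry_average_mpg: float) -> bool:
    return domestic_fleet_mpg >= domestic_minimum_mpg(industry_average_mpg)

# With a hypothetical industry average of 37.5 mpg, the floor is
# about 34.5 mpg, so a 33.0 mpg domestic fleet would pay fines.
print(complies(33.0, 37.5))  # False
print(complies(35.0, 37.5))  # True
```

A fleet that falls below the floor, as Fiat Chrysler’s did, owes civil penalties rather than simply losing credits.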

Fiat-Chrysler tells the American people: We don’t care

In 2007, Congress tried to prevent future crises by passing EISA. And in 2008, the American taxpayer bailed out Chrysler and General Motors, assuming that these companies had learned their lesson. But just seven years later, Fiat Chrysler flouted the requirements set out by Congress to avoid another bail-out and protect American jobs.

In 2015, Fiat Chrysler knew that it was going to fall well short of the requirements on North American production of efficient passenger cars. Yet in 2016, the company doubled down on its strategy, not only refusing to improve the efficiency of its domestic fleet but scrapping production of its  most efficient vehicles entirely. This was a conscious and deliberate choice to ignore Congress and the goodwill of the American people in bailing out the failing company by repeating history. The penalty for doing so was a $77 million price-tag they were willing to pay.

Fiat-Chrysler is being fined because they are falling short of their competitors and focusing on short-term gains in place of long-term investment, exactly the behavior the law they broke was meant to combat.

EPA wants you to believe that manufacturers can’t meet standards…and so does Fiat Chrysler

Andrew Wheeler’s EPA is in the process of undoing the regulations that have continued to make vehicles of every size and type more efficient. He’s already claimed ridiculous things about the impacts of these rules, but now he’s adding a new weapon in his quest to harm American consumers: lying about whether manufacturers are in compliance with these standards.

In fact, the document published by NHTSA disclosing these fines shows quite clearly that manufacturers continue to comply with the CAFE program. Even Wheeler’s own EPA shows that manufacturers continue to comply, in part by using credits earned for exceeding expectations in the program’s early years to buy time to meet the tougher future standards. Manufacturers have prepared for those standards with a wide range of technologies, whether mild hybridization of some of the largest vehicles on the road, deployment of dozens of new electric vehicles, or the “holy grail” of internal combustion engines.

Fiat Chrysler has been quite clear about how it feels about regulations—it would rather pay the U.S. government fines than provide customers with efficient options. And it has lobbied Andrew Wheeler for a dramatically weaker program in order to continue doing so. But one company’s strategic indifference to fuel economy improvements is not justification for rolling back a program that is cutting global warming emissions, reducing our use of fossil fuel, and saving consumers billions at the pump.

Minnesota Bill HF700 Considers Bold Carbon-Free Energy Target

UCS Blog - The Equation (text only) -

Photo: Tony Webster/Flickr

Last week Minnesota Representative Jamie Long (DFL – Minneapolis) introduced HF700, a bill laying out a bold plan to achieve 100 percent carbon-free energy for the state.  Last Tuesday an informational hearing was held in the House Energy and Climate Finance and Policy Division, where dozens of Minnesotans testified in support of the bill. They stressed the need for Minnesota to be a national leader on clean energy, and the dire consequences of waiting to act on climate change.

In 2007, Minnesota passed the Next Generation Energy Act which set a 25 percent Renewable Energy Standard (RES) by 2025 (30% for Xcel Energy), and set a carbon reduction goal of 80 percent by 2050. Minnesota has met its RES goal 7 years ahead of schedule, but the state is not on track to meet its carbon reduction goal.

So, what’s in HF700, and what would its passage mean for Minnesota?

What’s in the bill

HF700 calls for electric utilities in the state to get 55 percent of their power from renewable sources by 2030, 80 percent by 2035, and to go 100 percent carbon-free by 2050. The state’s largest investor-owned utility, Xcel Energy, has a higher standard to meet: 55 percent renewable energy by 2026, 60 percent by 2030, 85 percent by 2035, and 100 percent carbon-free by 2045.
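The staggered schedule reads as a simple lookup: a utility’s requirement in any year is the most recent milestone it has passed. A minimal sketch of that schedule (variable names are mine, not the bill’s; note that the milestones through 2035 are renewable shares, while the final milestone is a carbon-free rather than strictly renewable requirement):

```python
# Illustrative encoding of the HF700 schedule described above.
# Hypothetical names; milestones through 2035 are renewable-energy
# shares, and the final milestone is a carbon-free requirement.
DEFAULT_TARGETS = {2030: 0.55, 2035: 0.80, 2050: 1.00}
XCEL_TARGETS = {2026: 0.55, 2030: 0.60, 2035: 0.85, 2045: 1.00}

def required_share(year: int, targets: dict) -> float:
    """Return the most recent milestone at or before `year`
    (0.0 before the first milestone takes effect)."""
    passed = [share for y, share in sorted(targets.items()) if y <= year]
    return passed[-1] if passed else 0.0

print(required_share(2032, XCEL_TARGETS))     # 0.6
print(required_share(2032, DEFAULT_TARGETS))  # 0.55
```

The lookup makes the two-track design visible: in any given year between 2026 and 2045, Xcel’s required share sits above the default schedule for other utilities.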

The bill adds a definition of carbon-free to the RES, defined as a technology that generates electricity without emitting carbon. However, it doesn’t state which specific technologies would qualify as carbon free, such as nuclear energy or Carbon Capture and Storage (CCS). This is relevant because two of Minnesota’s nuclear plants, owned by Xcel Energy, have operating licenses set to expire in the 2030s.

The bill removes trash incineration from the definition of renewable energy sources and incorporates environmental costs as a factor that the Minnesota Public Utilities Commission (PUC) must consider if a delay in meeting these benchmarks is requested by a utility. The bill also directs the PUC that in evaluating a utility’s claims of transmission capacity constraints, it must consider whether the utility has taken all reasonable measures to meet the requirements with renewables.

The legislation also includes an expanded section on local benefits, including important equity considerations like directing the PUC to ensure equitable implementation in the following areas:

  • The creation of high-quality jobs in Minnesota paying wages that support families;
  • Recognition of the rights of workers to organize and unionize;
  • Ensuring that workers have the necessary tools, opportunities and economic assistance to adapt successfully during the energy transition, particularly in communities that host retiring power plants and contain historically marginalized and underrepresented populations;
  • Ensuring that all share the benefits of clean and renewable energy and the opportunity to participate fully in the clean energy economy;
  • Ensuring that air emissions are reduced in communities historically burdened by pollution and the impacts of climate change; and
  • The provision of affordable electric service to Minnesotans, particularly to low-income consumers.

Alignment with Xcel’s recent commitments

This bill fits nicely with what Xcel has already committed to and allows the company a lot of flexibility in achieving their long-term goals.

In December, Xcel Energy announced its commitment to 100% carbon-free electricity by 2050. Within that plan, Xcel also set a near-term goal of an 80% carbon reduction by 2030.

With existing technologies such as wind, solar, and energy efficiency, Xcel will be able to achieve its near-term goal. Additionally, energy storage costs are expected to continue to decrease, which will allow for even more renewable energy penetration. The long-term goal will be harder to achieve and, in the company’s words, will require technologies that are not yet cost-effective or commercially available. However, Xcel is committed to ongoing work to develop advanced technologies while putting the necessary policies in place to achieve this transition.

My colleague gives a variety of potential pathways for how they’ll be able to get all the way to zero, including unlocking the full potential of dispatchable renewables, energy efficiency, flexible demand, energy storage, and other technologies such as carbon capture and sequestration.

Learning from our friends to the west

California passed a similar bill in 2018, setting a bold goal of 100 percent carbon-free electricity by 2045 and increased the renewable energy standard from 50 to 60 percent by 2030. UCS developed 10 key strategies that state policymakers and stakeholders could follow to achieve these goals and make the electricity grid more flexible while reducing fossil fuels.

Minnesota could benefit from implementing similar strategies. The  key strategies to achieve 100 percent carbon-free electricity include:

  • Use electricity as efficiently as possible to reduce peak demand;
  • Generate renewable energy from a diverse mix of resources;
  • Plan for an equitable transition away from fossil fuels, including natural gas;
  • Use renewables to provide grid reliability services;
  • Invest in energy storage;
  • Unlock the value of distributed energy resources;
  • Electrify cars, trucks, and buildings;
  • Shift electricity demand to better coincide with renewable energy production; and
  • Promote high-quality jobs and workforce development.
What’s Next?

HF700, and the Senate companion bill SF850, are a powerful move in the right direction for Minnesota. UCS will be working with our coalition partners to push the state’s utilities to look at all their options and make sure that cost-effective measures, like energy efficiency, are used in the path to 100 percent.

The Senate version awaits action by the Senate Energy and Utilities Finance and Policy Committee. The bill may face an uphill fight in the Senate, but a hearing would be a great first step. Now is the time for us to move forward with 100% carbon-free and equitable energy that benefits all Minnesotans.


How the Government Shutdown Shredded Indian Health Services

UCS Blog - The Equation (text only) -

Photo: Wikimedia commons

The end of the partial government shutdown last month could not have come soon enough for Native Americans. For them, the shutdown amounted to a full-scale insult.

In President Trump’s attempt to wall off brown immigrants, the shutdown walled in America’s first peoples. That became clear with every report coming out of Native American lands as the president held federal services hostage to his fantasy of a border wall against Latin America.

In Navajo Nation in the Southwest, leaders told the New York Times that snowed-in roads went unplowed for days because Bureau of Indian Affairs staff were furloughed. The conditions blocked residents from making essential trips of up to 100 miles to procure basic needs, including medicine, until unpaid Navajo Nation employees were able to finally clear the roads.

Also in Navajo Nation, the Associated Press wrote about a 68-year-old woman who underwent eye surgery but could not get a referral from furloughed Indian Health Service (IHS) staff to deal with high pressure in her eye. In Minnesota, the Red Lake Nation told the Minneapolis Star Tribune that it suspended construction of a dialysis center.

On January 10, Native American Lifelines, an IHS health care contractor, announced the immediate suspension of dental services, and curtailment of financial assistance programs and ride share services, except in emergencies. Lifelines Executive Director Kerry Hawk Lessard told MedPage Today that the suspension meant halting addiction counseling and advocacy for lonely seniors hospitalized with severe illnesses. She also told the Washington Post: “We have thus far had to deny purchase of care requests that are critical to chronic-care management—insulin, blood pressure medication, thyroid medication and antibiotics—thus impacting the quality of life for the individuals we serve.”

Another unjust abrogation of the federal government’s obligations

During the shutdown, which ended January 25, 60 percent of the IHS workforce—some 9,000 employees—worked without pay to serve 2.3 million Native Americans and Alaska Natives. The Huffington Post featured Anpetu Luta Hoksila, a 35-year-old Native American psychologist with the Indian Health Service who was deemed “essential” by the government and thus provided care without pay in Arizona. Hoksila said that if the shutdown persisted, he would quit the service to become a barista to pay the bills, noting: “On some level, it’s kind of pitiful, but I don’t care.”

It is more than pitiful what the shutdown did to Indian Country. It was yet another unjust abrogation of the federal government’s obligations to its original peoples. Just before the shutdown began last month, the US Commission on Civil Rights issued a report appropriately titled, “Broken Promises.” It said that despite the 375 treaties signed by tribes that were supposed to result in unique federal support of services in exchange for forced removals and relinquishing of land, the US government has “chronically underfunded” Native American health programs.

Despite chronic health disparities of many kinds suffered by Native Americans, per-person health care expenditures for the Indian Health Service in 2017 were only a third of federal health care spending nationwide. By further comparison, the National Tribal Budget Formulation Group, which makes recommendations to the IHS, said in a report last April that the budget of the Veterans Administration is 14 times that of the IHS while serving only four times the population.
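Those two ratios imply the per-person gap directly; as quick illustrative arithmetic using the figures reported above:

```python
# Per-person comparison implied by the figures cited above:
# the VA budget is ~14x the IHS budget, while the VA serves
# only ~4x as many people.
budget_ratio = 14       # VA budget / IHS budget
population_ratio = 4    # VA patients served / IHS patients served

per_person_ratio = budget_ratio / population_ratio
print(per_person_ratio)  # 3.5 -> roughly 3.5x more spent per VA patient
```

In other words, the VA spends roughly three and a half times as much per person as the IHS does.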

This is particularly appalling since this is old news. The same commission published a report in 2003 that said, “The federal government spends less per capita on Native American health care than on any other group for which it has this responsibility, including Medicaid recipients, prisoners, veterans, and military personnel.” The underfunding has persisted for so long that the backlog for new health care facilities in Indian Country has reached $10 billion. To achieve health care parity, the tribal formulation group and the National Congress of American Indians say the current annual agency budget of a little over $5 billion needs to be increased to $32 billion by 2030.

“Our Indian communities are combating ongoing historical trauma not unlike that of untreated PTSD due to war experiences,” the tribal budget group said. “We have patients who have lost limbs due to untreated diabetes or unintentional injuries associated with the third world environments in which we live. Health care is rationed and expectations for quality care in outdated facilities and equipment are so low that patients have nearly lost all hope... The message is clear: the Indian Health System has failed its mission.”

“It’s like doing your job with both hands tied behind your back and blindfolded.”

The shutdown was a dangerous reminder of how thinly the agency is stretched in normal times. Last year a report from the Government Accountability Office found that on average:

  • One out of every three and, in some areas, every other IHS physician position goes vacant
  • A quarter of nursing positions are vacant, and up to a third in some areas
  • A third of nurse practitioner positions are vacant, and up to half in some areas

Mary Smith, an acting director of the IHS during the Obama administration, and currently a health care consultant and secretary of the American Bar Association, fears that the shutdown exacerbated the difficulty the service has in attracting committed talent (a parallel to the multitude of federal scientists who wonder if government service is worth the current political instability). Despite its struggles, the service has successfully engaged communities to cut kidney failure from diabetes 54 percent since 1996.

“Forget doing your job with one hand tied behind your back,” Smith said. “It’s like doing your job with both hands tied behind your back and blindfolded.”

During the shutdown, the system’s already grave problems led the Grand Traverse Band of Ottawa and Chippewa Indians in Michigan to issue a press release saying, “People will die because of the shutdown.” The shutdown fortunately ended before reaching that point, at least judging from the absence of news reports, but given disproportionate levels of dire illness, long distances to care, and underfunded programs that the shutdown ceased altogether, it is much too easy to imagine scenarios in which another shutdown could indeed cost lives. If there were to be another shutdown, it is imperative that health services to Native Americans be exempt from closure.

In an ominous double-whammy of an example, the Associated Press cited the case of Michelle Begay, who was furloughed from an administrative position at the IHS, then had no health insurance because her application was held up by the shutdown. Begay came down with bronchitis. She paid $600 out of pocket to be treated the first time. When the bronchitis came back, she had to call for three days to get an appointment at an IHS clinic. She told the AP, “I was very fortunate. My situation was treatable. My lung didn’t collapse, that’s what they were really concerned about. But, still, I had to wait two, almost three days to be seen.”

Begay waited at risk of lung collapse, a psychologist threatened to become a barista, and a senior could not get her eye pressure checked, all because Trump wants to erect a wall.

For decades, Native Americans have faced a wall to better health. It’s past time to tear that wall down.


Big Dairy Is Looking to Sell More Milk—and a Perception of Better Health

UCS Blog - The Equation (text only) -

The dairy industry has been busy lately. Or should I say, “Big Dairy,” a powerful collective of deep-pocketed lobby groups including the International Dairy Foods Association and multinational corporations like Land O’Lakes and Dean Foods. In total, these and other big industry players spent $7.4 million on lobbying during 2018—and the payoff is showing up in various new government policies.

Just since December, for example:

  • The 2018 Farm Bill included a new section, “Healthy fluid milk incentives projects,” which authorizes projects that would boost milk sales among Supplemental Nutrition Assistance Program (SNAP, or food stamps) users.
  • The Food and Drug Administration concluded a public comment period on whether it was acceptable to use terms like “milk,” “yogurt,” or “cheese” on the labels of plant-based dairy alternatives, such as soy milk or almond milk—whose markets are rapidly expanding.
  • And now, House Agriculture Chairman Collin Peterson (D-MN) has teamed up with Representative Glenn Thompson (R-PA) to introduce a bill that would roll back school nutrition regulations by allowing schools to serve full-fat flavored (read: sweetened) milk. This builds on a rule published by the USDA late last year that allowed low-fat flavored milk, rather than just fat-free flavored milk, in schools.

Is whole milk bad for our health? Maybe not, suggests emerging research. But this legislation, much like the others, isn’t about health. It’s about scoring simultaneous wins for Big Dairy and the sugar industry, who see the 30 million students across the country as a receptive audience for more of their products, and full-fat chocolate milk as a good way to deliver them.

These policy changes are responses to a multi-year crisis facing dairy farms of all stripes, which has had real and lasting consequences for farmers across the country.

But is pushing more milk really what’s best for struggling small farms—and is it really what’s best for our health?

More milk could keep Big Dairy in a cycle of subsidies—and won’t do small farms any favors

There is no mistaking the severity of the US dairy crisis that has been building for more than a decade. A steady flow of federal farm subsidies has driven overproduction and resulted in tremendous price drops, creating an environment in which only industrial dairy farms are likely to survive. Between 1970 and 2017, the United States lost nearly 94 percent of its dairy farms, with surviving farms trending toward more cows and higher milk production. In 2017, the state of Wisconsin alone lost 500 dairy farms. To make matters worse, dairy farmers were caught in the middle of last year’s trade wars, as Mexico and Canada responded to US tariffs with tariffs on a number of dairy products. While the government offered farmers a bailout program to cushion the blow, for most, it was too little and too late. As many farmers continue to face the reality of losing their livelihoods, the outcomes are nothing short of tragic.

Milwaukee Mayor Tom Barrett talks with a local dairy farmer. In 2017, the state of Wisconsin lost 500 dairy farms. Photo: barrett4wi/CC BY SA 2.0 (Flickr)

But this isn’t the first time our agricultural system has been confronted by a crisis of overproduction, and it certainly isn’t the first time we’ve tried to remedy it by strengthening subsidies and expanding markets, rather than by limiting production. And history has shown us that this doubling-down strategy can leave farmers unwittingly trapped in a perpetual cycle of high production and low prices that really only works for Big Dairy. As retired Wisconsin dairyman Jim Goodman wrote in a recent Washington Post op-ed, “Farmers don’t want subsidies. All we ever asked for were fair prices.”

But what has given this political strategy some degree of cover is the notion that increasing dairy sales is a win-win, with the underlying message that more dairy is good—even essential—for our health.

Is that really true?

Milk may not be essential to health

Given the ways that milk has been integrated into the fabric of our federal food programs, it would be natural to assume that the science is settled on its health benefits. Milk is a required component of every federally subsidized school breakfast and lunch, as well as after-school and summer meals; is included in food packages for the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC); and even has a designated place on USDA’s MyPlate. The Dietary Guidelines for Americans recommends that each of us consume about 3 cups (or cup equivalents) of dairy per day.

But in reality, the science isn’t quite so straightforward.

In fact, existing evidence has led groups like the American Medical Association to adopt the position that both meat and dairy products should be optional components of the diet, while both Harvard’s Healthy Eating Plate and Canada’s dietary guidelines recommend water—not milk—as their beverage of choice.

Milk does contain key vitamins and minerals such as calcium, potassium, and (when fortified) vitamin D—all nutrients we’re not getting enough of in a typical diet. And some studies have shown that dairy intake is associated with reduced risk of certain chronic diseases, including cardiovascular disease. But dairy isn’t the only place we can find these beneficial nutrients. Certain types of fish, beans, leafy greens, and tofu offer calcium; a variety of fresh fruits and vegetables provide ample potassium; and fatty fish and other fortified foods are good sources of vitamin D. (Of course, there’s also the sun.) According to the Dietary Guidelines for Americans, most of these are foods that we generally under-consume, and eating a diet with more of them would come with its own health benefits.

We also know that many people have an impaired ability to digest milk, a condition known as lactose intolerance. It’s estimated that about 36 percent, or just over a third, of all people in the US have lactose intolerance, with higher rates among African Americans, American Indians, Asian Americans, and Hispanics and Latinos. If dairy is truly a necessary component of the diet, a lot of us are in trouble.

What does it all mean?

Enjoying a tub of ice cream, circa 1995.

Do you like dairy? Nice. Me too.

Do we need to stop eating it? No. (Although there are cases to be made for eating less.)

Does that mean that the dairy industry, rather than public health, should set our policy agenda? Absolutely not.

The bottom line is this: the science may yet be unsettled on dairy, but we can say with certainty that it’s not good for any of us when our public policies are shaped by industry—least of all by Big Dairy.

Let’s look for a different kind of win-win—one that will benefit real family farmers more than multinational corporations and provide the public with reliable information about our dietary choices. It’s about time.

 

Self-Driving Cars Need to be Steered in a Climate-Smart Direction

UCS Blog - The Equation (text only) -

Electric AVs being tested by Cruise Automation frequently pass by my home in San Francisco. Photo: Don Anair

The roving autonomous vehicles on the streets of San Francisco are one of the frequent reminders on my daily commute that our transportation system is changing. But will self-driving cars be good or bad for climate change?

Imaginations can run wild with “heaven or hell” scenarios of automated cars.  Imagine zooming around uncongested roads and highways while passengers attend to their social media, relax with friends, or take in a movie in a clean, electric vehicle.  Or, in the darker vision, zombie cars with no passengers are clogging roads and spewing pollution, urban sprawl is given a new life, and marginalized communities continue to lack good transportation options. As this technology comes to market, it will be up to decision makers to set us on the right course with smart policies.

Some researchers have been putting pen to paper to better understand the potential climate risks of self-driving cars (or autonomous or automated vehicles (AVs) as they are otherwise called) as well as their potential climate benefits. This research is providing important insights into the potential for building a modern transportation system that is less polluting, less congested, more equitable and more efficient than what we have today. It also highlights the significant risks of inaction and the difficulty of achieving the best outcomes.

3 Revolutions and a Multi-Modal Future: Autonomous, Electric, and Sharing Rides

Let’s start with the positive vision. Self-driving car technologies are paired with electric vehicles, which we’ve shown have lower carbon emissions no matter where you live in the U.S. In addition, AVs usher in a new wave of transportation services—think Uber and Lyft 2.0—where rides are more convenient than individual vehicle ownership and cost-competitive. This leads to a reduction in personal car ownership, since not owning a car is now a more viable, cheaper option for households. Reduced car ownership alone doesn’t solve the problem, but when paired with increased access to mobility options like shared bikes, scooters, and efficient mass transit, individuals can choose from a variety of options for each trip rather than always defaulting to the car formerly parked in their driveway.

Sharing or pooling of rides is seamless and offers a lower-cost option, access to faster-moving carpool lanes, and lower tolls, all while reducing the number of cars on the road. This ideal future of clean, equitable, and accessible mobility combines autonomous, electric, and pooled car trips with urban design and infrastructure that support walking, scooters, bikes, and mass transit, and with pricing signals that steer choices toward the cleanest, most efficient modes of travel.

Figure 1 Adapted from “Three Revolutions in Urban Transportation“, 2017.

What happens to climate emissions in this future? Researchers at the University of California, Davis, and the Institute for Transportation & Development Policy examined a future scenario in which AVs are incorporated into a highly shared, multimodal, and electric urban transportation system. They found that, globally, urban transportation pollution could be reduced by 80 percent by 2050 and massive increases in congestion could be avoided, with vehicle miles traveled actually declining by 25 percent instead of increasing by 50 percent as in the business-as-usual case (see figure).

This scenario of a future transportation system meets the travel demands of a growing population while driving down climate emissions. And it requires coordinated policies to work, including compact development as well as policies that make the lowest-emission, most efficient modes of transport the most attractive. But what if that’s not what happens? What if we don’t make the decisions necessary to support the future described above, and instead take a hands-off approach to AV deployment?

The nightmare AV future: More vehicle miles, more congestion, more pollution, less equity

As wonderful as the vision of “three revolutions” is, it would be foolish to think that this vision of the future is likely—or even possible—without a lot of work. Here are a few ways that things could go wrong.

AVs could dramatically increase driving

If AVs primarily enable increased single-occupancy vehicle trips, we are in trouble. One widely cited study looked at a wide range of impacts AVs could have on energy consumption, travel, and carbon emissions, and there are many factors (see figure), ranging from the energy savings of robotic eco-driving to increases in energy use and travel from newly empowered individuals who previously could not drive their own vehicle. There are potential impacts on both sides of the ledger, but the biggest potential increase in energy use (and, by association, emissions) comes from a behavioral response to AVs. If driving can now be productive time, longer commutes, for example, may not be the burden they once were. This is one way in which AVs could reduce the time cost of driving (see “travel cost reduction” results in the figure) and increase overall vehicle travel—by as much as 60% according to the study. Recent modeling of possible AV deployment in the Washington, D.C. metro region showed similar results, estimating that vehicle miles traveled could increase 46-66% with the introduction of self-driving cars.

So will people really drive that much more? Some researchers ran an experiment to see what would happen to a household’s vehicle travel if it had access to a vehicle and a driver for a week, mimicking life with a self-driving car. Not surprisingly, most households used the vehicle more often (an 83% average increase in miles traveled) and even sent the car and driver out on errands (21% of the increase was zero-occupancy travel). While there were only 13 participants in the study, which limits how far the findings can be generalized, the experiment does illustrate the potential behavioral shifts when a vehicle that can drive itself is introduced into a household. Why not send the car to pick up your dry cleaning or take that trip to Aunt Esmerelda’s you’ve been putting off?

AVs could increase congestion and undermine transit, instead of complementing it

Pooling rides is essential to making AVs deliver on their potential to be clean, equitable, and efficient. Pooling rides for people with similar origins and destinations can deliver more passenger trips from fewer vehicle trips, which is key to making efficient use of vehicles (reducing pollution per trip) and roads (reducing congestion per trip). But while pooled AVs could help increase the average occupancy of cars, they could also undermine our most important current source of pooling: mass transit. A car with two or three people sharing a ride is an improvement over each person driving alone, but it still means far more vehicles, pollution, and congestion than 30 people on a bus or several hundred on a subway or train.
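The occupancy arithmetic behind this point can be made concrete with a rough back-of-the-envelope calculation. The per-vehicle-mile emission rates below are illustrative assumptions chosen for the sketch, not measured values, but the comparison holds for any realistic figures:

```python
# Back-of-the-envelope CO2-per-passenger comparison for a single trip.
# The grams-per-vehicle-mile figures are illustrative assumptions,
# not measured values.

def per_passenger_g(g_per_vehicle_mile, passengers, trip_miles=10):
    """Grams of CO2 attributed to each rider for one trip."""
    return g_per_vehicle_mile * trip_miles / passengers

# Assumed emission rates: ~350 g/mile for a gasoline car,
# ~2,500 g/mile for a diesel bus.
solo_car   = per_passenger_g(350, passengers=1)    # one person driving alone
pooled_car = per_passenger_g(350, passengers=3)    # a three-person pooled ride
full_bus   = per_passenger_g(2500, passengers=30)  # 30 riders on a bus

print(f"solo car:   {solo_car:.0f} g CO2 per passenger")
print(f"pooled car: {pooled_car:.0f} g CO2 per passenger")
print(f"bus:        {full_bus:.0f} g CO2 per passenger")
```

Even though the bus emits far more per vehicle mile, its high occupancy makes it the cleanest option per passenger, and a pooled car beats a solo trip by the size of its occupancy: the ordering, not the exact numbers, is the point.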

Based on the current evidence, especially in larger cities where mass transit is especially important, ride-hailing is pulling more people from modes like transit, walking, and biking than it is pooling passengers who would otherwise drive alone. This mode shift, along with additional trips that wouldn’t have been made in the absence of ride-hailing options, is leading to increases in congestion and vehicle miles traveled. (See research by Clewlow & Mishra, Schaller, and the University of Kentucky.) Moreover, reduced ridership on mass transit hurts the economics of these critical systems as they lose fare revenue. Adding AVs to ride-hailing fleets could drive down ride costs and exacerbate the changes in vehicle travel and transit impacts we are already seeing.

Roads snarled in congestion are not a good outcome for anyone, including companies that want to use these roads to sell people rides, pooled or otherwise. So new rules and incentives will be needed to efficiently manage transportation networks as private companies operate what are in effect private transit systems, with occupancy sometimes higher than today’s cars but most often lower than today’s mass transit. Policymakers will need to prioritize the movement of people over vehicles with policies that favor higher-occupancy trips and modes. These could take the form of preferential pricing, access to restricted lanes, and measures to ensure that the financial model of mass transit adapts along the way.

If we don’t succeed in ensuring rides are largely pooled, in both cars and in mass transit modes like rail and subway, not only will congestion get worse, but we will fail to reduce climate emissions to safe levels; electrifying our transportation system is simply not enough. In the UC Davis/ITDP study, a “2 Revolution” scenario with AVs and widespread electrification but WITHOUT significant pooling of trips cut global emissions in 2050 by only 45%—far less than what is needed to stabilize our climate.

AVs could exacerbate or perpetuate inequities in our current transportation system

A new report by The Greenlining Institute outlines strategies to ensure AVs benefit all communities.

Our current car-ownership-based transportation system does not serve all communities equitably. Lower-income households spend a larger share of their income on transportation than wealthier households. Those who cannot afford a car, are too old or young to drive, or have disabilities that prevent them from driving must rely on a transit system that often doesn’t meet their needs.

AVs could improve mobility for communities historically underserved by our current transportation system – if the technology enables greater access to affordable, accessible and reliable transportation.  If, however, AV technology is primarily relegated to private car ownership and leads to increased congestion or undermines public transit, as described above, the current inequities will be exacerbated.

A new report by the Greenlining Institute describes in more detail the health, economic, and mobility risks of AVs for marginalized groups like people of color, the poor, the elderly, and those with disabilities, and offers a list of recommendations to policymakers for ensuring the rollout of AVs leads to greater mobility options for all. UCS will also be releasing a report soon with results from an analysis of the Washington, D.C., metro area and how the rollout of AVs in that region could affect transportation equity. This research is important for informing the policies necessary to maximize the benefits of self-driving technology.

Now’s the time to get on the right path

Research is providing helpful insights into the potential role of AVs in a transportation system that cuts climate emissions and improves mobility. It also offers a cautionary tale: AVs could dramatically increase emissions and exacerbate congestion if decision makers are not proactive and thoughtful about putting in place the policies that will lead us to the best outcomes.

We are starting to see some positive action on this front. In California, legislation (SB 1014) signed into law last year requires state agencies to develop standards to ensure ride-hailing companies move toward more shared, zero-emission trips. Since AVs are likely to be rolled out in ride-hailing services, these rules will affect AV deployment. But that’s only a drop in the bucket. Developing public policy that ensures AVs deliver climate and transportation-system benefits requires shared goals, interagency coordination, and the development and implementation of effective policy at different levels of government. In California, UCS is sponsoring legislation with CALSTART (SB 59, authored by Senator Ben Allen) that would get the ball rolling at the state level and ensure proactive policies can be deployed as AV technology hits the street.

Smart policies are critical for ensuring self-driving car technology ushers in a new era of clean, affordable, and efficient transportation rather than the zombie car apocalypse.  AVs may be able to drive themselves, but it is up to us to steer them in the right direction.


What to Watch for in Michigan’s State of the State Speech

UCS Blog - The Equation (text only) -

Photo: Terry Johnston/Wikimedia Commons

Next Tuesday, Governor Gretchen Whitmer will give her first State of the State address as Michigan’s chief executive officer. It is a key opportunity for her to address climate change, infrastructure needs, and clean energy and water—all priorities Governor Whitmer emphasized during last year’s campaign.

Here’s what to look for.

Michigan Governor Gretchen Whitmer

Joining the U.S. Climate Alliance

The U.S. Climate Alliance is a group of states committed to upholding the objectives of the 2015 Paris Agreement on climate change. Look for Governor Whitmer to highlight this week’s executive directive adding Michigan to the Alliance, as well as the growing number of states in the coalition, which includes Minnesota and now Illinois, whose new governor, J.B. Pritzker, announced that his state would also join.

Creating a state office of climate change

Governor Whitmer is also likely to highlight another executive directive from this week creating a Michigan Office of Climate and Energy. This new office will work with the governor to mitigate the impacts of climate change, reduce greenhouse gas emissions, and embrace more sustainable energy solutions.

The sooner this new office can be up and running, the better, especially in light of the urgent and compelling need to act on climate change outlined in two key scientific reports released last year.

Infrastructure investments, clean water, and electric vehicles

Governor Whitmer’s campaign focused on the need to improve Michigan’s infrastructure, including electric and heating systems in addition to roads, bridges, and clean drinking water. She also promised to “mak[e] sure Michigan has the edge in electric vehicles [that] will not only reduce carbon emissions, but create and protect jobs here in our state.” Ideally, Governor Whitmer will outline specific goals and a policy agenda to advance these critical needs and opportunities during her State of the State address.

Michigan is primed for further clean energy growth

The Interstate Renewable Energy Council recently named Michigan to its 2019 Clean Energy States Honor Roll as its “Emerging Clean Energy Leader.” This is well deserved: the Michigan Public Service Commission has launched several stakeholder processes to address regulatory policies facilitating the integration of solar power in the state, including community solar and rooftop solar.

In addition, last year Michigan’s two major electric utilities both announced important carbon reduction and clean energy goals. Many of the state’s old and inefficient coal-fired power plants have been retired, and there are plans to close additional polluting facilities in the coming years. Both Consumers Energy and DTE Energy have integrated resource plan dockets filed or to be filed in 2019 that, as I wrote about in my blog post last month, will be key items to watch on how the utilities plan to follow through on their goals.

As a candidate, Governor Whitmer signed on to the Clean Energy for All Campaign, which asks candidates to commit to a vision where the United States runs on 100 percent clean energy by 2050.

Governor Whitmer’s leadership on renewable energy and energy efficiency can help build on the state’s clean energy momentum and ensure Michiganders are benefiting from cleaner air and water, more affordable energy bills, and expanded economic development. I look forward to hearing how her remarks on Tuesday night will further the clean energy transition.


Kids and Mercury Don’t Mix. You Have 60 Days to Tell the Trump Administration.

UCS Blog - The Equation (text only) -

Photo: EPA/Flickr

EPA Acting Administrator Andrew Wheeler is true to his word. With a masterful sense of timing and irony, the EPA has announced that on Thursday, February 7, it will publish its proposal to revise (read: roll back) the agency’s own supplemental finding that it is “appropriate and necessary” to regulate mercury and hazardous air pollutants (HAPS) from coal- and oil-fired power plants.

The new proposal’s bottom line: Regulating mercury and hazardous air pollution from power plants under the Clean Air Act is no longer “appropriate and necessary.” It’s just too costly, they insist. (For more details and background on this travesty, see my prior commentary here and here.)

The timing and the irony

Thursday’s publication of the proposal in the Federal Register opens a 60-day public comment period on what is just the administration’s latest effort to roll back public health protections. Indeed, the president proudly touted his efforts to cut regulations in his State of the Union speech earlier this week. Lost on the administration, but not lost on the public, is the fact that most regulations are actually safeguards that protect our health and safety, along with the cleanliness and safety of the air we breathe, the water we drink, the food we eat, the medications we take, the products we buy, and more.

The very night before the EPA’s announcement, the president was spouting compassion and concern for our health, and for children’s health in particular, which makes the timing of this move particularly ironic. Less than 24 hours after the president’s address, the EPA made good on its intention to no longer consider it “appropriate and necessary” to regulate mercury and hazardous air pollutants from coal- and oil-fired power plants. If the administration is seriously concerned about children’s health, why in heaven’s name take aim at limits on emissions of mercury, a potent neurotoxin?

The EPA’s job is to protect our health and the environment. That’s its mission, which the agency still acknowledges on its website. And the EPA is supposed to rely on the best available science when making regulatory and policy decisions. This proposal takes the opposite approach; EPA political leadership has decided to throw science and common sense out the window. The public interest seems relegated to the back seat, while powerful private and industrial interests take the wheel. It’s part of an ongoing pattern by the Trump administration during its first two years in office; you can see it all in our new report, “The State of Science in the Trump Era.”

Time to weigh in: Our voices matter

Acting Administrator Wheeler has been nominated to replace Scott Pruitt as the agency’s full-time administrator. His nomination has already passed out of committee by a narrow margin and will go to the full Senate for a vote sometime soon.

With this latest proposal, Wheeler, a former lobbyist for coal companies, has made it clear that he’s happy to sideline science and evidence when it doesn’t comport with a political preference. Let your senators know that this is not OK—and that you want, expect, and deserve an EPA leader who works for us and who respects and is guided by science when making agency decisions.

And with the public comment period opening on Thursday for this particular proposal, you now have 60 days to let the EPA know we are not fooled by its wonky little revision. It’s a crafty, nasty, and dangerous proposal that could roll back current safeguards and undermine future public health and environmental protections.

In the weeks to come, scientists and everyone concerned about public health will have a chance to comment on this important proposal. We’ll keep you updated on how you can most effectively weigh in and will continue to provide information and resources to help inform comments.

This will be a critical moment to raise your voice. The clock is now ticking.


Don’t Scapegoat China for Killing the INF Treaty. Ask it to Join.

UCS Blog - All Things Nuclear (text only) -

September 23, 2016: Chinese UN Representative Liu Jieyi votes in favor of a UN Security Council resolution on the 20th anniversary of the signing of the Comprehensive Test Ban Treaty (CTBT) urging all parties to push for the treaty’s entry into force.

The Trump administration recently announced it intends to walk away from an important agreement that reduces the risk of nuclear war—the INF Treaty. US officials said concerns about China were an important factor in deciding to scrap a nuclear arms control pact intended to last in perpetuity. But there is no evidence the Trump administration consulted Chinese leaders about its plans to withdraw or the concerns that supposedly made it necessary.

The Soviet Union and the United States negotiated the bilateral agreement in the mid-1980s during an especially tense period when both sides were upgrading their immense nuclear arsenals. Widespread public protests in Europe and the United States helped push both governments to agree to eliminate at least one class of weapons: ground-based missiles with ranges between 500 and 5,500 kilometers.

Contemporary US critics of that agreement, including US National Security Advisor John Bolton, argue the United States must quit the treaty because China is not subject to the same restriction. That’s a dubious justification for tearing up the treaty, although persuading China to join has obvious value. Unfortunately, getting Chinese leaders to the negotiating table is a tough sell when, from their perspective, the entire US defense and foreign policy establishment is chomping at the bit to fight a new Cold War in Asia. But it’s only impossible if, like Mr. Bolton, you never really bother to try.

There is good reason to believe China is not opposed to arms control negotiations or unwilling to make significant concessions to arrive at an equitable agreement.

The Peril and the Hope

Even before atomic bombs were dropped on Hiroshima and Nagasaki, many of the scientists and a few of the politicians who understood the long-term implications sought to impose international controls. They recognized these weapons were different. As horrible as the last war had been, a war fought with nuclear weapons would be far worse. No nation or coalition of nations could win such a war. The entire planet might become uninhabitable. Human civilization and most of the living things on earth could perish, forever.

China came late to the nuclear table but the impact of the weapons on the scientists who developed them was similar. Hu Side, a former director of China’s nuclear weapons lab, wrote, “I’ve seen the mushroom clouds rise, felt the earth and mountain massifs shake and experienced the shock of the tremendous energy released by a nuclear explosion. It is precisely because of these experiences that I particularly understand why national decision-makers determined our country’s nuclear weapons were a defensive measure for strategic deterrence.” It may lack the poetry of Robert Oppenheimer‘s “I am become Death, the destroyer of worlds,” but the sentiment is the same. Nuclear weapons are too powerful to be used to fight a war.

Until the late 1990s the nuclear arms race and efforts to stop it grew in tandem. Scientists rallied the public to restrain the self-destructive behavior of military and political leaders addicted to antiquated approaches to war and peace. But over the last thirty years the will to control the nuclear arms race has weakened while the addiction to antiquity has grown much stronger. This is especially true in US-China relations, where the most influential idea guiding US officials is “the Thucydides trap,” while Chinese leaders propagandize “the Great Chinese Renaissance.”

The Beginning of the End

Ironically, international nuclear arms control began to die when China finally embraced it. The Comprehensive Nuclear Test Ban Treaty (CTBT) was the first international nuclear arms control accord China helped to negotiate. For decades, the Chinese Communist Party viewed nuclear arms control as a vehicle for preserving the advantages of the Soviet Union and the United States. China lagged far behind in the nuclear arms race. By the time the negotiations reached their final stage, China had conducted 47 nuclear tests and possessed several hundred nuclear warheads. The United States had conducted 1,067 tests and possessed approximately 15,000 nuclear warheads. Nevertheless, China signed the treaty.

After the Clinton administration failed to convince the US Senate to ratify the CTBT, progress in international nuclear arms control ground to a halt. Negotiations on a treaty to ban the production of the materials used to make nuclear warheads were cut short. The United Nations Conference on Disarmament (UNCD) became paralyzed, unable to reach a consensus on how to start negotiations on any arms control agreement.

The Bush administration made things exponentially worse when it unilaterally withdrew the United States from the Anti-Ballistic Missile (ABM) Treaty. The 1972 agreement was based on the common sense notion that both the United States and the Soviet Union would be safer if they limited missile defenses so that neither side would feel compelled to build new nuclear-armed missiles to overwhelm those defenses.

President Obama gave a nice speech in Prague, and his administration managed to preserve the Strategic Arms Reduction Treaty (START) negotiated in 1991. But to get this New START agreement ratified, Obama promised the Senate he would support spending more than a trillion dollars to upgrade the entire US nuclear arsenal. And he steadfastly refused to even discuss a suggestion Chinese arms controllers felt was important: beginning talks in the UNCD on a new international agreement to prevent an arms race in outer space.

China does not look at nuclear arms control in isolation. Some forms of conventional weapons technology, like those involved in missile defenses, anti-satellite weapons and long-range conventional precision strike weapons impact Chinese decisions about the size and composition of its nuclear arsenal.

Back from the Brink

The most important thing about all forms of international arms control negotiations is that they bring adversaries together to talk. Dialogue builds trust. Trust that the other side isn’t trying to trick you into agreeing to something to gain an advantage. Trust that the other side respects you and is seeking an equitable agreement that reduces anxiety and the risk of war.

China has a small number of nuclear-armed ground-based intermediate range missiles that would fall under the original INF Treaty limits. But it also has a much larger number of conventionally armed missiles in this class that seem to be the major concern of US advocates of withdrawing from the treaty. Figuring out how to negotiate an expanded INF Treaty that would require China to dismantle them would introduce a number of new and difficult issues to resolve, but it could also lead to some very productive conversations on how to build trust and preserve the peace in East Asia.

Sadly, I suspect US advocates of killing the INF Treaty have no intention to talk to China about joining it, but if the United States wanted to open negotiations China is likely to put forward a few conditions.

First and foremost, the discussion on intermediate-range missiles would have to take place in the United Nations Conference on Disarmament. China must not be the only target of concern. Most if not all of the other nations that possess this class of weapon would have to be included. Chinese leaders prefer international rather than bilateral or multilateral forums for arms control negotiations. It’s not an unreasonable preference, and it predisposes Chinese negotiators to accept the general principle that restrictions should apply to everyone.

Unlocking the UNCD will be difficult because decisions are made by consensus—a norm for negotiations many cultures prefer. Consensus may require discussion of other arms control issues. Recent history suggests preserving peace in outer space may be one of them. Agreeing to begin discussions does not commit the United States to a particular outcome. It just creates an opportunity to talk. So broadening the agenda to satisfy all of the attending parties is not unreasonable either.

Finally, international arms control negotiations are not an apples-for-apples, oranges-for-oranges kind of thing. They’re an apples-for-my-pick-from-the entire-produce-aisle sort of thing. Different countries choose to rely on different weapons for all kinds of reasons, like geography. Because of its huge land mass and its concerns about the assemblage of conventional US forces on its periphery, China sees conventionally armed ground-based intermediate range missiles as an especially effective countermeasure. It’s invested decades of effort and substantial financial and technical resources in developing and deploying those missiles. Asking China to give them up is going to cost the United States something in return.

If the United States were serious about wanting China to join the INF Treaty, it would be talking with Chinese arms controllers about changes the United States might be willing to make in exchange for surrendering what Chinese military planners see as one of their most valuable military capabilities. There is no indication such a discussion has ever taken place. Until it does, China cannot be blamed for the US decision to kill the INF Treaty.

 

 
