Combined UCS Blogs

Mesothelioma Awareness Day: Our Past Must Dictate the Future

UCS Blog - The Equation (text only) -

It shouldn’t come as a surprise that asbestos isn’t good for you. The mineral is a known carcinogen and has been tied to thousands of deaths from mesothelioma, asbestosis, and other asbestos-related diseases. On average, close to 3,000 people each year in the United States are diagnosed with mesothelioma. And for those unfortunate enough to be diagnosed with this rare disease, the outlook is grim: the average prognosis is somewhere between 12 and 21 months.

Asbestos-related diseases are rarely quick to present themselves, often taking decades before symptoms finally show. When you breathe in or accidentally ingest the invisible fibers, they enter the lungs and may lodge themselves deep into the lung lining, known as the mesothelium. The area becomes irritated and over the years tumors begin to form. Mesothelioma is often difficult to diagnose, which means the resulting cancer is caught later and treatment options are more limited.

Breaking down barriers

Armed with that kind of information, one would assume it’d be a slam dunk to phase out asbestos use in the United States. Unfortunately, that isn’t the case. Last year, roughly 340 tons of raw asbestos were imported into the US, primarily for use in the chlor-alkali industry. Some types of asbestos-containing materials can also be imported. The Environmental Protection Agency tried to ban asbestos use nearly three decades ago, but many of the rules the agency established were overturned by a court decision two years later. Today there’s hope things could change in the coming years, including renewed interest from the EPA.

In 2016, Congress approved the Frank R. Lautenberg Chemical Safety for the 21st Century Act, amending the 40-year-old Toxic Substances Control Act (TSCA) and giving the EPA more power to regulate dangerous chemicals as they are introduced, in an effort to more effectively remove those posing an unreasonable risk to public health. Chemicals deemed to pose an unreasonable risk during the evaluation process will be eliminated based on safety standards, as opposed to the risk-benefit balancing standard used under the previous TSCA requirements. What this means is that under the old TSCA, a finding of unreasonable risk required a cost-benefit analysis, and any restrictions had to be the least burdensome way of addressing the risk. Under the Lautenberg Act, the “least burdensome” requirement is removed, though the EPA still needs to take the costs of regulatory actions and feasible alternatives into consideration.

The amendment also requires the agency to perform ongoing evaluations of chemicals to determine their risk to public health. In December, asbestos was included on a list of ten priority chemicals slated for evaluation and a scoping document for the mineral was issued in June. Problem formulation documents for each of the first ten chemicals are expected in December.

Drowning in red tape

Despite what the Lautenberg Act has done to unshackle the EPA and allow it to regulate chemicals properly, the White House and Congress have taken actions that seem counterintuitive. For example, in January, President Donald Trump signed an executive order known as the “2-for-1 Order,” forcing agencies to remove two existing rules for every new one they create. The risk here is that agencies like the EPA will have to pick which rules to enforce, creating a new series of public health concerns. When it comes to new hazards, the agency may be slower to react because of the new budget variable thrown into the mix. While the order could help the agency identify rules that overlap with others, it creates the risk of money taking precedence over public health.

In addition, the Senate’s recently introduced Regulatory Accountability Act, known in some circles as the “License to Kill” Bill, poses a similar set of issues. If passed, the RAA could resurrect much of the red tape that was removed by the Lautenberg Act. Once again, it would become difficult to regulate or ban chemicals, despite the dangers they may pose. For example, the EPA would have to prove that a full asbestos ban is the best option available to the agency compared with any other, more cost-effective option. The bill would also allow anyone to challenge these decisions, which could delay a potential ruling for years or even halt the process entirely.

The EPA is also constrained by the people who have been appointed to several high-level positions within the agency itself. Administrator Scott Pruitt, as Oklahoma attorney general, sued the EPA 14 times, challenging rules he believed overstepped the agency’s boundaries. Deputy Assistant Administrator Nancy Beck, previously with the American Chemistry Council, lobbied for years against the very rules she is now sworn to uphold. In 2009, Beck was criticized in a House report for attempting to undermine and create uncertainty around the EPA’s chemical evaluations while serving with the Office of Management and Budget in the Bush administration. The latest person nominated for an EPA position is Mike Dourson, who has, at times, proposed much less protective standards for chemicals than those in use by the federal government.

Where we stand now 

This Mesothelioma Awareness Day, we find ourselves one step closer to seeing asbestos banned in the US. Today, while we honor those who’ve lost their struggle against this disease, we also show support for those still fighting mesothelioma and refusing to give in.

The EPA has, once again, taken the first steps toward a potential ban, but until that day comes, raising awareness remains a never-ending battle. Mesothelioma is a misunderstood disease, and asbestos isn’t something most people think about at work or at home, which is why educating others is so important. Mesothelioma is largely avoidable, but the need to remain vigilant to prevent exposure is paramount.

Asbestos exposure isn’t something that will come to a screeching halt overnight. Hundreds of thousands of homes, buildings, and schools still harbor the mineral, and that is likely to be the case for years to come. But stopping the flow of raw asbestos and asbestos-containing products into the US is a great first step toward combating the issue at large.

About the author: Charles MacGregor is a health advocate specializing in education and awareness initiatives regarding mesothelioma and asbestos exposure. To follow along with the Mesothelioma Cancer Alliance and participate in a MAD Twitter chat on September 26, find them at @CancerAlliance

Rebuilding Puerto Rico’s Devastated Electricity System

UCS Blog - The Equation (text only) -

Photo: endi.com

Over the last few days, I’ve been glued to social media, the phone, and ham radio-like apps trying to find out more about the fate of family members in the catastrophic situation in my native Puerto Rico following Hurricane María. (Fortunately, I was able to confirm on Friday that everyone in my immediate family is accounted for and safe).

My family is among the few lucky ones. My childhood home is a cement suburban dwelling built on well-drained hilly soils, some eight kilometers from the coast, and well outside flood zones. But many of my 3.4 million co-nationals in Puerto Rico have not been so lucky, and are experiencing, as I write this, catastrophic flooding. Further, tens of thousands have been without electricity since Hurricane Irma downed many of the distribution lines. In addition, more than 170,000 people are affected in the nearby US Virgin Islands and Dominica, Caribbean islands that have also experienced catastrophic damage.

In the largest suburban community in Puerto Rico alone—Levittown in the north—hundreds had to be evacuated on short notice in the early hours of Thursday when the gates of the Lago La Plata reservoir were opened and the alarm sirens failed to warn the population. The next day, a truly dramatic emergency evacuation operation followed as the Guajataca Dam in the northwest failed and 70,000 people were urged to leave the area. At least ten people have been confirmed dead so far.

The government of the Commonwealth has mounted a commendable response, but it has been hampered in large part by the lack of power and communications, which remain down except for those people, agencies, and telephone companies that have generators and the fuel to keep them running. This has been one of the main impediments both to Puerto Ricans abroad trying to reach loved ones and to the Rosselló administration’s efforts to establish communications and coordination with the many towns that remain unaccounted for.

Chronic underinvestment and neglect of energy infrastructure increases human vulnerability to extreme weather

Why has Puerto Rico’s energy infrastructure proven so vulnerable in recent weeks? The ferocity of Irma and María could stretch the capacity of even well-funded and well-maintained energy production and distribution systems. In Florida—where the power grid had received billions in upgrades over the last decade—Irma left two-thirds of the population without power (though the state was able to bounce back within a few weeks).

But years of severe infrastructure underinvestment by the Puerto Rico Electric Power Authority (PREPA) have led to a fragile system that completely collapsed under these two hurricanes. Irma’s indirect hit damaged distribution lines but not production; María’s eye made landfall on the southeast and exited through the central north, placing it right in the path of four of the high-capacity plants that burn heavy fuel and diesel oil. These plants are also located close to, or within, flood zones.

The reconstruction of the power infrastructure in Puerto Rico is a monumental task, and it is critical to guaranteeing the well-being of Puerto Ricans. More than 3.4 million US citizens are now in a life-threatening situation, and getting electricity up and running in the near term is critically important to supporting rescue and recovery efforts.

Wherever possible, these immediate efforts should align with a broader rebuilding mission that points Puerto Rico toward a more economically robust and climate-resilient future, not repairs that repeat the mistakes of the past. There is also a need to build resilience against the climate and extreme weather impacts Puerto Rico is so brutally facing right now.

There is also a great need to alleviate the high cost of energy in Puerto Rico: electricity prices for all sectors (residential, commercial, and industrial) are much higher in Puerto Rico than on the US mainland. Reliance on imported fossil fuels for generation is one driver of the high cost: in 2016, nearly half of energy production came from petroleum, nearly one-third from natural gas, and 17 percent from coal. Only 2 percent came from renewables.

While there is quite a bit of clean energy momentum in the United States, that impetus is not being transferred to Puerto Rico. There are many reasons for that, including lack of support from PREPA. But Puerto Rico has strong solar and wind energy resource potential, and renewable energy has been proposed as a way to help PREPA pare down its $9 billion debt, reduce reliance on fossil fuels and exposure to fossil fuel price volatility, lower costs to consumers, and contribute to an economic recovery for the Commonwealth.

This unprecedented catastrophe affecting millions of US citizens requires the intervention of the federal government

To ensure a safe and just economic recovery for Puerto Rico, Congress and the administration need to commit resources to help the territory recover. President Trump has declared Puerto Rico a disaster zone, and FEMA Administrator Brock Long will visit the island on Monday. The priority right now is to save lives and restore basic services. To aid these efforts, Congress and the Trump administration should:

  • Direct the Department of Defense to provide helicopters and other emergency and rescue resources to Puerto Rico.
  • Provide an emergency spending package to the US territory.
  • Increase the FEMA funding level for debris removal and emergency protective measures in Puerto Rico.
  • Temporarily suspend the Jones Act. The Jones Act, which requires that cargo shipped between US ports, including those in its territories, be carried on US-flagged vessels, significantly increases the cost of importing goods into the island.

Once the state of emergency ends, Governor Rosselló needs to be very vocal that Puerto Rico’s energy infrastructure reconstruction should help put the Puerto Rican people and economy on a path to prosperity and resilience to climate impacts. The 2017 hurricane season is not over yet, and the situation in Puerto Rico right now is catastrophic. Decisions about energy infrastructure will be made in the coming days, weeks, and months. Those decisions need to take into account the short- as well as the long-term needs of the Puerto Rican population and help make Puerto Rico more resilient to the massive dislocations from climate and weather extremes that we are facing.


Science Triumphs Over Disinformation in Initial Flame Retardant Victory

UCS Blog - The Equation (text only) -

In a stunning victory for consumer safety and a powerful display of the ability of independent science to spur policy change, the Consumer Product Safety Commission (CPSC) voted this week to ban a class of additive, non-polymeric organohalogen flame retardants (OFRs) that are present in many consumer products. Last week, I was one of many individuals who testified before the CPSC, urging the body to grant a petition to ban the class of organohalogen flame retardants from four categories of consumer products: mattresses, children’s products, furniture, and electronics casings.

Of the 31 individuals who testified last week, only two advised the CPSC not to ban OFRs: representatives from the American Chemistry Council (ACC) and the Information Technology Industry Council. As Commissioner Marietta Robinson pointed out during the hearing, the only comments opposing the ban “represent those with a financial interest in continuing to have these potentially toxic, and some of them definitively toxic, chemicals in our environment.” She also noted that the presentations by those opposed to the petition were not transparent and used materials relating to chemicals that were irrelevant to the petition, a stark contrast to the numerous scientists and scholars whose heavily footnoted statements provided evidence supporting the arguments of the well-bounded petition.

Scientific information trumps corporate disinformation

Commissioner Robert Adler, who submitted the motion to grant the petition, compared the chemical industry’s talking points at the hearing on reasons not to ban OFRs to the tobacco industry’s denial of the health impacts of smoking. His statement read, “if we took the tobacco industry’s word on cigarette safety, we would still be waiting. Similarly, we have waited for years for our friends the chemical industry to provide us with credible evidence that there are safe OFRs. I have little doubt that we will still be waiting for many years, to no avail.” Sadly, he’s probably right.

We have seen this trend time and time again. Whether it was the tobacco industry, the asbestos industry, the sugar industry, the PCB industry, the agrochemical industry, the pharmaceutical industry, or the oil and gas industry, corporate bad actors have known about the risks of their products and have chosen not to act to protect the public for years, sometimes decades. Not only do they deny that there is harm, but they actively push for policies that allow them to conceal the truth for even longer. As Oxford University’s Henry Shue noted of fossil fuel companies like Exxon in a recent Climatic Change article, “companies knowingly violated the most basic principle of ‘do no harm.’” It is unethical and unacceptable that the public is not afforded the information we deserve on the harms of products we are exposed to every day in the air we breathe, the water we drink, the food we eat, and everything in between.

A 2008 EPA literature review on polybrominated diphenyl ethers, one type of OFR, found that 80 percent of total exposure to the chemical by the general population is through ingestion and absorption of house dust containing these chemicals. (Photo: Flickr/Tracy Ducasse)

Case in point: ACC’s statement after the CPSC’s vote stuck to its talking points, pivoting from whether OFRs are safe to whether they reduce fire risk. During the hearing, the ACC representative argued that the petition was overly broad and that there was insufficient data on each OFR to ban them as a class. However, when asked by Commissioners for evidence that certain OFRs did not cause harm, he was unable to point to a specific chemical or cite relevant research. At a certain point, there is no place to pivot when the facts are stacked against you.

Dust is something I never gave much thought to growing up. If anything, “dusting” was always my favorite chore when faced with the options of vacuuming or washing the dishes. I never really gave much thought to what that elusive substance was composed of. I certainly wouldn’t have guessed that within those seemingly innocuous dust bunnies hiding behind bookshelves were a mix of chemicals that could impact my health. Dusting has taken on new meaning for me since conducting research on flame retardants.

For decades now, consumers have been left powerless and at the whim of manufacturers who have decided for us what chemicals go into our homes and end up in our dust.

The result? Most Americans have at least one type of flame retardant present in their blood, young children have higher levels than their mothers, and children of color and those from low-income communities bear disproportionately high levels of these chemicals, in addition to a host of other chemical burdens.

Shue writes,

To leave our descendants a livable world is not an act of kindness, generosity, or benevolence…it is merely the honoring of a basic general, negative responsibility not to allow our own pursuits to undercut the pre-conditions for decent societies in the future.

This ban is long overdue. Moving away from these chemicals toward safer alternatives is a win for all, this generation and the next.

Product safety is not a political issue

During the vote, Commissioner Adler said that he holds strong to the belief that “product safety is not a partisan issue and should never be politicized” after a statement from one of the two Republican Commissioners that granting this petition through a vote along party lines would turn the issue into a political football. Commissioner Robinson defended Adler, stating that she was “absolutely flummoxed” and had “absolutely no clue what the science of this petition and these flame retardants has to do with whether you’re a Democrat or Republican nor what it has to do with my term being potentially up.” The granting of a petition rooted in rigorous science is not a political action. However, obstructing this science-based rulemaking process would be.

While the CPSC has voted to begin the process of rulemaking to ban OFRs under the Federal Hazardous Substances Act and to convene a Chronic Hazard Advisory Panel, the Commission will be shifting its composition as Marietta Robinson’s term ends in September. It is possible that this scientific issue could become politicized once President Trump nominates a Republican to join the CPSC and retake the majority. In fact, Chairwoman Buerkle even suggested that the ban be overruled once the Republicans regain the majority. President Trump intends to nominate corporate lawyer Dana Baiocco, who has defended companies facing charges over the safety and misleading advertising of consumer and industrial products and medical devices.

We urge the Commission to continue the progress begun during yesterday’s vote to educate the public about the risks of OFRs and to create the policy that will ban these chemicals in consumer products for good. Let’s let science, not politics, have the final word. Our children will thank us someday.


Puffins, Politics, and Joyful Doggedness in Maine

UCS Blog - The Equation (text only) -

Puffins were nearly extinct in Maine in the early 1900s, hunted for their eggs and meat. Their re-introduction to Eastern Egg Rock in Maine in the 1970s became the world's first successful restoration of a seabird to an island where humans killed it off. Photo: Derrick Jackson

Eastern Egg Rock, Maine — Under bejeweled blackness, the lacy string of the Milky Way was gloriously sliced by the International Space Station, the brightest object in the sky. Matthew Dickey, a 21-year-old wildlife and fisheries senior at Texas A&M, grabbed a powerful bird scope and was able to find the space station before it went over the horizon. He shouted: “I think I can make out the shape of the cylinder!”

The space station gone, Dickey and four other young bird researchers settled back down around a campfire fueled with wood from old bird blinds that had been blown out of their misery by a recent storm.

They were alone six miles out to sea on a treeless six-acre jumble of boulders and bramble.

44 years of Project Puffin

On this seemingly inconspicuous speck in Maine waters, a man once as young as they were, Steve Kress, began restoring puffins. He was part of the world’s first successful effort to restore a seabird to an island where it had been killed off by human activity. The experiment began in the spring of 1973 by bringing 10-day-old chicks down from Newfoundland, feeding them to fledging size in the fall, and hoping that after two or three years out at sea, they would remember Maine and not Canada, where decades of management have maintained a population of about 500,000 pairs.

Tonight it was a celebratory fire, flickering off faces with crescent smiles. Besides Dickey, there was team supervisor Laura Brazier, a 26-year-old science and biology graduate of Loyola University in Maryland with a master’s degree in wildlife conservation from the University of Dublin in Ireland. There was Alyssa Eby, 24, an environmental biology graduate of the University of Manitoba; Jessie Tutterow, 31, a biology graduate of Guilford College; and Alicia Aztorga-Ornelas, 29, a biology graduate of the Universidad Autonoma de Baja California in Mexico.

In the two days prior, their routine count of burrows with breeding pairs of puffins surpassed the all-time record. The previous mark was 150, set last year. During my four-night stay with them in late July, the count rose from 147 to 157. The summer would end with 173 pairs.

“We did it. We are awesome. You guys are awesome,” Brazier said. “Puffins are cool enough. To know we set a new record and we’re part of puffin history is incredible.”

As the fire roared on, celebration became contemplation. As full of themselves as they had a right to be, they know their record is fragile. Where once there were no more than four puffins left in Maine in 1902, decimated by coastal dwellers for eggs and meat, Kress and 600 interns in the 44 years of Project Puffin have nursed the numbers back to 1,300 pairs on three islands. The techniques used in the project—including the translocation of chicks and the use of decoys, mirrors, and broadcast bird sounds to make birds think they had company—have helped save about 50 other species of birds from Maine to Japan and China. (I have the distinct pleasure of being Kress’s co-author on the story of his quest, “Project Puffin: The Improbable Quest to Bring a Beloved Seabird Back to Egg Rock,” published in 2015 by Yale University Press.)

Interns (left to right) Alyssa Eby, Matthew Dickey, Alicia Aztorga-Ornelas, and Eastern Egg Rock supervisor Laura Brazier hold an adult puffin they banded. Also on the team but not pictured is Jessie Tutterow.

In the crosshairs of American politics

But in the last decade, the Atlantic puffin, which breeds in an arc up from Maine and Canada over to Iceland, Scandinavia, and the United Kingdom, has become a signal species of fisheries management and climate change.

On the positive side, Maine puffins are bringing their chicks native fish, such as haddock and Acadian redfish, that have rebounded under strict US federal rules. Negatively, the last decade has also brought the warmest waters ever recorded in the Gulf of Maine. A study published in April by researchers from the National Oceanic and Atmospheric Administration (NOAA) predicts that several current key species of fish “may not remain in these waters under continued warming.” Last month, researchers from the University of Maine, the Gulf of Maine Research Institute, NOAA, and others published a study in the journal Elementa finding that longer summers in the Gulf of Maine may have major implications for everything from marine life below the surface to the fueling of hurricanes in the sky.

For puffins, there already is significant evidence that in the warmest years, the puffin’s preferred cold-water prey, like herring and hake, are forced farther out to sea, while some of the fish that come up from the mid-Atlantic, such as butterfish, are too big and oval for small puffin chicks to eat. The new volatility in prey is such that while puffins thrived last year on tiny Eastern Egg Rock, their counterparts could not find fish off the biggest puffin island in the Gulf of Maine, Canadian-administered Machias Seal Island. Last year saw a record near-total breeding failure among its 5,500 pairs of puffins.

The Atlantic puffin, from Maine to the United Kingdom, has rapidly become a signal bird for climate change via the fish the parents attempt to bring to chicks. The Gulf of Maine is one of the fastest-warming bodies of water in the world, and as a result, more puffins are bringing in more southerly species such as the butterfish pictured here. Butterfish are too large and oval for chicks to eat, leading to starvation. Photo: Derrick Jackson

In the European part of the Atlantic puffin’s range, warmer water displacing prey, overfishing, and pollution have hammered breeding success. According to an article this year in the journal Conservation Letters, co-authored by Andy Rosenberg, the director of the Center for Science and Democracy at the Union of Concerned Scientists and a former regional fisheries director for the National Oceanic and Atmospheric Administration, the north Atlantic realm of the puffin is one of the most over-exploited fisheries in the world, as evidenced by the crash of several fisheries, most notably cod.

On the Norwegian island of Rost, for instance, the 1.5 million breeding pairs of puffins of four decades ago were down to 289,000 in 2015. A key reason appears to be voracious mackerel moving northward, gobbling up the puffins’ herring. Even though there are an estimated 9.5 million to 11.6 million puffins on the other side of the Atlantic for now, BirdLife International two years ago raised the extinction threat for puffins from “least concern” to “vulnerable.”

Much of that was on the minds of the Egg Rock interns, because the very puffins they were counting are in the crosshairs of American politics.

Incessant attacks on environmental accomplishments

Puffins are on land only four months a year to breed, so Kress and his team a few years ago put geolocators on some birds to see where they migrate during the eight months at sea. Two years ago, the team announced that in the fall and early winter, many Maine puffins go north to the mouth of the St. Lawrence River. In late winter and early spring, they come south to forage in fish-rich deep water far south of Cape Cod. That area of ocean is so untouched by human plunder that the corals in the deep are as colorful as any on a Caribbean reef.

The Obama administration was impressed enough to designate the area as the Northeast Canyons and Seamounts National Marine Monument, protected from commercial exploitation. While vast areas of the Pacific Ocean under US jurisdiction earned monument status under Presidents Obama and George W. Bush, the canyons are the first US waters in the Atlantic to be so protected.

Yet President Trump, as part of his incessant attack on his predecessor’s environmental accomplishments, ordered Interior Secretary Ryan Zinke to review Obama’s monument designations for possible reversal. Even though the Coral Canyons account for a tiny fraction of New England’s heavily depleted waters, the fishing lobby bitterly opposed monument status. This week, the Washington Post reported that Zinke has recommended that the Canyons and Seamounts be opened to commercial fishing.

The researchers on Egg Rock mused around the fire over the concerted attempt, led by the Republican Party and often aided by Democrats in top fossil-fuel production states, to roll back environmental protections for everything from coral to coal ash and broadly discredit science in everything from seabird protections to renewable energy. Some of the divisions of NOAA that are directly involved in studying waters like the Gulf of Maine are targeted for massive budget cuts by the Trump administration.

Maine’s puffins are direct beneficiaries of strict federal fishing management since the 1970s. In recent years, puffins have supplemented their traditional diet of herring and hake with species that have rebounded in the Gulf of Maine, such as the haddock pictured here. Photo: Derrick Jackson

Fighting against a stacked deck

“It’s funny how in the business world and the stock market, no one questions the numbers and facts,” said Brazier, who marched in April’s March for Science in Washington, DC. “They’re taken as facts and then people use them to decide what to do. But now it’s ok to question science.”

“I think it’s because if you can deny science, you can deny what needs to be done,” Eby said. “It’s too hard for a lot of people in rich countries to get their heads around the fact that if we’re going to deal with climate change, we’re going to have to change the way we live and the way we use energy. That’s so hard, a lot of people would rather find ways to skip the science and live in their world without thinking about the consequences.”

Tutterow, who hails from North Carolina, where the General Assembly in 2012 famously banned state use of a 100-year projection of a 39-inch sea-level rise, added, “If I was offered a state or federal job, I’d take it. I’d like to believe there’s a lot of career professionals who work hard to get the job done. But it used to be the main thing you worried about was red tape. Now you have to worry about censorship.”

Dickey said simply, “Sometimes it feels like the deck is stacked against us. But we just have to keep working as hard as we can until someone realizes we’re just trying to deliver facts to help the world.”

Puffins in Maine breed in burrows that wind crazily underneath boulders that rim their islands. That tests the ability of interns to reach for chicks to band for future study. Photo: Derrick Jackson

Joyful doggedness

The stacked deck is unfair, given the joyful doggedness displayed by this crew. On two days, I followed them around the perimeter of Egg Rock as they wrenched their bodies to “grub” under the boulders, contorting until they could reach an arm into the darkness for puffin chicks to band for research.

The simple act of banding has led to understanding the puffin’s extremely high level of fidelity: birds come back to the same island and burrow year after year despite migrating hundreds of miles away. One Project Puffin bird was in the running for the oldest known puffin in the world, making it to 35 before disappearing in 2013. A Norwegian puffin made it to 41 before being found dead.

On other Atlantic puffin islands, the birds can nest in shallower cavities among rocks and in mounds on grassy cliffs, within wrist and elbow reach. Researchers on those islands are able to band scores of puffin chicks and adults.

But the massive size of the jagged boulders on Eastern Egg Rock makes it so difficult to grub that the summer record was only 14. On my visit, the crew went from 9 to 17 chicks, with Brazier constantly saying, “Oh no, we’re not giving up. We got this. The next crew’s going to have to work hard to beat us.”

No face was brighter than Aztorga-Ornelas’ when she took an adult puffin they banded and lowered it between her legs like a basketball player making an underhanded free throw. She lifted up the bird and let it go to fly back to the ocean to get more fish for its chicks. “I’ll never forget that for the rest of my life,” she said.

On another day, with the same enthusiasm displayed for puffins, they grubbed for another member of the auk family, the black guillemot. At one point, they caught four chicks in separate burrows within seconds of each other. They gleefully posed with birds for photographs.

“I wish people could feel why I’m in this,” Tutterow said. She talked about a prior wolf study project in Minnesota. “We tracked in the snow what we thought was one wolf,” she said. “Then, at a junction, the tracks of what we thought was a single wolf split into five different sets. Your jaw drops at the ability of these animals to perfectly follow each other to disguise the pack.”

Eastern Egg Rock went from the 1880s to 1977 with no resident puffins. This year, the number of breeding pairs hit a record 173. Where there were two to four birds left in the entire state of Maine in 1902, there are 1,300 pairs today. Photo: Derrick Jackson

Getting it right

My jaw dropped at how bird science is making world travelers of this crew beyond Egg Rock. Brazier has worked with African penguins in South Africa, loggerhead turtles in Greece, and snowshoe hares in the Yukon, and this fall she is headed to Midway Atoll for habitat restoration in key nesting grounds for albatross.

Eby has worked with foxes in Churchill, Manitoba; oystercatchers, murres, auklets, gulls, and petrels in Alaska; and ducks in Nebraska. Besides wolves in Minnesota, Tutterow has helped manage tropicbirds and shearwaters in the Bahamas, honeybees and freshwater fish in North Carolina, and loons in the Adirondacks. Aztorga-Ornelas has worked with oystercatchers and auklets on Mexican islands, and Dickey has helped restore bobwhite quail, deer, and wild turkey habitat in Texas.

Brazier said a huge reason she helped rehabilitate injured endangered African penguins in South Africa was because of her experience tending to them in college at the Maryland Zoo. “I actually didn’t get the section of the zoo I applied for,” she said. “I got the African penguin exhibit and when all these little fellas were running around my feet, it was the best day of my life.”

Though he is the youngest of the crew, Dickey said his quail and bobwhite work gave him self-sufficiency beyond his years. “My boss lived two miles away and my tractor had a flat four times. It was on me to fix it and I figured it out, even though it was hotter than hell every day, sometimes 110.”

Tutterow, the oldest, originally earned a bachelor’s degree in nursing at Appalachian State University, but found far more satisfaction in an outdoor career. Among her fondest childhood memories was her parents allowing her to wander the local woods to read at a spot on a rock by a creek. “You can build a lifestyle around any amount of income, but you cannot build happiness into every lifestyle,” she said. “Working with these animals, I’m building happiness for them and me.”

No myopic set of politics and denial of science should ever get in the way of this level of career happiness. Aztorga-Ornelas and I, despite her limited English and my stunted Spanish, shared a dozen “Wows!” sitting together in a bird blind, watching puffins zoom ashore with fish.

Eby said, “It’s strange for me. We just came out of a conservative government in Canada (under former Prime Minister Stephen Harper) where they stopped lake research for acid rain, fisheries, and climate change and government scientists did not feel the freedom to speak out. And now that we’re getting more freedom, I’m here. I hope the US can get it right soon.”


What’s My State Doing About Solar and Wind? New Rainbow Graphic Lets You Know

UCS Blog - The Equation (text only) -

[With costs dropping and scale climbing, wind and solar have been going great guns in recent years. Shannon Wojcik, one of the Stanford University Schneider Sustainable Energy Fellows we’ve been lucky enough to have had with us this summer, worked to capture that movement for your state and its 49 partners. Here’s Shannon’s graphic, and her thoughts about it.]

Do you ever wonder how much energy those rooftop solar panels in your state are contributing to renewable energy in our country? How about the wind turbines you see off the highway?

Our new “rainbow mountain” graphic lets you see your state’s piece of solar and wind’s quickly growing contribution to the US electricity mix. It shows how much of our electricity has come from wind and solar each month for the last 16 years. Just click on your state in the graph’s legend or roll your mouse over the graphic to see what’s been happening where you live.


At first glance, this graphic looks like a disorderly rainbow mountain range. Keep staring though (try not to be mesmerized by the colors) and you can start to see patterns.

The peaks and dips in the mountain range follow a pattern. The peaks, where the most electricity is supplied by wind and solar, come in spring, when demand (the denominator) is lower thanks to moderate temperatures and generation (the numerator) is high thanks to windy and sunny days. The crevasses, in July and August, happen because demand for electricity is high at those times thanks to air conditioning, increasing the overall load on the US grid—and driving up our calculation’s denominator. If you were to look just at monthly generation of wind and solar, this variation would be smaller.
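To make the numerator-and-denominator arithmetic concrete, here is a minimal Python sketch of the share calculation, using invented monthly figures purely for illustration (the real graphic is built from reported US generation and demand data):

    # A minimal sketch of the share calculation behind the graphic.
    # The numbers below are invented for illustration only.
    monthly = {
        # month: (wind_plus_solar_generation_twh, total_demand_twh)
        "April": (35.0, 300.0),  # spring: strong wind and sun, mild demand
        "July": (30.0, 400.0),   # summer: air conditioning swells demand
    }

    for month, (generation, demand) in monthly.items():
        share = 100 * generation / demand  # numerator over denominator
        print(f"{month}: {share:.1f}% of demand met by wind and solar")

With nearly the same generation in both months, the July share still comes out smaller, because the denominator grows.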

Another, much more obvious thing about the mountains is that they’re getting taller. In fact, we passed a notable milestone in March of 2017, when, for the first time, wind and solar supplied 10% of the entire US electricity demand over the month. In 2012, solar and wind had only reached 4.6% of total US generation, so the recent peak meant more than a doubling in just 5 years.

That’s momentum.

Climbers and crawlers

Being able to pick out individual states lets you see where the action is on wind and solar—which are the climbers and which are the crawlers.

You know the saying about how everything is bigger in Texas? Well, that certainly holds true here. Texas is the bedrock of this mountain range, never supplying less than 14% of the country’s wind and solar electricity after 2001, and supplying as much as 35% in some months. Texas hosts the most wind generation of any state and doesn’t seem to be in danger of losing that title anytime soon.

California is another crucial state in this mountain range, and has been from the beginning. A trendsetter, California was building solar and wind farms years before other states; in 2001, it supplied up to 75% of all the wind and solar electricity in the US. California is still the second-largest supplier of wind and solar.

Other notable states building this solar and wind mountain are Oklahoma, Iowa, Kansas, Illinois, Minnesota, Colorado, North Dakota, Arizona, and North Carolina. Most of these states are rising due to wind, but Arizona and North Carolina, along with California, are leading with solar.

Not all states with strong solar and wind performance by some metrics show up here. South Dakota is #2 for wind as a fraction of its own generation, though on this graphic it’s barely visible.

What does this mean?

This graphic shows that the momentum of solar and wind growth in the United States is undeniable. It can be seen on rooftops, in windy valleys and on windy plains, and even in states where coal has been king. All 50 states are involved as well, as every state generates electricity with wind and solar.

There are many ways for your state to increase its overall percentage. It can either decrease its denominator with energy efficiency or increase its numerator with wind and solar installations.

Not satisfied with where your state shows up on this graph? Check out what more your state can do.

Free Lunches in New York City Public Schools Are a Win for Kids—and Technology

UCS Blog - The Equation (text only) -

Photo: USDA

It’s so good to share good news.

This month, the New York City Public Schools announced that, starting with the current school year, all students can receive free lunch with no questions asked. That means less stigma for kids facing food insecurity, less worrying for families, and less paperwork for school districts. And it might surprise you to learn that at the heart of this victory—carried across the finish line by a group of dedicated advocates—is a fairly common application of technology.

The underlying policy at play here is called the “Community Eligibility Provision,” or CEP. It was authorized under the Healthy, Hunger-Free Kids Act of 2010 to help schools and local educational agencies with a high percentage of low-income students. As a colleague wrote on this blog in 2016, CEP helps school systems (like New York City Public Schools) reduce paperwork and poverty stigma while making sure that free and reduced-price meals are available to all kids who might need them. Instead of asking each family to fill out an application, CEP allows schools to determine student eligibility through household participation in programs like SNAP (the Supplemental Nutrition Assistance Program, commonly referred to as food stamps) and TANF (the Temporary Assistance for Needy Families program). If over 40 percent of students are deemed eligible, schools receive additional federal reimbursement dollars to cover free meals for more students beyond those who qualify—ensuring that even those whose families are not enrolled in federal assistance programs can still get meals if they need them.

So how is New York City able to cover free meals for all students?

Here’s the math answer: the CEP multiplier is 1.6, which means that if 50 percent of students at School X are eligible for free meals, School X can actually serve free meals to (50 percent) * (1.6) = 80 percent of students using federal reimbursement dollars. If New York City Public Schools are now receiving federal reimbursement for 100 percent of students, it would mean they have demonstrated that at least (100 percent) / (1.6) = 62.5 percent of students are eligible through CEP.
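Here is that arithmetic as a tiny Python sketch; the 1.6 multiplier is the CEP rule described above, while the function name and the 100 percent cap are just illustrative conventions:

    # A sketch of the CEP reimbursement arithmetic described above.
    CEP_MULTIPLIER = 1.6

    def covered_share(identified_share):
        """Fraction of students whose free meals draw federal reimbursement,
        given the fraction directly certified through programs like SNAP
        and TANF. Coverage is capped at 100 percent."""
        return min(identified_share * CEP_MULTIPLIER, 1.0)

    print(covered_share(0.50))   # 0.8  -> 80 percent of students covered
    print(covered_share(0.625))  # 1.0  -> full coverage, as in New York City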

Which brings us to the real-world answer: New York is able to cover free meals for all students because it got smart about its use of technology to better reflect true student need. The New York Department of Education website describes the new data matching engine it has developed to identify eligible students:

“This new matching system provides a more efficient and accurate process for matching students across a range of forms that families already complete. This new matching process yielded an increase in the number of students directly certified – or matched to another government program – and increased the direct certification rate, allowing the City to qualify for the highest level of reimbursement in the federal CEP program. The number of families living in poverty has not increased; the changes to the matching process allow the City to better identify families.”
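To see why this kind of matching is workhorse technology rather than rocket science, consider a deliberately oversimplified Python sketch. Everything here is hypothetical, from the field names to the exact match key; the city’s actual engine matches across many more forms and must handle messy, inconsistent records:

    # A toy version of direct certification by data matching.
    # Real systems use many more data sources and fuzzier matching.
    students = [
        {"id": 1, "name": "a. rivera", "dob": "2008-03-02"},
        {"id": 2, "name": "b. chen", "dob": "2009-11-17"},
    ]
    # Enrollment records from another program (e.g., SNAP), keyed the same way.
    snap_enrollees = {("a. rivera", "2008-03-02")}

    directly_certified = [
        s for s in students if (s["name"], s["dob"]) in snap_enrollees
    ]
    rate = len(directly_certified) / len(students)
    print(f"Direct certification rate: {rate:.0%}")  # 50%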

Why the technology matters

I know what you’re thinking. It’s awesome that all kids in New York City Public Schools can eat for free! But why make such a big deal about this technology? It doesn’t seem like rocket science.

Bingo.

New York City Public Schools is not using a particle accelerator to improve data matching among students. They haven’t even used a 3-D printer. The data integration and management systems they’re employing, while complex, are actually fairly commonplace. It’s the same sort of technology banks use to combine different databases of credit scores and application information to make credit offers, which is the same technology Netflix uses to deduce that because you watched Good Burger, you might like Cool Runnings. (Hypothetically speaking.)

Yet when it comes to the use of technology in the administration of nutrition assistance programs, we have fallen remarkably behind. The transition from actual paper food stamps to electronic benefit cards officially concluded in 2004, nearly fifty years after the introduction of the first major credit card. Even now, some states (looking at you, Wyoming!) require SNAP applications to be faxed, mailed, or returned in person.

To be clear, I’m not claiming technology is a silver bullet. For one, implementing new technology often comes with a price tag—and a steep learning curve. (Just ask Kentucky.) In particular, the use of data matching raises ethical concerns related to privacy and security, and these are not to be overlooked. But in many cases, these are arguments to improve, rather than disregard, the technology and the policies that guide its use. Because when our public assistance programs fall behind, so do the people who rely on them, and so does our ability to deliver maximum public benefit with increasingly limited resources. It is critical (and just plain sensible) to use the tools at our disposal to help realize the potential of current technological systems to enhance the strength and efficiency of the federal safety net. 

Carrying the momentum in the 2018 farm bill

Keep an eye on this issue. There is reason to suspect that the advancement of technology in public assistance programs will be addressed in the 2018 farm bill, and even reason to hope for a bipartisan effort. In fact, I’ll take the opportunity to quote Glenn Thompson, chairman of the House Agriculture Nutrition Subcommittee, who opened a June hearing on SNAP technology and modernization with this sentiment: “We need to get the policy right. As we approach the upcoming farm bill, it is critical we understand opportunities to amend and improve the program to properly account for the changes that come with our evolving, technological world.”

Bringing Down the House: A Hostile Takeover of Science-Based Policymaking by Trump Appointees

UCS Blog - The Equation (text only) -

The Trump administration is slowly filling positions below the cabinet-officer level in the “mission agencies” of the federal government (e.g., EPA, NOAA, Interior, and DOE, whose job is to implement specific sets of statutory mandates). The appointed individuals are leading day-to-day decision-making on policies from public health and safety to environmental protection to critical applied science programs. In other words, the decisions these appointees make will affect everyone in the country.

The job of the agencies and their political leadership is to represent the public interest. It is not to serve the private interests of particular industries and companies, or even to push political viewpoints, but to implement legislative mandates in the interest of the American public. After all, who else but government can do this? Our laws call for the water and air to be clean, our workers and communities to be safe, our environment to be healthy and our science to be robust and fundamental to better policy and decision-making. That is what mission agencies are tasked to do.

So, what have we seen so far? To be sure, the administration has nominated and appointed some qualified individuals with good experience and few apparent conflicts of interest. But unfortunately, that is not the norm. In my mind, most of the key appointments with responsibility for science-based policymaking fall into three categories:

  • The conflicted: Individuals who have spent a significant part of their careers lobbying the agencies they are now appointed to lead to obtain more favorable policies to benefit specific industries or companies—and who will likely do so again once they leave the government. These individuals have a conflict of interest because of these connections. Despite President Trump’s call to “drain the swamp,” these appointees are well-adapted and key species in that very swamp (sorry, my ecologist background showing through).
  • The opposed: Individuals who have spent much of their careers arguing against the very mission of the agencies they now lead. This group is not entirely separate from the first, because often they made those arguments on behalf of corporate clients pushing for less accountability to, or oversight from, the American public. But further, they have opposed the very role played by the federal agencies they are appointed to serve. While they may have conflicts of interest like the first group, they also have an expressed anti-agency agenda that strongly suggests they will work to undermine the agency’s mission.
  • The unqualified: Individuals who are wholly unqualified because they lack the experience, training, or credentials requisite for the job. Again, these appointees may also have conflicts of interest and political agendas opposed to the missions of their agencies, but above all they have no real place leading a complex organization that requires specific expertise.

With more than 4,000 possible political appointments to federal agencies, I of course cannot cover them all. In fact, scanning the list of roughly 600 appointments requiring Senate confirmation, fewer than one-third have even had nominees put forward for Senate action. But here is a disturbing set of nominees and appointments that undermine science-based policymaking.

The conflicted

William Wehrum is a lawyer and lobbyist nominated to lead the EPA Office of Air and Radiation (OAR). He previously worked at EPA during the G.W. Bush Administration. UCS opposed his nomination then. Mr. Wehrum’s corporate clients include Koch Industries, the American Fuel and Petrochemical Manufacturers, and others in the auto and petrochemical industries. He has been a vocal spokesperson against addressing climate change under the Clean Air Act, which would be part of his responsibility as OAR director. While he has advocated for devolving more authority to the states for addressing air pollution generally, he also opposed granting California a waiver under the Clean Air Act to regulate greenhouse gas emissions from vehicles. Mr. Wehrum has also been directly involved, both as a lobbyist for industry and during his previous stint at EPA, in efforts to subvert the science concerning mercury pollution from power plants, restrictions on industrial emissions, as well as lead, soot and regional haze regulations.

Dr. Michael Dourson has been nominated to be EPA Assistant Administrator for Chemical Safety and Pollution Prevention. He is well known by the chemical industry, having spent years working as a toxicologist for hire for industries from tobacco to pesticides and other chemicals. Dr. Dourson has argued that the pesticide chlorpyrifos is safe despite a large body of science to the contrary. He has advocated for the continued use of a toxic industrial chemical called TCE, which the EPA determined was carcinogenic to humans by all routes of exposure. [TCE was the chemical linked to leukemia in children in the 1998 film “A Civil Action.”] When asked about his controversial chemical risk assessment company, TERA, receiving funding from chemical companies, Dourson responded: “Jesus hung out with prostitutes and tax collectors. He had dinner with them.”

Dr. Nancy Beck, appointed to the position of EPA Deputy Assistant Administrator, now leads the agency’s effort to implement the Lautenberg Chemical Safety Act, which was signed into law last year. Dr. Beck was previously senior staff with the American Chemistry Council, the trade organization that worked very hard for years to weaken the rules protecting the public from toxic chemicals. The result? The new rules from the EPA are far weaker than those developed by the professional staff at the agency and remarkably similar to the position the industry favored, while dismissing the positions of other members of the public and other organizations including UCS. Previously, Dr. Beck worked in the G.W. Bush Administration at the Office of Management and Budget. During that part of her career Dr. Beck was called out by the U.S. House Science and Technology Committee for attempting to undermine EPA’s assessment of toxic chemicals and her draft guidance on chemical safety evaluations was called “fundamentally flawed” by the National Academy of Sciences.

Lest you think that the conflicted are all at EPA, consider David Zatezalo, nominated to be Assistant Secretary of Labor for Mine Safety and Health. He was formerly the chairman of Rhino Resources, a Kentucky coal company that received two letters from the Mine Safety and Health Administration for patterns of violations. A miner was subsequently killed when a wall collapsed, and the company was fined.

David Bernhardt has been confirmed as the Deputy Secretary of Interior. He was DOI Solicitor under the George W. Bush administration. In 2008, weeks before leaving office, Bernhardt shifted controversial political appointees who had ignored or suppressed science into senior civil service posts. While at his law firm Brownstein Hyatt Farber Schreck, he represented energy and mining interests and lobbied for California’s Westlands Water District. His position in the firm—he was a partner—and the firm’s financial relationship with Cadiz Inc. (which is involved in a controversial plan to pump groundwater in the Mojave desert and sell it in southern California) has led to one group calling him a “walking conflict of interest.” Bernhardt also represented Alaska in its failed 2014 suit to force the Interior department to allow exploratory drilling at the Arctic National Wildlife Refuge.

The opposed

Susan Combs has been nominated to be the Assistant Secretary of Interior for Policy, Management, and Budget. She was previously Texas’s agriculture commissioner and then the state’s comptroller, roles in which she often fought with the U.S. Fish and Wildlife Service over Endangered Species Act issues. Notably, she has a history of meddling in science-based policy issues like species protections. She has been deeply engaged in battling for property rights and against public interest protections; she once described proposed Endangered Species Act listings as “incoming Scud missiles” aimed at the Texas economy. Of course, protecting endangered species, biodiversity, and public lands is a major responsibility of the Department of Interior.

Daniel Simmons has been nominated to be the Principal Deputy Assistant Secretary of the Office of Energy Efficiency and Renewable Energy, which fosters development of renewable and energy-efficient technologies. He was previously Vice President at the Institute for Energy Research, a conservative organization that promotes fossil fuel use, opposed the Paris Climate Accord, and opposes support for renewable energy sources such as wind and solar. He also worked for the American Legislative Exchange Council (ALEC) as director of its natural resources task force. ALEC is widely known for advocating against energy efficiency measures.

The unqualified

Sam Clovis, the nominee for Undersecretary of Agriculture for Research, Education and Economics, effectively the department’s chief scientist, is neither a scientist nor an economist, nor does he have expertise in any scientific discipline relevant to his proposed position at USDA, like food science, nutrition, weed science, agronomy, or entomology. What he does have is a record of denying the evidence of a changing climate. He was a talk radio host with a horrendous record of racist, homophobic, and other bigoted views, which should be disqualifying in themselves.

Albert Kelly has been appointed a senior advisor to EPA Administrator Scott Pruitt and the Chair of the Superfund Task Force. He is an Oklahoma banker with no experience with Superfund or environmental issues, but he was a major donor to Mr. Pruitt’s political campaigns. So far the task force has focused on “increasing efficiencies” in the Superfund program.

Over at NASA, the nominee for Administrator is Rep. James Bridenstine (R-OK). While he certainly has government and public policy experience (a plus), he does not have a science background, a management background, or experience with the space program. He has called aggressively for NASA to focus on space exploration and returning to the moon, rather than on its earth science mission. In addition, he has been a strong advocate for privatizing some of the work of the agency. He has questioned the science on climate change and accused the Obama Administration of “gross misallocation of funds” for spending on climate research.

Michael Kratsios is the Deputy Chief Technology Officer and de facto head of the Office of Science and Technology Policy in the White House. He is a former aide to Silicon Valley executive Peter Thiel and holds an AB in politics from Princeton with a focus on Hellenic Studies. He previously worked in investment banking and at a hedge fund. How this experience qualifies him to be deputy chief technology officer is beyond me.

Can we have science-based policies?

This is by no means a full list of egregious nominees for positions that will have a big impact on our daily lives. So the question remains: is science-based policymaking a thing of the past? Will the conflicted, the opposed, and the unqualified be the pattern for the future?

Fortunately, we can and should fight back. We as scientists, concerned members of the public, and activists can call on our elected officials to oppose these nominees. If they are confirmed anyway, they can be held to account by Congress, the courts, and, yes, the court of public opinion. Handing over the fundamental job of protecting the public to champions of regulated industries and political ideologues is wrong for all of us. After all, if industry could be counted on to protect the public from health and environmental harms, regulatory controls would be superfluous.

We can’t just wring our hands and wish things didn’t go this way. Conflicted, opposed and unqualified they may be, but they are now in public service. Let’s hold them to account.

How Freight Impacts Communities Across California

UCS Blog - The Equation (text only) -

Photo: Luis Castilla

Today, UCS and the California Cleaner Freight Coalition (CCFC) released a video highlighting the impacts of freight across California. This video – and longer cuts of individual interviews here – touches on the many communities across California affected by freight.

Freight is a big industry in California. Nearly 40 percent of cargo containers entering and leaving the United States pass through California ports. California is also the largest agricultural producing state, supplying nearly one-fifth of the country’s dairy, one-third of its vegetables, and two-thirds of its fruits and nuts.

Truck traffic on I-5 heading north towards the Central Valley near Castaic, CA.

Farm in Shafter, CA.

This means California is home to many ports, rail yards, warehouses, distribution centers, farms, and dairies – all of which are serviced by many trucks. Despite the latest (2010) engine standards and significant financial investments by the state and local air districts, air quality in California remains among the worst in the United States, due in large part to truck emissions.

The most polluted cities in the United States. Source: American Lung Association, State of the Air 2016.

Communities impacted by freight are often burdened by other sources of pollution

In the Central Valley, a trash incinerator is opposed by community groups yet classified by the state as a source of renewable energy. Biomass power plants emit significant amounts of particulate matter. Oil drilling operations contribute to air pollution and to water contamination of unknown extent.

Dairies in the Valley contribute not only to methane emissions, but also to other health hazards, including particulate matter (from reactions of ammonia in excrement with nitrogen oxides (NOx) from cars and trucks), smog/ozone (from reactions of NOx with volatile organic compounds produced by decomposing animal feed), and contamination of aquifers. Just as real estate prices drove dairies from the Inland Empire to the Central Valley, warehouses and distribution centers are following suit, despite being 150 miles from the Ports of Los Angeles and Long Beach.

Silage (animal feed) pile near Shafter, CA.

Two views of a large Ross Distribution Center in Shafter, CA (measures over 1 mile around the building and 2 miles around the entire lot).

In the Los Angeles region, not only are roadways and the two ports major concerns for communities, but so are oil refineries and over 1,000 active oil drilling sites.

Most of these urban oil sites are within a few football fields of homes, schools, churches, and hospitals. Despite all of the “green” accolades bestowed on California, it is the third-largest oil producer in the United States after Texas and North Dakota.

Pumpjacks in California can be found next to farms, hospitals, and even In-N-Out.

So what’s the solution?

For trucks, we need stronger engine standards for combustion vehicles, commitments to and incentives for zero-emission vehicles, and roll-out of battery charging stations and hydrogen fueling stations with electricity and hydrogen from renewable energy.

Just last week, the California legislature passed bills (1) integrating zero-emission trucks into state-owned fleets and (2) allocating $895 million from cap and trade revenue for cleaner heavy-duty vehicles. The California Cleaner Freight Coalition is working on a range of solutions from the state to the local level, and UCS is proud to be a member of this coalition. Watch and share the video!


Tax Credits and Rebates for Electric Cars Benefit US Drivers and Automakers

UCS Blog - The Equation (text only) -

Leadership on vehicle electrification is critical to tackling climate change, protecting consumers from volatile oil prices, maintaining the competitiveness of US automakers, and creating 21st century manufacturing jobs. However, electric vehicles (EVs) currently cost more to manufacture than comparably sized gasoline-powered vehicles, which can mean higher prices and slower adoption.  One important policy solution to help accelerate the rate of EV sales is to offer purchase incentives to potential EV buyers, as discussed in a new policy brief “Accelerating U.S. Leadership in Electric Vehicles” that I co-authored with my UCS colleague Josh Goldman.

Incentives, such as tax credits and rebates, encourage EV sales while automakers scale up manufacturing and technology improves. Much of the additional cost of making an EV is due to the battery, and this scale up of EV manufacturing, along with improved and novel battery technology, will reduce the cost of manufacturing EV batteries and make EVs more cost competitive.

Modern EVs have been offered for only seven years, yet during that time we have seen impressive reductions in the cost of producing automotive battery packs. Initially, EV battery packs were estimated to cost over $750/kWh of storage capacity. Battery costs have now fallen to around $200/kWh, with further reductions predicted by industry analysts. Once battery costs reach the range of $125-$150/kWh, the costs of EVs are projected to reach parity with conventional vehicles.

As battery costs continue to decline, the cost difference between EVs and conventional gasoline vehicles will fall, although the exact date at which EVs achieve cost parity ($125-150 per kWh) depends on the rate of EV sales and other factors. References for data sources available online.

It may make sense to reduce broadly-available incentives after EVs become more price competitive, but removing them too soon would stall U.S. leadership in a critical technology.

The US federal income tax credit, in particular, is a vital investment in the transition to electric vehicles. It provides up to $7,500 per EV, based on the size of the battery. Most battery-electric and long-range plug-in hybrids qualify for the full credit value. However, the credit begins to phase out for a manufacturer once it sells 200,000 electric vehicles in the US.
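For the curious, the credit’s dependence on battery size follows a simple formula: a $2,500 base, plus $417 for a battery of at least 5 kilowatt-hours, plus $417 for each kilowatt-hour above 5, capped at $7,500 (a vehicle needs at least a 4 kWh battery to qualify). A minimal sketch of that calculation:

    # Sketch of the federal plug-in vehicle credit formula (before phase out):
    # $2,500 base + $417 for a battery of at least 5 kWh
    # + $417 per kWh above 5 kWh, capped at $7,500.
    def federal_ev_credit(battery_kwh):
        if battery_kwh < 4:            # minimum battery size to qualify
            return 0
        credit = 2500
        if battery_kwh >= 5:
            credit += 417 + 417 * (battery_kwh - 5)
        return min(round(credit), 7500)

    print(federal_ev_credit(60))   # a Bolt EV-sized pack: 7500
    print(federal_ev_credit(16))   # a long-range plug-in hybrid: 7500
    print(federal_ev_credit(8.8))  # a smaller plug-in hybrid: 4502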

Market leaders General Motors, Nissan, and Tesla are each already over 100,000 cumulative EV sales as of mid-2017. General Motors and Tesla will likely hit the phase out first, probably before the end of 2018, especially if their new, more affordable long-range EVs (Chevy Bolt EV and Tesla Model 3) sell well. This phase out therefore has the perverse effect of penalizing some of the leaders in EVs, notably EVs coming off assembly lines in the US (including all Tesla, General Motors, and Nissan EVs sold in the US), while other manufacturers like Honda would have incentives available for years to come.

The federal EV income tax credit phases out for a manufacturer’s EV models once they exceed 200,000 sales. General Motors and Tesla are on pace to hit the sales cap within 18 months, and Nissan is not far behind.

State incentives are also important to accelerate the switch from gasoline to electricity for our driving. The largest program, California’s Clean Vehicle Rebate Project, has helped over 200,000 buyers make the change to electric drive. And other states have also stepped up to support the transition to cleaner cars. For example, Josh Goldman blogged recently about Oregon’s newly enacted EV rebate program.

Increasingly, we are seeing studies that predict sales of EVs will overtake gasoline cars in the next 10-20 years. However, it is still important to support the nascent EV industry, both to increase the number of EVs on the road now and to support the US automakers that are leading this vital transition.

Purchase incentives for plug-in EVs have been a critical policy tool, accelerating the manufacture and adoption of EVs and making them accessible to car buyers. These investments in EV technologies are helping automakers transition to new technologies and enabling Americans to drive cleaner and cheaper.

In particular, the federal EV tax credit is essential, not only for US drivers but also for US manufacturers. With a number of countries announcing bold EV efforts (such as France, China, and India), letting the tax credit expire for leading US EV manufacturers could be a costly mistake.

Now is not the time to end a policy that works. Instead, the federal government should extend the credit to ensure continued progress, build upon success, and keep the United States in the lead with 21st century automotive technology.

Will Republican Mayors Crack the Party’s Wall of Climate Denial?

UCS Blog - The Equation (text only) -

Hurricane Irma approaches landfall in southern Florida, September 10, 2017. Photo: NASA

If any set of Republicans cracks the party’s wall of denial on climate change, it will be those most responsible to deal with its ravages—mayors. That was reaffirmed during Hurricane Irma when Miami’s Republican mayor, Tomas Regalado, told the Miami Herald:

“This is the time to talk about climate change. This is the time that the president and the EPA and whoever makes decisions needs to talk about climate change. If this isn’t climate change, I don’t know what is. This is a truly, truly poster child for what is to come.”

Irma offered Regalado a crescendo for a message he and other Republican mayors in Florida have repeatedly sent to party leaders. During the 2016 Republican presidential primaries, he and fellow Republican James Cason, who this year retired as mayor of Coral Gables, penned an op-ed in the Miami Herald challenging Republican candidates to understand that the rising seas of climate change will make South Florida “unrecognizable.”

Describing themselves as “staunch” Republicans who otherwise are suspicious of federal regulations, Regalado and Cason wrote that “we shouldn’t waste time” debating the science. Rather, they said it was time to debate how to respond.

“We can debate ways to develop clean energy, how to put a price on carbon and how to protect coastline communities from flooding and storms,” they wrote. “We can debate ways to grow the economy and create new jobs while protecting lives and property from climate change.”

In an interview last year with National Public Radio, Cason said climate change affects everything from low-lying schools and hospitals to the owners of $5 million homes who “see their property values go down because they can no longer get a boat out. When they start flooding, whenever that is, when do they stop paying taxes?”

Republican mayors do not have the luxury of flying at the 30,000-foot level of denial, as Florida Governor Rick Scott is accused of doing by banning “climate change” from official state communications, as EPA Administrator Scott Pruitt does in claiming it is “insensitive” to connect climate change to Hurricane Irma, and as Arizona Senator John McCain did in making news as a lonely Republican calling merely for “common-sense measures” on climate change.

Rather, Republican mayors say they fret along with their coastal Democratic counterparts about chronic high-tide inundation and the rising intensity and restoration costs of storms.

In Coral Gables, the new mayor, Republican Raul Valdes-Fauli, is a former tax attorney for oil companies. During his campaign, he said climate change “is not just a feel-good issue, it is a very vital issue for Coral Gables.”

In Palm Beach County, Steve Abrams, a Republican who was mayor and is currently a commissioner, last year told the British Guardian newspaper, “We don’t have the luxury at the local level to engage in these lofty policy debates. I have been in knee-deep water in many parts of my district during King Tide.”

In other parts of the nation far from hurricane zones, other Republican mayors have begun to speak out. Carmel, Indiana Mayor James Brainard vociferously opposed President Trump’s pullout of the Paris climate agreements. In June, he told National Public Radio that the Midwest is “at risk for all sorts of bad things,” particularly “the frequency and intensity of storms” and “the evolution of new pests in the fields outside our metro areas.”

Brainard said his city is taking many steps to fight climate change: making the city more walkable, switching street lights to LEDs, adding parkland, and ordering all fleet vehicles to be either hybrids or run on alternative fuel. He said that, on a one-on-one basis at least, he stops critics of green investment in their tracks with the city’s dramatically lower electricity costs.

He told NPR that by ignoring the economic benefits of going green, the Trump administration has “missed a political opportunity to expand their base. They’re speaking to a very small portion of the Republican base, and it’s a big missed opportunity for them.”

And in America’s largest city to be governed by a Republican mayor, San Diego’s Kevin Faulconer continues to set the pace for many of his Democratic big-city counterparts with his city’s plan for all-renewable energy by 2035. Despite some local criticism that his administration is not working fast enough on bike lanes, tree canopies, and giving communities more say in where their electricity comes from, Faulconer remains a standard bearer for Republicans who see that wildfires and rising seas are tied to fossil fuels. Last month, he told the prestigious Commonwealth Club in San Francisco:

“It’s time for today’s California Republicans to stop ignoring climate change. If we opt out of the conversation, we’re only going to get extreme one-party solutions. We should be proud to offer our own plans to preserve our environment—plans that don’t plunder the middle class.”

The Republican Party may not yet be proud of the likes of Faulconer and Regalado. But based on the battering of the East Coast and Gulf Coast by hurricanes over the last decade and a half, the blistering heat of the West, the increasing intensity of storms in the Midwest, and the northward spread of crop pests and of diseases carried by mosquitoes and ticks, the party does not have long to opt in to the conversation.

California’s 100% Clean Energy Bill Faces Setback—But Progress Continues

UCS Blog - The Equation (text only) -

California’s Capitol Building. Photo: Henri Sivonen/CC BY (Flickr)

The California Legislature failed to bring Senate Bill 100 (De León) up for a full vote on Friday. Had SB 100 passed and been signed into law, it would have accelerated the state’s primary renewable energy program, known as the Renewables Portfolio Standard (RPS), by raising the current requirement from 50 to 60 percent by 2030. It also would have set an ambitious new policy for all electricity produced in the state to come from zero-carbon resources by 2045.

Since Friday was the deadline to move bills for the regular 2017 legislative session, the bill is stalled but not dead. In fact, Assembly member Chris Holden, the chair of the committee where the bill stalled, has said the issues will be revisited in 2018.

Let’s take stock of where we are today: in 2016 California received about 25% of its electricity from eligible renewables. Another 19% came from a combination of nuclear and large hydropower, which are zero-carbon resources that would be eligible under SB 100. Statewide we are already on track to exceed the current RPS requirement of 50% by 2030. In the past several years California has made great strides to continue its position as a worldwide clean energy leader, and current policies in place ensure that the momentum will continue.

I am disappointed, but not discouraged. I spent a good bit of time working on SB 100 this year, and to me the fact that we couldn’t pass it in one year is not cause for despair. As I’ve said before, setting a goal to completely decarbonize California’s electricity sector by 2045 is bold and aspirational, and it should not be a surprise that a big new energy policy will take multiple legislative sessions to hammer out some of the details.

I am also encouraged that conversations at the end of the year were not about whether a zero-carbon electricity grid is the right path for California’s future but rather what that path should look like. I look forward to continuing the discussion and negotiation in January when the legislature returns. Reducing carbon emissions and air pollution by transitioning away from fossil fuels is one of the most important actions our country and world must take to avoid the worst consequences of climate change. While California’s share of global emissions is relatively small, transitioning completely away from fossil fuel-based electricity for the world’s sixth-largest economy would break new, important ground for other states and countries to follow. 2018 should be an exciting year.

What the Northeast Could Build With a Transportation Cap and Invest Program

UCS Blog - The Equation (text only) -

While the Northeast region struggles to make significant progress in reducing pollution from transportation, our neighbors and allies in California and Canada are investing billions of dollars in clean mobility solutions thanks to their successful implementation of a cap and invest program covering transportation emissions.

Today California finalized its plan to invest over $2 billion over the next year on initiatives designed to reduce our use of oil and pollution from transportation. These investments will make it easier for California residents to purchase an electric vehicle, or to save money by trading in an old gas guzzling car for an efficient conventional vehicle or hybrid. They will improve public transportation services, both in California’s big cities and its small towns and rural counties. They will provide more affordable housing in communities near public transportation. And they will create jobs, reduce emissions, and save consumers money.

Meanwhile, our neighbors in Ontario and Quebec are projected to spend $2.1 billion and $1.9 billion, respectively, on clean transportation programs by 2020.

These jurisdictions are making investments on a far greater scale than anything currently happening in any state in the Northeast. They are able to do so because unlike the Northeast, California, Ontario, and Quebec have enacted a comprehensive climate policy that establishes enforceable limits on pollution from transportation, holds polluters accountable for their emissions, and provides a dedicated funding source for clean transportation investments.

This policy, known as “cap and trade” but more accurately called “cap and invest,” is run through the increasingly misnamed “Western” Climate Initiative (or WCI), an international carbon market that now limits emissions in a region covering over 60 million people in the United States and Canada.

Cap and invest is not new to the Northeast. Under the Regional Greenhouse Gas Initiative (or RGGI), the Northeast established the first market-based limit on pollution from power plants, and used the funds generated by the program to invest in efficiency and clean energy. Thanks in part to this policy, Northeast states have dramatically reduced pollution from electricity. Unfortunately, the Northeast states have yet to take the next logical step and enact a similar policy to limit emissions from transportation, which is now the largest source of pollution in the region.

As a result, Northeast states are missing out on an opportunity to make investments that will reduce pollution, save consumers money, increase economic growth, create jobs, improve public health, and reduce our use of oil. If the Northeast had a program similar to WCI covering transportation pollution, it could raise up to $4.7 billion every year for clean transportation initiatives in the Northeast.

Here are some of the things that we could build in the Northeast with a cap and invest program:

Better transit

Unlike diesel and natural gas vehicles, electric trucks and buses, like the BYD articulated bus pictured here, produce no hazardous exhaust emissions.

At a time when we need to be making transformative investments in public transportation, the transportation agencies tasked with maintaining and expanding our public transportation systems are broken. While public transit use is near an all-time high, a variety of factors, including inflation and increasing fuel efficiency, are eroding real gas tax revenues. Limited transportation funding has led to several well-publicized transit failures in New York City, Boston, New Jersey, and elsewhere in the Northeast.

Projected annual revenues at $14.75 per ton, in millions of dollars, with allocations by category:

State           Revenues   Transit (48%)   Sustainable Communities (26%)   Clean Vehicles (26%)
Connecticut       246.33          118.24                           64.04                  64.04
Delaware           67.85           32.57                           17.64                  17.64
D.C.               17.70            8.50                            4.60                   4.60
Maine             143.08           68.68                           37.20                  37.20
Maryland          452.83          217.36                          117.73                 117.73
Massachusetts     469.05          225.14                          121.95                 121.95
New Hampshire     109.15           52.39                           28.38                  28.38
New Jersey        954.33          458.08                          248.12                 248.12
New York         1181.48          567.11                          307.18                 307.18
Pennsylvania      983.83          472.24                          255.79                 255.79
Rhode Island       66.38           31.86                           17.26                  17.26
Vermont            53.10           25.49                           13.81                  13.81
Total            4745.08         2277.64                         1233.72                1233.72
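The splits in the table are just fixed percentage shares applied to each state’s projected revenue. A minimal sketch of that arithmetic in Python (revenue figures are taken from the table; tiny differences from the table are rounding):

    # Sketch: allocating projected cap-and-invest revenue by the table's shares.
    shares = {"Transit": 0.48, "Sustainable Communities": 0.26, "Clean Vehicles": 0.26}
    revenues_millions = {"Connecticut": 246.33, "New York": 1181.48}  # from the table
    for state, revenue in revenues_millions.items():
        allocation = {name: round(revenue * share, 2) for name, share in shares.items()}
        print(state, allocation)
    # Connecticut: Transit 118.24, the other two ~64.05 each
    # New York:    Transit 567.11, the other two 307.18 each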

Almost half of the transportation funding from California’s program will go towards improving public transportation services in the state. The long list of programs and projects that will be funded (at least in part) from California’s climate program includes transit expansions in every major metro region, high speed rail, bus service improvements in dozens of small towns and rural counties, replacement of diesel buses with electric buses, and programs to provide low or reduced fares for low income residents and college students.

Clean vehicles

Both California and (most) Northeast states offer rebates to make electric vehicles more affordable for drivers, but California’s programs are larger and more comprehensive, and they more specifically target moderate- and low-income drivers. For example, low-income drivers who trade in a gas guzzler for an electric vehicle can qualify for a rebate of up to $14,000 through the state’s Enhanced Fleet Modernization Program.

California is also expanding its efforts to provide vehicle financing assistance to help residents who lack the credit to purchase or lease clean vehicles. These investments have helped California achieve electric vehicle sales six times higher than those in the Northeast.

California also provides a rebate of up to $110,000 for businesses that replace diesel buses and trucks with zero-emission vehicles, which can have a dramatic impact on air quality in low-income communities. Finally, California is using funds from their climate program to build electric car-sharing networks in Los Angeles and Sacramento.

Sustainable communities

People want to live in communities with access to multiple transportation choices, if they can afford it. But rising demand for and limited supply of transportation-accessible housing are contributing to a housing affordability crisis that is impacting every major metropolitan area in the Northeast. Five of the eight metro areas with the highest monthly rent in the United States are located in the Northeast; the other three are in California.

High housing costs have enormous implications for racial and economic equity. The cost of housing also has a significant impact on climate emissions. As families find themselves unable to afford communities with strong transportation choices they are forced to relocate to communities with cheaper rent but higher fuel consumption.

California has spent over $700 million to date from its climate program on affordable housing and sustainable community programs. The largest of these programs is the Affordable Housing and Sustainable Communities program (AHSC), which provides grants for affordable housing and bike and pedestrian infrastructure projects that reduce global warming emissions. In the most recent year for which data are available, AHSC-funded projects created 2,427 affordable housing units near transit that will reduce emissions by over 800,000 metric tons.

Pollution from transportation is the largest source of emissions in the Northeast region, responsible for over 40 percent of our total emissions. Solving this problem is going to require bold new policies to transition our transportation system away from gas guzzling automobiles towards electric vehicles, transit, and sustainable communities. Cap and invest is a policy model that has been proven to be effective, in the Northeast under RGGI, and as a strategy to reduce transportation emissions in California, Ontario and Quebec. We encourage the Northeast states to consider adopting this model as a key component of our strategy to promote clean transportation in the region.

We Visualized the US Nuclear Arsenal. It’s Not Pretty.

UCS Blog - The Equation (text only) -

International security experts often refer to the twin goals of military policy: to minimize the risk of war and to minimize the damage should war start.

Because nuclear weapons are so destructive, the goal must be to eliminate—and not just minimize—the risk of nuclear war, which will require eliminating nuclear weapons.

Until then, it is essential that nations with nuclear weapons minimize both the risk and consequences of a nuclear war.

Numbers matter.

The consequences are directly related to the numbers of weapons used—which is limited by the number of weapons a nation has. Depending on the targets, the use of even a small number of weapons can result in horrific consequences.

For example, climate scientists using the latest climate models find that if India and Pakistan each used 50 of their weapons against the other’s cities, fires would inject so much soot into the atmosphere that the global climate would be affected for a decade. The decreased sunlight and lower temperatures would result in lower agricultural productivity and could lead to the starvation of over 1 billion people. This would be in addition to the people directly killed by the weapons.

Policy-makers and military officials often refer to the US nuclear arsenal as “our deterrent,” as if it were some sort of disembodied force rather than actual weapons. A “deterrent” is obviously a good thing, whereas “nuclear weapons” are more problematic. So, let’s take a look at what this “deterrent” actually consists of.

We have a new web graphic that displays all the US nuclear weapons. It provides a step-by-step visualization of the weapons the US deploys on land-based missiles in underground silos, on submarines, and on aircraft. All of these 1,740 weapons are ready for use.

But that is not all. The graphic then adds in the weapons the US keeps in storage for potential future use.

It all comes to a whopping 4,600 nuclear weapons.

Take a look—you can find more detail about the arsenal on the final page by hovering over each dot.

Policy matters, too.

Policy is what determines the risk of nuclear war.

As the graphic notes, the US keeps its land-based missiles on hair-trigger alert. Why? To allow the option of launching all these weapons in response to warning of an incoming attack from Russia. The warning is based on data from US satellites and ground-based radars, which is processed by computers.

No problem there, right? Wrong—not surprisingly, there have been false alarms in the past. Even more troubling, it takes only some 25 minutes for a missile to travel between Russia and the US. By the time the data has been analyzed, the president has about 10 minutes to decide whether or not to launch US missiles.

Which brings us to the next issue highlighted by the graphic: the president has the sole authority to use the 1,740 deployed nuclear weapons—meaning he or she can order an attack of any kind without the input of anyone else. Unless there is reason to think the president is incapacitated (e.g., drunk), the military is obligated to follow orders and launch.

Finally, it turns out that US nuclear weapons are not just a “deterrent” to dissuade other countries from using nuclear weapons first because the US could respond in kind. US policy also allows the first use of nuclear weapons against another country.

The US could reduce the risk of nuclear war by changing these three policies—by removing its land-based missiles from hair-trigger alert and eliminating launch-on-warning options from its war plans; by requiring the involvement of other people in any decision to use nuclear weapons; and by adopting a no-first-use policy. To reduce the potential consequences of war, it would need to dramatically reduce its arsenal.

And taking these steps would still leave the US with a strong “deterrent.”

Consumer Product Safety Commission Takes On Flame Retardants

UCS Blog - The Equation (text only) -

In 2014, Earthjustice and Consumer Federation of America, on behalf of a broad coalition of health, consumer, science and firefighter organizations, petitioned the Consumer Product Safety Commission (CPSC) to ban a class of flame retardants, additive organohalogen flame retardants, from children’s products, furniture, mattresses, and electronic casings as hazardous substances.

Graphic: Consumer Federation of America

The CPSC is considering whether to grant the petition and held a public hearing yesterday, at which I testified regarding the risks of flame retardants in households and the way in which flame retardant manufacturers and their lead trade association, the American Chemistry Council (ACC), have fought hard to keep these hazardous products on the market. The ACC is no stranger to the disinformation playbook. As we documented in our 2015 report, Bad Chemistry, the ACC has wielded influence to delay and quash important safeguards on a long list of chemicals, has funded science to exaggerate the chemicals’ effectiveness at lowering fire risk, and has employed innocuous-sounding front groups to do its dirty work without disclosing its relationship to them. A 2012 Chicago Tribune series did an excellent job of bringing much of the trade association’s activities to light.

The Commission will vote next week to determine whether to grant the petition and begin developing a proposed rulemaking to ban these chemicals. We hope that the Commission will heed the recommendations of a long list of scientific, public health, and legal experts who agree that the CPSC has the legal authority and the scientific backing to ban these chemicals.

My testimony is below.

Good afternoon, I would like to thank Chairwoman Buerkle and the CPSC Commissioners for the opportunity to testify before you today on this important issue. My name is Genna Reed. I am the science and policy analyst at the Center for Science and Democracy at the Union of Concerned Scientists. With more than 500,000 members and supporters across the country, we are a national, nonpartisan, non-profit group, dedicated to improving public policy through rigorous and independent science. The Center for Science and Democracy at UCS advocates for improved transparency and integrity in our democratic institutions, especially those making science-based public policy decisions.

The Union of Concerned Scientists stands with other members of the scientific community in supporting this petition calling upon the Consumer Product Safety Commission (CPSC) to declare organohalogen flame retardants (OFRs) as a hazardous class of chemicals and to ban their use in children’s products, furniture, mattresses and the casings surrounding electronics. The scientific evidence laid out in the petition supports this regulatory change. The CPSC has the authority to protect the public from toxic substances that “may cause substantial personal injury or substantial illness.”

Since the Center’s inception, we have worked to protect scientific integrity within the federal government and called attention to instances of special interests mischaracterizing science to advocate for specific policy goals. The chemical industry and its trade association, the American Chemistry Council, have worked to sow doubt about the science revealing the health harms of chemicals, including flame retardants; that work is an egregious example of this inappropriate behavior.

The companies that manufacture OFRs have put significant time and money into distorting the scientific truth about these chemicals. As a 2012 Chicago Tribune investigative series noted, the chemical industry “has twisted research results, ignored findings that run counter to its aims and passed off biased, industry-funded reports as rigorous science.” In one case, manufacturers of flame retardants repeatedly pointed to a decades-old government study, arguing the results showed a 15-fold increase in time to escape fires when flame retardants were present. The lead author of the study, however, said industry officials “grossly distorted” the results and that “industry has used this study in ways that are improper and untruthful,” as the amount of flame retardant used in the tests was much greater than would be found in most consumer items. The American Chemistry Council has further misrepresented the science behind flame retardants by creating an entire website to spread misleading ideas about flame retardants as safe and effective, even though research has consistently shown their limited effectiveness. In doing so, the American Chemistry Council and its member companies have promoted the prevalent use of OFRs at the expense of public health.

Looking at these chemicals through a strictly objective lens illustrates the need for CPSC’s swift action. Toxicity and exposure data support the assessment of organohalogen flame retardants as a class of chemicals under the Federal Hazardous Substances Act (FHSA). Properties shared by OFRs include their semivolatility and their ability to migrate from consumer products into house dust; exposure has been associated with a range of health impacts including reproductive impairment, neurological impacts, endocrine disruption, genotoxicity, cancer, and immune disorders. As a class, there is an adequate body of evidence supporting the conclusion that these chemicals have the “capacity to cause personal illness” and therefore meet the definition of “toxic” under the FHSA. Perhaps most egregiously, biomonitoring data have revealed that communities of color and low-income communities are disproportionately exposed to and bear high levels of flame retardant chemicals, adding to the cumulative chemical burden these communities are already experiencing, from increased fine particulate matter from power plants or refineries in their neighborhoods to higher levels of contaminants in their drinking water.

I’ve seen firsthand the persistence of the earliest form of flame retardants, polychlorinated biphenyls (PCBs), which still plague the sediment and water of the Hackensack Meadowlands just a couple of miles from where I grew up in New Jersey. One of my first jobs was working in the chemistry division of the Meadowlands Environmental Research Institute, where I spent my days extracting PCBs and organochlorine pesticides from the soil and sediment of the Meadowlands and analyzing that data. Despite being banned in 1977, these chemicals are still found in dangerously high amounts all over industrial hotspots of the country, and continue to bioaccumulate in a range of species. The ban of PCBs happened decades ago, and we are still managing the damaging impacts of the chemicals’ prevalence across the country. The next generation of these chemicals, organohalogen flame retardants, is inside our own homes in a range of products, thanks in large part to the disinformation campaign waged by special interests. The fact remains that the science does not support their continued use.

Seeing firsthand the persistence of PCBs in my local environment inspired me to use my scientific training to work to design or improve policies that minimize public health and environmental risks to prevent future scenarios of chemicals overburdening ecosystems and households. That is why I’m here today to ask the CPSC to act with urgency to grant this petition and further regulate OFRs to protect our children and future generations.

Thank you.

Western Wildfires Add to Disasters the Nation Faces: Will Congress Take Action?

UCS Blog - The Equation (text only) -

Glacier National Park Fire. Photo: Brett Timm, National Park Service

Record wildfires are now burning across a large swath of the Western US, even as the Southeast and Gulf coasts of the US are struggling to recover from Hurricanes Harvey and Irma. Yesterday the Forest Service announced that firefighting costs have already topped $2 billion in 2017, for the first time ever. Earlier this week Senators Daines and Tester separately called for action to address wildfire threats, the latest of similar bipartisan efforts. Congress needs to finally get wildfire funding and forest management legislation across the finish line without delay.

Raging wildfires

This year’s western wildfire season is already setting records. Thus far, nearly 49,000 wildfires have burned across over 8.3 million acres. For context, the 10-year average (2006-2016) for burned area was about 5.5 million acres a year. There are now 64 active fires across 10 states. More than 21,000 firefighters have been deployed to help contain these fires.

Some large fires have been burning for over a month and are not expected to be fully contained until mid-October. Some of the larger ones include:

Smoke from wildfires lingers over the west coast of the U.S. and Canada. NASA image courtesy Jeff Schmaltz, MODIS Rapid Response Team.

  • The Diamond Creek Fire in Washington, which has burned 109,000 acres since July 23rd. It is only 30 percent contained.
  • The Eclipse Complex Fire in California (which includes the Cedar Fire, the Oak Fire, and the Abney Fire), which has burned 96,529 acres since August 15th and is 25 percent contained.
  • The Chetco Bar Fire in Oregon, which has burned 185,920 acres since July 12th and is just 12 percent contained.
  • The Rice Ridge Fire in Montana, which has burned 155,900 acres since July 24th and is 40 percent contained.
  • The Highline Fire in Idaho, which has burned 84,619 acres since July 28th. This is one of several fires that have broken out in the Payette National Forest this summer.

And those wildfires don’t just affect the West. Images from  NOAA-NASA’s Suomi NPP satellite show how the smoke is being carried by the jet stream 3,000 miles across the country to the East Coast.

Disaster declarations have been made for a number of the largest wildfires, triggering FEMA Fire Management Assistance to help pay up to 75 percent of a state’s eligible firefighting costs.

Wildfire funding and forest management

As I’ve said in previous blog posts, intense wildfire seasons are severely straining the capacity of federal agencies to respond and forcing them to reallocate budgets away from actions, such as forest management, that could help limit future wildfire risks. Our policies and funding mechanisms need to catch up to the new realities of a climate-altered world.

The Forest Service has already announced that it will not have enough money to cover the rest of this fire season without borrowing from other areas of its budget that help reduce future wildfire risks—unless Congress acts.

In a statement yesterday that announced the Forest Service’s record-breaking spending on firefighting this year, Agriculture Secretary Sonny Perdue said:

Forest Service spending on fire suppression in recent years has gone from 15 percent of the budget to 55 percent – or maybe even more – which means we have to keep borrowing from funds that are intended for forest management. We end up having to hoard all of the money that is intended for fire prevention, because we’re afraid we’re going to need it to actually fight fires.  It means we can’t do the prescribed burning, harvesting, or insect control to prevent leaving a fuel load in the forest for future fires to feed on.  That’s wrong, and that’s no way to manage the Forest Service.

Hotter, drier conditions fueling dangerous wildfires

Hotter, drier conditions, exacerbated by climate change, are contributing to worsening wildfire seasons. Earlier this year, we saw an extreme heat wave accompanied by terrible wildfires in parts of the Southwest and California. In early September, record-breaking heat in California and the Pacific Northwest again contributed to a spate of large fires in the area, many of which are still burning even as fresh ones break out.

According to the latest US Drought Monitor report: “For the last 3 months, precipitation totals were among the lowest 2 percent on record in a broad area from most of Montana westward across central and northern Idaho, Washington, and the northern half of Oregon.”

Increased development in fire-prone areas and a lack of resources for maintaining healthy forests are also contributing to worsening risks.

A season of disasters raises the stakes to talk about climate change

Our nation’s ability to respond to multiple, simultaneous disasters is being seriously tested. And while many courageous men and women are doing an incredible job on the frontlines of these disasters—thank you to firefighters, linemen, national guard members and many, many other first responders—our policymakers are still falling short.

The immediate focus is appropriately on disaster response. This is also the right time to ensure that we are implementing smart, forward-looking recovery and preparedness policies, and funding them adequately.

This is also exactly when we should be talking about how climate change is exacerbating risks to people and property. How else can we help ensure we’re doing a better job of protecting people from future disasters?

Congress must act on wildfires

This year’s wildfire season is a fresh reminder of why Congress needs to act expeditiously to fund firefighting and forest management robustly. The stakes are rising as climate change exacerbates the risks of costly and dangerous fires and we can’t afford to put off action for yet another year.

In a recent floor speech, Senator Wyden said:

“It feels like we’ve been at this longer than the Trojan War. The bottom line is the West cannot wait any longer for Congress to send them some help and repair—for the long-term—this broken system that shortchanges prevention and adds fuel to these raging wildfires.”

The good news is that there is widespread bipartisan support for a wildfire funding fix, as this letter to Majority and Minority Senate leaders shows.  Now let’s get this done.

Truck and Bus Legislation to Watch in California

UCS Blog - The Equation (text only) -

Today’s the last day of the California legislative session. It gets hectic in Sacramento this time of year, but here are two bills I’m paying attention to that could help reduce air pollution and global warming emissions from heavy-duty vehicles.

As a reminder, heavy-duty vehicles make up just 7 percent of vehicles in California but disproportionately contribute to global warming emissions and air pollution, accounting for 20 percent of global warming emissions from the transportation sector, for example. And as we work to improve public health, we must also remember that communities of color are disproportionately exposed to pollution through proximity to roadways, ports, warehouses, and other sources of emissions.

Cleaning up state-owned trucks and buses

That’s what Assembly Bill 739 by Assembly member Ed Chau would do. This bill sets a target for zero-emission trucks and buses purchased by the state: 15 percent of purchases made in 2026-2030 and 30 percent of purchases made in 2031 and later. This is an achievable target with eight years’ worth of technology development and agency planning to enable its implementation.

The target would apply to vehicles with gross vehicle weight ratings (the maximum weight at which a fully loaded vehicle is rated to operate) above 19,000 lbs. For a sense of scale, think transit buses, large U-Haul-type trucks, garbage trucks, etc. The bill only applies to state-owned vehicles, which includes everything from buses at the Cal State universities to work trucks operated by the Department of Parks and Recreation and Caltrans. The purchase goals do not apply to vehicles with special performance requirements necessary for public safety, such as fire trucks operated by the Office of Emergency Services.

This bill walks the talk. There’s been a lot of planning and workshops on how to get zero-emission trucks and buses on the road in California, from the Sustainable Freight Action Plan to standards for trucks, buses, and airport shuttles. This bill holds the state fleet to a similar standard.

It is important to note that the 15 percent and 30 percent targets in this bill apply only to purchases, not the overall composition of the state’s fleet. Suppose a given type of vehicle typically lasts 14 years. This means roughly 7 percent of those vehicles are turned over each year. A 15 percent purchase target in this case corresponds to 1 percent of the total fleet (15 percent of 7 percent).
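A minimal sketch of that arithmetic (the 14-year lifetime is the illustrative assumption from the paragraph above, not a figure from the bill):

    # Sketch: translating a purchase target into annual fleet share.
    lifetime_years = 14                    # assumed average vehicle life
    annual_turnover = 1 / lifetime_years   # ~7% of the fleet replaced per year
    purchase_target = 0.15                 # AB 739 target for 2026-2030 purchases
    print(f"{purchase_target * annual_turnover:.1%} of the total fleet per year")  # ~1.1%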

There are many zero-emission heavy-duty vehicles already commercially available today and more on the way. Cummins recently unveiled an electric truck and Tesla will reveal its electric truck with a 200-300 mile range at the end of next month. Many other major companies have also signaled their interest in zero-emission trucks, including Daimler, Peterbilt, and Toyota.

Large scale funding for clean vehicles

That’s what recent amendments to Assembly Bill 134 (the budget bill) would do. The legislation proposes $895 million in funding for clean vehicles using revenue from the state’s cap and trade program. If that sounds like a lot of money, it is, compared to previous years ($680 million for the last four years combined). But it’s not, compared to the level of action needed for the state to meet its air quality and climate goals.

Oversubscribed incentive funding programs that offset the upfront purchase cost of electric trucks, buses, and cars for businesses and consumers receive much-needed funding in this bill, including $180 million for the Hybrid and Zero-Emission Truck and Bus Voucher Incentive Project (HVIP). This program provides rebates for medium- and heavy-duty vehicles, with zero-emission trucks and buses receiving larger incentives than combustion technologies. The $35 million in HVIP designated for zero-emission transit buses alone could allow half of the roughly 700 buses purchased in California over the next year to be battery electric vehicles.

The budget bill also includes $140 million for the Clean Vehicle Rebate Program (CVRP), which provides consumers with rebates for plug-in hybrid electric, battery electric, and fuel cell electric passenger cars. This program has helped put over 200,000 clean cars on the road in California since 2010. There’s a lot more in the budget bill for clean vehicles ($575 million), but the CVRP and HVIP programs are ones UCS has been especially involved with.

These two bills are very different in scale – AB 739 applying to a fraction of state-owned vehicles and the budget bill providing incentives to businesses and consumers for vehicles across the light-, medium-, and heavy-duty classes. But to reach the end goal of clean air for all Californians and dramatically reduced climate emissions, we need actions that span all scales.

A Caltrans diesel dump truck. Photos: Jeff Turner/CC BY 2.0 (Flickr); California Department of Transportation

North Korea’s Sept. 15 Missile Launch over Japan

UCS Blog - All Things Nuclear (text only) -

North Korea conducted another missile test at 6:30 am September 15 Korean time (early evening on September 14 in the US). Like the August 28 test, this test appears to have been a Hwasong-12 missile launched from a site near the Pyongyang airport. The missile followed a standard trajectory—rather than the highly lofted trajectories North Korea used earlier this year—and it flew over part of the northern Japanese island of Hokkaido (Fig. 1).

Fig. 1. Approximate path of the launch.

The missile reportedly flew 3,700 kilometers (km) (2,300 miles) and reached a maximum altitude of 770 km (480 miles). It was at an altitude of 650 to 700 km (400 to 430 miles) when it passed over Hokkaido (Fig. 2).

Fig. 2. The parts of Hokkaido the missile flew over lie about 1,250 to 1,500 km (780-930 miles) from the missile launch point.

The range of this test was significant since North Korea demonstrated that it could reach Guam with this missile, although the payload the missile was carrying is not known. Guam lies 3,400 km from North Korea, and Pyongyang has talked about it as a target because of the presence of US forces at Andersen Air Force Base.

This missile very likely has low enough accuracy that it would be difficult for North Korea to use it to destroy this base, even if the missile were carrying a high-yield warhead. Two significant sources of inaccuracy for an early generation missile like the Hwasong-12 are guidance and control errors early in flight during boost phase, and reentry errors due to the warhead passing through the atmosphere late in flight. I estimate the inaccuracy of the Hwasong-12 flown to this range is likely 5 to 10 km, although possibly larger.

Even assuming the missile carried a 150 kiloton warhead, which may be the yield of North Korea’s recent nuclear test, a missile of this inaccuracy would still have well under a 10% chance of destroying the air base. (For experts: This estimate assumes the air base would have to fall within the warhead’s 5 psi air blast radius, which is 3.7 km, and that the CEP is 5 to 10 km.)
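For readers who want to reproduce this kind of estimate, here is a minimal sketch using the standard circular-normal approximation for the probability that a warhead lands within a radius R of its aim point, P = 1 - 2^(-(R/CEP)^2). Treating the base as a point target under the 3.7 km blast radius makes these figures upper bounds; destroying an extended target like an air base requires a closer impact, so the true probability is lower:

    # Sketch: chance of impact within radius R for a given CEP, using the
    # circular-normal approximation P = 1 - 2**(-(R/CEP)**2).
    # These are point-target upper bounds; the base's extent lowers the odds.
    def p_within(radius_km, cep_km):
        return 1 - 2 ** (-(radius_km / cep_km) ** 2)

    for cep_km in (5, 10):
        print(f"CEP {cep_km} km: {p_within(3.7, cep_km):.0%}")
    # CEP 5 km: 32%; CEP 10 km: 9% -- before accounting for target size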

Heating of the reentry vehicle

As I’ve done with some previous tests, I looked at how the heating experienced by the reentry vehicle (RV) on this test compares to what the same RV would experience on a 10,000 km-range missile flown on a standard, minimum-energy trajectory (MET). My previous calculations were done for North Korea’s highly lofted trajectories, which tended to give high heating rates but relatively short heating times.

Table 1 shows that the duration of heating (τ) would be roughly the same in the two cases. However, not surprisingly given the difference in ranges and therefore in reentry speeds, the maximum heating rate (q) and the total heat absorbed (Q) by the RV on this trajectory are only about half those of the 10,000 km trajectory.

Table 1. A comparison of RV heating on the September 15 missile test and on a 10,000 km-range trajectory, assuming both missiles have the same RV and payload. A discussion of these quantities can be found in the earlier post.
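For intuition about what drives these quantities, one common stagnation-point approximation (the Sutton-Graves relation; a rough sketch, not necessarily the exact model behind the table) ties the heating rate to local air density and the RV’s speed:

    q \propto \sqrt{\rho}\, v^{3}, \qquad Q = \int_0^{\tau} q \, dt

Because a 3,700 km trajectory reenters substantially slower than a 10,000 km one, and q scales with the cube of speed, both the peak heating rate and the integrated heat load come out well below those of the longer-range case.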

So while it seems likely that North Korea can develop a heat shield that would be sufficient for a 10,000 km range missile, this test does not demonstrate that.

Why Does the Cost of Offshore Wind Keep Dropping?

UCS Blog - The Equation (text only) -

The latest costs for new offshore wind farms are mighty impressive. How come offshore wind costs just keep going down?

Records were meant to be broken

The UK just held its latest auction for power from future projects based on a range of low-carbon technologies beyond the usual suspects like solar and land-based wind.*

The UK auction results were quite something: The winning bids included not one but two offshore wind projects whose developers agreed to a contract price of £57.50 per megawatt-hour (2012 prices)—around 7.7 US cents per kilowatt-hour. That’s half the cost for offshore wind projects in a round of bidding in the UK just two years ago, and within striking distance of—or lower than—the cost of almost any source of new “conventional” power.
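(If you want to check the currency conversion, here is a minimal sketch; the exchange rate is an assumption, roughly where the pound traded in 2017, while the contract price is quoted in 2012 prices.)

    # Sketch: converting the headline strike price to US cents per kWh.
    price_gbp_per_mwh = 57.50
    usd_per_gbp = 1.33                  # assumed exchange rate, not from the post
    cents_per_kwh = price_gbp_per_mwh * usd_per_gbp / 1000 * 100
    print(f"{cents_per_kwh:.1f} US cents/kWh")  # ~7.6, close to the 7.7 quoted above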

So how does this happen? Why does the cost of offshore wind keep getting lower, and so quickly?

Bigger, stronger, faster

Those latest record breakers, the proposed Moray and Hornsea Two offshore wind projects, offer some strong clues about possible paths to lower costs:

  • Larger turbines. The two new projects might use 8-megawatt wind turbines, as did one project that just came online. That’s a big step up from the standard of just a few years ago. And larger turbines are likely on the way (and maybe even much larger ones). Larger turbines mean more power from each installation—each footing, each tower, each trip to install pieces of it, and then to maintain it.
  • Larger projects. Moray will be a really impressive 950 megawatts. Hornsea Two will be a stunning 1386 megawatts—likely the largest offshore wind project in the world when it goes online (and enough to power more than 1.4 million UK homes; see the rough check after this list). Larger projects mean likely economies of scale on lots of pieces, making better use of the installation crews and equipment, covering more ground (or water) with given maintenance personnel, and spreading all the project/transaction costs over more megawatts.
  • Faster project timelines. Both of these new projects are supposed to come online by 2022/23, which is amazingly quick (and not just by US standards). Faster timelines mean less zero-revenue time before the blades start turning and the electrons start flowing (and the dollars/pounds start coming in).
  • Lots of offshore wind projects in place already. The latest projects will join a national mix that includes 5100 megawatts of offshore wind providing 5% of the UK’s electricity. Plenty of experience offshore means there’s a developed and growing industry in the UK and much of the necessary infrastructure for manufacturing components, moving them into place, and getting the electricity to shore.
  • Comfortable investors. With all the UK experience to date, investors know what they’re getting into. The UK government, offering these contracts, is about as solid a guarantor for the revenue stream as investors could ever hope to see. Comfortable investors = lower financing costs = lower prices for consumers.

Lots of tailwinds for offshore wind. So what might be pushing things in the other direction—counterbalancing (partly) all those cost gains?

Two have to do with project sites. As near-shore sites get taken, projects end up farther from land, meaning more shipping time to get personnel and materials to the project site, longer power lines to get the electrons back to land, and higher associated costs. New sites might also be in deeper water, which means higher tower costs (or even floating turbines!).

UK wind farms and instantaneous output (Source: The Crown Estate).

On the plus side, better wind speeds are also a factor in cutting offshore wind costs, and being farther out can mean even better winds.

The UK doesn’t seem to be in danger of running out of suitable sites, in any case, and technologies seem to be evolving to keep up with changing site characteristics.

Meanwhile, back in the U.S. of A.

What does this latest offshore wind news mean for those of us on this side of the pond? The biggest takeaway, maybe, is that we can do more when we do more.

As UCS and plenty of others have argued, we really benefit by offering the US offshore wind industry a clear path not just to one or two projects, but to the robust levels of installation and clean energy that we know we need. That long-term outlook can allow companies to make the kind of investments (and attract the investors) needed to build not just projects, but an industry.

And with each project, it becomes easier to envision the next one. Massachusetts has structured its 1600-megawatt offshore wind requirement with multiple tranches to take advantage of this effect. The first round, maybe 400 megawatts (for which bids are currently being prepared), is likely to pave the way for a cheaper second round, and a third round that’s cheaper still.

New York is offering a path to even larger scale, with its recent commitment to 2400 megawatts of offshore wind.

As the experience in the UK and elsewhere is showing, more and bigger projects, larger overall targets, and greater clarity for the industry can lead to economies of scale, more local manufacturing and stronger local infrastructure, and more comfortable investors for US markets.

And that can all add up to more cost-effective offshore wind for us all.

*Can I just say how great it is to be in a place where solar and wind are “usual suspects”? We are definitely making progress.

The Good, Bad, and Ugly Self-Driving Vehicle Policy

UCS Blog - The Equation (text only) -

A Waymo self-driving car on the road in Mountain View, CA, making a left turn. CC BY 2.0 (Wikimedia Commons).

Automakers and their advocates have been busy in the halls of Congress and the Department of Transportation. The U.S. House of Representatives passed legislation that would make it easier for self-driving cars to hit the road, the Department of Transportation replaced an Obama-era self-driving vehicle policy with a more industry-friendly approach, and the Senate held a hearing on a bill that would also speed the deployment of self-driving vehicles, including trucks.

The Good News

The bill that passed the House and the bill being considered in the Senate include some positive provisions. For example, each establishes an expert committee tasked with identifying how self-driving vehicles could affect: mobility for the disabled and elderly, labor and employment issues, cybersecurity, the protection of consumer privacy, vehicle safety, and emissions and the environment. Establishing a structure for a Department of Transportation-led committee to examine these issues is important for informing future self-driving vehicle policy that can help this technology create positive outcomes and avoid potential negative consequences.

Both bills also draw a brighter line between federal and state authority related to vehicle safety. The way this division works for regular cars today is that the federal government regulates the vehicle and states regulate the drivers. But this distinction doesn't quite work with self-driving vehicles, because who is the driver? The person sitting in the driver's seat, eating pita chips and watching Netflix while the car drives itself? Or is it the vehicle itself?

To better clarify the distinction between federal and state authority, both the House and Senate bills give control over the design, construction, and performance of self-driving vehicles and self-driving technology to the federal government. States retain their right to enact laws related to how these vehicles are registered, who can use them, and how they interact with state or local roads and infrastructure. However, states would be preempted from enacting any law that can be read to be an “unreasonable” restriction on the design, construction, or performance of a self-driving vehicle.

Self-driving vehicles are set to hit the road sooner than you may think. Companies like Google, Uber, Ford, and Tesla are all rushing to get the best self-driving vehicle on the market. Image via: https://commons.wikimedia.org/wiki/File:Driving_Google_Self-Driving_Car.jpg

The last bit of good news is that the bills require automakers to submit detailed cybersecurity and safety evaluation reports to the Department of Transportation. The bills also note the need to inform consumers of the capabilities and limitations of self-driving vehicle systems, so that users better know when the system can be engaged or needs to be turned off. In fact, the National Transportation Safety Board recently found that Tesla's Autopilot lacks appropriate safeguards to prevent drivers from using it improperly.

The Bad News

It wouldn't be federal legislation if there weren't something bad tucked in, and both the House and Senate self-driving vehicle bills have some potentially dangerous provisions.

Both bills allow self-driving vehicles to be granted exemptions from federal motor vehicle safety standards (FMVSS). Any vehicle, whether self-driving or not, can be granted an exemption from FMVSS, and the law currently allows up to 2,500 exemptions per manufacturer per year.

Self-driving cars will surely need FMVSS exemptions. They might not have a steering wheel, for example, so they couldn’t possibly comply with the FMVSS for steering wheels and, as a result, couldn’t be tested or sold in the U.S. The whole FMVSS playbook will likely need to be updated by the Department of Transportation to respond to self-driving vehicle technology. But before then, self-driving vehicle makers will look for exemptions to sell their product.

The problem is the number of exemptions that the House and Senate bills are offering self-driving vehicle manufacturers. Both bills would grant a single manufacturer up to 100,000 exemptions from FMVSS after a couple of years. (The Senate bill starts with 50,000 in year 1, for example.) This means that an automaker could make a self-driving vehicle and exempt it from any safety regulation that would "prevent the manufacturer from selling a motor vehicle with an overall safety level at least equal to the overall safety level of nonexempt vehicles." Given that self-driving vehicles will likely have similar, if not better, safety ratings than regular vehicles, I could see this language being read broadly enough for the Department of Transportation to approve most exemption requests.

Exempting self-driving cars from FMVSS for testing purposes makes sense, but the quantity of exemptions allowed in the House and Senate bills is excessive. Once self-driving cars are on the road, there’s no putting the self-driving genie back in the bottle. Transportation analysts, academics, the government, and the public need to better understand the safety, congestion, labor, and other impacts that self-driving vehicles will create before automakers get a free pass to each put 100,000 self-driving vehicles on the road.

Keeping the number of FMVSS exemptions closer to the current cap of 2,500 per manufacturer would introduce self-driving vehicles at a pace that lets us understand how they function in actual driving conditions, not just on the test track (or test city). In addition, several groups and two former heads of the National Highway Traffic Safety Administration have expressed skepticism that the agency even has the resources to process additional FMVSS exemptions or conduct adequate oversight in this area.

The Ugly News

In 2016, the Obama-led Department of Transportation put together a thoughtful, lengthy memo that detailed where the Department was headed on self-driving vehicle regulation. Earlier this week, the Department tossed that out the window and replaced it with a streamlined set of voluntary guidelines that self-driving companies should seek to follow.

Like the Obama-era guidance, nothing in the new federal guidance is mandatory. But unlike the previous guidance, the new guidance isn’t very specific. Consumer advocates like Consumer Watchdog and Consumers Union lambasted this approach as being a handout for industry, and they have a point. The guidance “encourages” the industry to do a lot of things, like collect data on when self-driving vehicles malfunction or crash, or submit a “voluntary” safety self-assessment that isn’t subject to any sort of federal approval.

Overall, the tone and vagueness of the document, combined with the choice to just throw out, and not build upon, the previous self-driving vehicle guidance puts this move by the Department of Transportation squarely in the ugly category.

Why Did Hurricane Irma Leave so Many People in the Dark?

UCS Blog - The Equation (text only) -

The National Hurricane Center issued its final advisory for Irma on Monday night, September 11, but for millions of people left in the storm’s wake, the disaster remains far from over. One stark reminder? Power outages. Everywhere.

Across the Caribbean, through the entirety of Florida, up into Georgia, and spreading into the Carolinas, Irma ripped power from the people.

Seventeen million people, at its peak.

Which means 17 million people without air conditioners in the sweltering heat and humidity, 17 million people without refrigerators keeping food and medicine safe, 17 million people without lights at home or along the roads, 17 million people without internet to stay informed, 17 million people suffering business interruptions and loss, and 17 million people without the assurance of critical infrastructure dependent on power—first responders, hospitals, drinking water, sewage—being able to keep their operations going. We’ve already seen the tragedy that can occur when these systems fail, with the loss of eight lives at a nursing home unable to cope, powerless in the oppressive Florida heat.

Following the herculean round-the-clock efforts of the largest assembly of restoration workers in history, the lights are starting to flicker back on across the Southeast. But questions about these outages—how many, why, for how long, and critically, could it have gone better—abound. Here's a quick run-down of what we know, what we don't, and what we'll be looking to see in the days, weeks, and months to come.

How big was this power outage and how long will it last?

Current estimates place the number of people impacted by outages from Irma at more than 16 million across the southeastern US. When you add in outages across the Caribbean, where homes and infrastructure have seen even more severe damage, the count climbs to 17 million. It will take some time to get final official numbers, but the rough cut already confirms a mind-bogglingly high number of people left in the dark.

Just how high? When we compare customer outage counts from some major recent storms (customers are not the same as people: utilities tally each account as one "customer," but an account can represent multiple people living in the home or working in the business behind the meter), Irma's preliminary 8.956 million across five states, Puerto Rico, and the US Virgin Islands looks like it will probably top the list (a quick code sketch of this comparison follows the list):

  • Sandy (2012): 8.66 million customers
  • Irene (2011): 6.69 million customers
  • Gustav (2008): 1.1 million customers
  • Ike (2008): 3.9 million customers
  • Katrina (2005): 2.7 million customers
  • Wilma (2005): 3.5 million customers
  • Rita (2005): 1.5 million customers
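Because the customer-versus-people distinction matters when comparing these numbers, here's a minimal sketch of the comparison above, plus a rough translation of Irma's customer count into people. The multiplier of about 1.9 people per account is an assumed illustrative value, not an official conversion factor.

    # Peak customer outages (millions) from the storms listed above.
    peak_outages = {
        "Irma (2017)":    8.956,
        "Sandy (2012)":   8.66,
        "Irene (2011)":   6.69,
        "Ike (2008)":     3.9,
        "Wilma (2005)":   3.5,
        "Katrina (2005)": 2.7,
        "Rita (2005)":    1.5,
        "Gustav (2008)":  1.1,
    }
    for storm, millions in sorted(peak_outages.items(),
                                  key=lambda kv: kv[1], reverse=True):
        print(f"{storm:<15} {millions:>6.2f} M customers")

    # A "customer" is an account (a home or business), so the number of
    # people affected is larger. Assumed multiplier for illustration: 1.9.
    print(f"Irma, people affected: ~{8.956 * 1.9:.0f} million")  # ~17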

But here’s a critical point. In many ways, the duration of an outage determines the severity of its consequences. Lights out for a night? For most: an inconvenience. Lights out for several days, a week, or even longer? The triggering of a cascade of disastrous and potentially life-threatening consequences. And in a comparison of the 2005 and 2008 hurricane seasons below, we can see clearly that across storms, the initial magnitude of peak outages does not necessarily align with the subsequent duration borne by large numbers of people:

A comparison of peak outages, and outage durations, from a series of 2005 and 2008 hurricanes. Credit: DOE OE/ISER.
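One way to make the peak-versus-duration point concrete is to compare total customer-hours of outage, a rough measure of severity. Here's a small sketch; the two storm profiles are invented for illustration and are not data from the chart above.

    # Total customer-hours of outage for two hypothetical storms: one
    # with a high peak but fast restoration, one with a lower peak but
    # a long tail. Each list gives customers without power (millions)
    # on successive days.
    fast_restore = [8.0, 4.0, 1.0, 0.2]             # high peak, quick fix
    slow_restore = [3.0, 2.8, 2.5, 2.0, 1.5, 1.0]   # lower peak, long tail

    def customer_hours(daily_millions):
        return sum(daily_millions) * 24  # million customer-hours

    print(round(customer_hours(fast_restore), 1))  # 316.8
    print(round(customer_hours(slow_restore), 1))  # 307.2 -- comparable,
                                                   # despite a far lower peak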

Right now, we know that the peak number of customers experiencing outages from Irma tops those tallied in the storms above, but we don't yet know how long all of these outages will last. Already utilities have returned millions of people to power across the Southeast—Wednesday evening's situation report put the total number without power at over 4.2 million, down steeply from its peak yet still high—and are predicting that many more will be restored by the end of this weekend. Still, the utilities have flagged that they expect some segment of customers to remain without power for yet another week, or a full two weeks after the storm initially blew through.

One thing to watch? Who's left in the dark the longest. The order in which customers get returned to power can have life-threatening consequences, and tragically, lives have already been lost to these outages. As coordination between utilities and local governments grows, it is imperative not only to prioritize critical infrastructure but also to identify the populations most in need—including the elderly, people with disabilities, and low-income households—to help ensure prioritized and equitable attention for those who are least able to cope with the aftermath of severe weather events.

What caused these widespread outages?

Severe storms can present many and varied threats to the electricity system, from high winds, trees, and flying debris taking down power lines, to storm surge and inland flooding laying siege to substations, transformers, buried power lines, and even power plants. And in a centralized grid, where electricity from large power plants gets routed along transmission and distribution lines until it finally reaches a customer at the end of the wire, outages occurring along any part of the system can ripple down the line.
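To see why a failure can ripple down the line in a centralized, radial grid, here's a toy model in Python. The layout and customer counts are invented for illustration; real grids are far more complex and often have some ability to reroute power.

    # Toy radial grid: each component feeds everything downstream of it,
    # so a fault anywhere cuts power to all customers below that point.
    feeds = {
        "plant":        ["transmission"],
        "transmission": ["substation_A", "substation_B"],
        "substation_A": ["feeder_1", "feeder_2"],
        "substation_B": ["feeder_3"],
    }
    customers = {"feeder_1": 1200, "feeder_2": 800, "feeder_3": 2500}

    def customers_lost(failed_component):
        """Count customers downstream of a failed component."""
        total = customers.get(failed_component, 0)
        for child in feeds.get(failed_component, []):
            total += customers_lost(child)
        return total

    print(customers_lost("feeder_2"))      # 800  -- a local outage
    print(customers_lost("substation_A"))  # 2000 -- ripples to two feeders
    print(customers_lost("transmission"))  # 4500 -- everyone downstream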

We know from a previous UCS analysis of the electricity grids serving southeastern Florida and the Charleston, South Carolina Lowcountry that critical electrical infrastructure is located in areas highly susceptible to flooding from storm surge. However, Irma ended up sparing some places significant storm surge, yet still the power went out. Why?

Wind, for one. Heavy winds can snap poles, send trees crashing onto wires, loft dangerous flying debris, and otherwise rip lines from homes and businesses. But flooding almost certainly contributed in places as well, especially where storm surge and rainfall were worse. And finally, in some places utilities themselves may have caused the outages by pre-emptively cutting power to parts of the grid to better protect infrastructure at risk of inundation.

Hurricane Irma restoration in Fort Lauderdale, FL, on Sept. 11, 2017. Credit: Florida Power and Light.

Depending on the causes of failure, and whether there existed many scattered problems versus several centralized disturbances, the length of repairs—and thus the time until restoration—can vary.

We will be waiting to review the utilities' system assessments following the restoration effort to see, in particular, where the major vulnerabilities in the system were concentrated. That can help us understand what went wrong, what went right, and where more attention must be focused in the future—so stay tuned for updates here.

Utilities in Florida invested billions to storm-harden the grid. Do these outages mean it was a waste?

Following the catastrophic 2004 and 2005 hurricane seasons, Florida took steps to require its utilities to more closely consider storm preparedness. This resulted in several new requests from the state’s Public Service Commission, including a requirement for utilities to adhere to a vegetation management plan (i.e., requiring diligent, intentional tree-trimming schedules), and a requirement that utilities present an annual accounting of storm hardening efforts across their systems.

In response, Florida Power & Light (FPL), the largest utility in the state, has invested on the order of $3 billion since then, with other utilities in the state following suit. In FPL’s case, this has meant replacing thousands of wooden poles with concrete, burying dozens of main power lines, upgrading hundreds of substations with flood-monitoring equipment to pre-emptively shut off power (and thus avoid far worse outcomes than if such equipment were inundated while energized), and installing smart-grid devices throughout the system to help pull back the curtain on where outages are and how to work around them.
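That flood-monitoring equipment essentially implements a threshold rule: de-energize gear before rising water reaches it, because flooding a live substation causes far worse damage than a controlled outage. Here's a minimal sketch of that logic; the heights, margin, and function names are invented for illustration, and this is not FPL's actual control code.

    # Illustrative pre-emptive shutoff rule for a flood-monitored substation.
    SHUTOFF_MARGIN_FT = 1.0  # assumed safety margin below live equipment

    def should_deenergize(water_level_ft, equipment_height_ft):
        """De-energize before rising water reaches energized equipment."""
        return water_level_ft >= equipment_height_ft - SHUTOFF_MARGIN_FT

    # Example: equipment sits 6 ft above grade; water is at 5.2 ft and rising.
    if should_deenergize(water_level_ft=5.2, equipment_height_ft=6.0):
        print("Pre-emptively de-energizing substation")  # triggers here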

So how, then, do we square these $3 billion in investments with the fact that over the course of this storm, a staggering 4.45 million out of 4.9 million FPL customers (roughly 90 percent) were affected by outages? Were all the investments, borne on the backs of ratepayers, in vain?

Almost certainly not. For one, where FPL’s investments in grid hardening overlapped with increases in system resilience—or the development of a grid that is flexible, responds to challenges, and enables quick recoveries—these upgrades can help the utility restore power faster. That’s critical for lessening the impact of outages, especially for vulnerable populations, even if it doesn’t lessen the initial scope.

Still, there will be lots to consider after the restoration process is over, and once we have had a chance to see where outages persisted, and why. We will also then be able to study how this restoration evolved compared to previous efforts, and where attention should be focused in the future. At the same time, we already know that utilities have been insufficiently factoring climate change into their current infrastructure plans, leaving today’s investments vulnerable to tomorrow’s conditions. And that, we know, must change.

Is this the future we must accept, or are there things we know we can do better?

In addition to tragic loss of life and property, Hurricane Irma has also forced a reckoning with a new round of questions about storm preparedness in a warming world. On the one hand, it is impractical to perfectly protect our electricity infrastructure against all possible power outage threats, and though it's too soon to tell the degree to which the widespread power outages following Irma could have been avoided, it is reasonable to expect that a storm this large would have caused at least some. (And it's worth noting that Irma itself could have been far more devastating to parts of the coastal grid had the storm's path not changed—the performance here should not be taken as evidence of the worst that can happen, as we know a future storm could lay bare other paths of exposure.)

At the same time, we know that prolonged power outages can have catastrophic consequences. In particular, the critical infrastructure upon which we all depend, and the vulnerable populations for whom lasting outages can have the most severe effects, simply cannot be left to chance. We should not, cannot, accept that lives will be lost because the power stayed out.

So where do we go from here?

We put a focus on resilience. Now this is a big conversation, and one demanding attention on many fronts, not just the electricity sector. Because yes, it’s about improving the resilience of the power grid—about which we’ll be writing more in the time to come—but it’s also about advancing complementary measures that get people out of harm’s way to begin with. It’s about climate change, and equity, and infrastructure, and planning—it’s all about the future, and how we best position ourselves to face it.

And that means looking forward, not back. So in the time ahead, we'll be watching how the federal government, states, and utilities move forward, and doing our best to make sure that, because tomorrow's storms won't look like today's, all parties are preparing for the future, not the past.
