UCS Blog - The Equation (text only)

Trump Administration Delays Protections for Construction and Shipyard Workers, Weakens Beryllium Rule

More bad news for workers coming from the Trump administration. Last Friday (June 23), the Occupational Safety and Health Administration (OSHA) announced its proposal to “modify” (read “weaken”) protections for workers exposed to beryllium in construction and shipyards.

Beryllium is a very dangerous material. It’s a carcinogen and the cause of chronic beryllium disease, a devastating illness. There’s no real rescue from this slow, incurable, and often fatal lung disease.

While it leaves in place the permissible exposure limit (PEL) for beryllium (0.2ug/m3), OSHA now proposes to eliminate the “ancillary provisions” of the rule that would extend certain protections to construction and shipyard workers. Protections like exposure monitoring, a written exposure control plan, personal protective equipment, and medical surveillance. These “ancillary provisions” are actually basic public health protections for workers dealing with a really hazardous material.

An unwarranted delay

In March, I wrote and bemoaned OSHA’s two-month delay in implementing its new protective standards for workers exposed to beryllium—an unwarranted delay following decades of work and solid scientific evidence. A delay in implementing standards that lowered the PEL and also, importantly, extended needed protections to workers in shipyards and construction. A delay that guaranteed two additional months of serious, ongoing risk to exposed workers.

Those two months will now stretch for who knows how long into the future for shipyard and construction workers. In its press release last Friday, OSHA stated that “Representatives of the shipyards and construction industries, as well as members of Congress, raised concerns that they had not had a meaningful opportunity to comment on the application of the rule to their industries when the rule was developed in 2015-16.” Seriously. Even though in a lengthy (very lengthy) rule-making process, OSHA specifically solicited stakeholder comments on whether its final beryllium rule should extend protections to workers in these two industries. Seems like enough time, no?

The latest blow to worker and public health protections

The OSHA press release also noted that “it has information suggesting that requiring the ancillary provisions broadly may not improve worker protection…” OSHA didn’t cite any evidence, except perhaps mentioning the “concern” of the regulated industry.

Public health professionals (and workers) would beg to differ on the value of measuring exposure levels, requiring personal protective equipment, and providing medical surveillance of exposed workers.

This new proposal is just the latest blow to worker and public health protections coming out of the Trump administration—see, for example, here, here, and here. The proposal is scheduled to be published tomorrow in the Federal Register, which will open the public comment period.

Let’s use the opportunity to remind OSHA—and to send a message to all of our regulatory agencies—that their FIRST priority is to protect the health, safety, and security of the American people. Private interests must not trump the public interest.

American Prosperity Depends on International Science: Our Border Policy Should Reflect That

At first, the new ‘laptop ban’ sounded like a minor nuisance. The ban, part of a recent executive order, prohibits large electronics as carry-on items on flights to the U.S. from eight countries in northern Africa and the Middle East. Only when I saw a Facebook outburst from my American colleague in Africa did it become clear how even a small encumbrance like this can deal a devastating blow to science.

This travel restriction is one of several that could quickly cripple scientific and technological progress in the US. That is bad for the US economy and the livelihood of its citizens. Here’s why.

Christine is a climate change scientist working in Kenya. She posted to Facebook:

“This latest [Executive Order] just eliminated four out of seven of my major routes home from Nairobi. As a professional scientist, I cannot travel without my laptop. I see devastating impacts on collaborations with professionals from the targeted countries, and those who live in Africa and Asia and use these airports to connect to the U.K. and the U.S.”

Sure, technically she could check her laptop, but would you abandon yours to the potential of being rained on, crushed, stolen, or “examined” by security agents, risking the leakage of personal data and the loss of your primary tool? Obstacles like this, combined with sweeping immigration bans, will steadily reduce our scientific connectivity to the world.

The position of the US as a frontrunner in science is sustained by engagement with the international scientific community. We need foreign partnerships because societies across the globe face a suite of common challenges. Many are interconnected by economies of trade, others by planetary physics. And many of these challenges require science-based solutions that are not resolvable in national isolation. Three examples are climate change, emerging technologies, and sustainable food production.

Climate change

Climate change is a global phenomenon, but the responses of some regions will have greater impacts on future climate than others. For example, tropical forest biology is a driver of atmospheric circulation. The US Department of Energy funds US scientists to travel abroad for tropical research, because biological responses to climate change there have the potential to alter weather, and thereby energy security, in the US.

We need to work with scientists around the world to learn about climate migration and displacement from sea level rise and other climate impacts. Photo: Jason Evans/Georgia Sea Grant

The human response to climate change is another shared problem. The US is far from immune to population displacement by future sea level rise. We would be smart to work with social scientists abroad to learn how climate migrations are being managed elsewhere.

But we cannot simply travel abroad and study at will. Doing my dissertation work in Brazil, I learned that international partnerships are carefully cultivated through fair, reciprocal exchange. If we hassle our foreign partners to hand over their social media passwords upon entry to the US, how welcoming will they be to us?

Emerging technologies

China and India are now two of the world’s leaders in renewable energy investment. Saudi Arabia and Morocco are funding ambitious large-scale solar programs. Each country will be innovating to overcome the significant challenges of production, storage, and distribution that an energy market dominated by renewables faces. The latter two countries have air hubs on the laptop ban list.

As Africa’s tech workforce grows in numbers and ability, other useful technologies are emerging such as mobile-phone banking, and nimble cloud-computing services. These technologies are likely to become imports to the US, just as the crisis-mapping software Ushahidi, originating in Kenya, has been adopted for disaster relief coordination and elections monitoring around the world. It will be difficult to import Africa’s experts to develop similar technologies here if we eliminate skilled worker visas.

Sustainable food

As we deal with drought here in the US, we have a lot we can learn from scientists abroad. Photo: NASA JPL.

Much of our imported food production depends on fossil water—water in aquifers that will not be replenished in our lifetimes. That includes sources in Mexico, our dominant international supplier. Determining the longevity of deep reservoirs is a hard scientific problem. Through international research collaborations, we can aid in predicting the sustainability of water sources on which our food supplies depend, and help develop appropriate farming practices. We can again look to Africa for expertise, where indigenous superfoods are gaining popularity as vegetables that are more nutritious and require less water than our staple European brassicas.

Here again, US scientists may be reluctant to cross the border for collaborative research with Latin American suppliers if we are subject to unlimited laptop and cellphone searches upon re-entry. And Mexican industries may not welcome our scientists if our leaders continue to paint the country in an unfavorable light.

International collaboration promotes science and peace

Just as face-to-face communication with international colleagues fosters trust and begets lasting collaborations, fair and open international exchange cultivates mutual understanding and respect between countries. Our border policies must carefully balance the tradeoffs between restriction and openness. Where possible, we should seek synergies. By facilitating collaboration with other countries on shared problems, we can encourage both peace and expedient solutions.

What can you do to help? Share this post, and present the central concept to your senators and representatives. Share the insight that our country’s economic prosperity and peace depend on international scientific exchange.

I am grateful to my international scientific colleagues for valuable comments on this essay: Dr. Christine Lamanna (American, based in Kenya); Dr. Bernardo Flores (Brazilian); Dr. Alberto Burquez (Mexican); and Dr. Karen Taylor (American).

 

Dr. Tyeen Taylor studies the shifting ecology of tropical forests amid the onset of rapid climate warming. He avidly shares the joy and practicality of scientific knowledge with non-scientists through films, photography, writing, and public events. Public Facebook page: /TyeenCTscience. Twitter: @TyeenTaylor. YouTube: Tyeen Taylor. Website: www.ttphilos.org

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone. The views and opinions expressed in this article do not necessarily reflect the official policy or position of the Massachusetts Institute of Technology.

A Real Chance to Help Coal Communities—If We Fight for It

On Tuesday the House Natural Resources Committee plans to vote on the RECLAIM Act, H.R.1731. The bipartisan legislation, sponsored by Congressman Hal Rogers (R-KY-5), would free up $1 billion in existing funding from the Abandoned Mine Lands fund and put people to work cleaning up abandoned coal mines. It’s common-sense legislation that uses existing money (did I mention this is NOT a new tax?!) to create thousands of jobs reclaiming degraded mine lands and putting those lands to use in ways that spur local economic development.

Unfortunately, corporate coal interests have launched a last-minute effort to kill the bill.

A key moment

Central Appalachian citizens’ groups have been working for more than two years to organize local support for the bill and find champions on the Hill. A national coalition of environmental groups and labor unions supports the bill. Grassroots support has been the driving force behind the bill being introduced last session, and reintroduced at the start of this session—this time with a Senate companion bill sponsored by the Majority Leader. But the current version lacks the requirement that reclamation projects help diversify local economies, which is why supporters are hailing the amendment introduced by Rep. Don Beyer (D-VA-8) to reinstate it. Here’s a letter signed by more than 40 local and national organizations, including the Union of Concerned Scientists.

Do these investments pay off? In a word, yes. The $90 million pilot project approved at the end of 2015 and again in 2016 funded projects in West Virginia, Kentucky, and Pennsylvania. An agriculture project in West Virginia is a great example of how abandoned mine lands might be put to commercial use.

Given how much our president talks about coal miners, you’d think he’d be on the phone personally calling legislators to pass this bill. Instead, he submitted a budget to Congress that cuts coal communities off at the knees.

What can I do?

Call your member of Congress. Now. Especially if you are represented by one of the members of the House Natural Resources Committee. They need to hear from you, because they’re hearing a lot of negative things about this bill from coal executives.

If you’re not sure who to call, just dial this special line that was set up to help people call in to support the bill: 1-347-269-4100

And your message is simple:

“The RECLAIM Act would create thousands of jobs in struggling coal communities. Vote yes on the RECLAIM Act—and the Beyer amendment—tomorrow. If you don’t serve on the Natural Resources Committee, share your support with Chairman Bishop.”

It won’t take more than 60 seconds. And it could make all the difference.

And, hey, Mr. President, if you’re reading this, feel free to make a few calls yourself—or even just tweet about it. It sure would be a great way to get some of those coal miners “working their asses off” (as you so colorfully put it).

Photo: Wikimedia

Better Ways to Describe the Trump Administration’s Attacks on Science

It is not exactly a secret that these are challenging times for both science and democracy in the US. From attacks on science and science-based policies, to the increasing body of evidence that we may not be able to count on the federal government to protect public health and safety, the days are long, and not just because of the summer solstice. Leading the Center for Science and Democracy at the Union of Concerned Scientists gives me, and my terrific colleagues, the opportunity to be in the middle of the fight to defend the role of science in our country. But there are a lot of things coming at us, and many of them are, to say the least, negative.

The Merriam-Webster dictionary (M-W) has come to our rescue with new ways to say “this sucks.” The increased ability to use descriptive language may save my sanity. So here goes, my first try at using this great new resource:

First on the list, Pessimum, defined by M-W as “the least favorable environmental condition under which an organism can survive.” This seems to fit the conditions for EPA employees to a tee. These dedicated public servants are now overseen by a hostile staff of political appointees with direct ties to regulated industry. Their Administrator not only ignores science but seems to go out of his way to make decisions contrary to the scientific evidence. And while many senior scientists and technical experts are leaving, with encouragement from the Trump administration, it seems nearly impossible for new talent to come on board at one of the premier public health agencies in the world. That’s about as unfavorable as conditions can get. In its definition, M-W illustrates with a quote about the Irish population during the Great Famine. Maybe next time they can refer to our poor underappreciated colleagues at the EPA.

Next, Catastrophe, defined by M-W as “utter failure”, which in the 16th century meant “the final action that completes the unraveling of the plot…” Sounds like the President’s budget proposal to me. Reductions in funding across programs AND personnel on the order of thirty percent at EPA, Interior, Energy, NOAA and other agencies, with science programs in the crosshairs. Reductions in grant funding proposed for NIH, NSF, and even cuts at the CDC. It is not even clear what the theory of change is here, other than “unraveling” or destruction. The budget proposal even signals that this administration doesn’t think universities should be able to charge overhead at levels that enable our great research institutions to continue to function and train new scientists. On second thought, maybe catastrophe is too mild a descriptor….

But then there are many uses for the next word in the list, Worstest. Even though M-W views its definition as “a substandard variant of worst”, it seems that the Trump Administration can lay claim to many of the worstest actions in its first six months of any administration in modern times. The program for regulatory rollbacks leaps to mind, including the President’s Executive Order requiring federal agencies to withdraw two regulations for each new one put in place. It is the worstest idea I have ever heard to base the decisions on public health and safety protections solely on costs to industry, with no consideration of benefits to the public. The whole reason for regulations is to protect the public interest, as a recent report from the Center for Progressive Reform so clearly lays out.

Merriam-Webster defines The Limit as “a very annoying and upsetting person or thing.” I have to go with withdrawal from the Paris Agreement as The Limit so far. Annoying and embarrassing for our country, yes. Upsetting? Oh yeah. Backing away from leadership in the world. Reneging on our agreements with the international community. Refusing to face up to one of the major challenges of our generation—that’s The Limit.

The word Putid, M-W tells us, means “rotten, worthless.” That’s a perfect description for the decision by EPA Administrator Pruitt to essentially gut the Board of Scientific Counselors (BOSC) for the agency. These independent scientists, appointed for their expertise, are non-partisan and have great value in helping guide the scientific program of work for EPA. They don’t weigh in on regulatory decisions. They are offering their expertise to make sure the science is strong. But apparently the new administration doesn’t want that advice and has failed to renew their appointments, suggesting they can reapply for these positions. Given that all of the Counselors are highly respected scientists with full-time jobs and were serving this extra duty in order to serve the public, that’s a putid offer.

Maleficent is an elegant word meaning “productive of harm or evil” according to M-W. Perhaps there is no better illustration than appointing lobbyists from regulated industries to oversee regulatory programs in federal agencies. Like the American Chemistry Council lobbyist who is now directly managing the new rules for protecting our families from toxic chemicals. Passing the law was a signature bipartisan achievement of the last Congress. But the rules are being maleficently weakened, making us all less safe.

Merriam-Webster has given us a few more on the list. But I should save those for another time, as I don’t think we have seen the last action by this administration that undermines the role of science in our democracy and causes us to reach for the dictionary.

It is better to end on a more positive note. M-W also gives us synonyms for “optimistic” such as auspicious, heartening, promising, propitious, and upbeat. And there is so much energy in the scientific community, along with those who care deeply about science, that we can and must fight back. This spring we saw an auspicious beginning for that energy in the March for Science. It is propitious that here at UCS we have seen more and more scientists joining our Science Network. That’s heartening. Later in July, we’ll be reporting in more detail on the Trump administration’s attacks on science during its first six months. So let’s fight back for our public health, safety, and the environment. We can win.

Photo: Freddie Alequin/CC BY-SA 2.0 (Flickr)

Climate Risk in the Spotlight of Chevron’s Annual Shareholder Meeting

Midland sits on the West Texas plains, an art deco mid-rise skyline rising over the broad landscape that stretches as far as the eye can see, dotted with pumpjacks, drill sites, and bright green-blue containment ponds.

I journeyed to Midland to attend Chevron’s annual shareholder meeting, held on May 31st, because I wanted to let the company know the importance of planning for a low carbon future. A resolution on the Chevron shareholder ballot requested that the company issue a report to assess how it can respond to climate change and the transition to a low carbon economy by altering the company’s energy mix or by acquiring/merging with companies that feature low carbon or renewable energy assets or technologies.

I set out to Midland along with Barbara Briggs from the Union of Concerned Scientists (UCS) and Dr. Wendy Davis, a scientist from Austin, to voice our support for proposals that increase corporate transparency with regards to climate change.

Midland

In Midland, petroleum is a way of life. Around the city and even at the airport, the influence of the oil and gas industry is ubiquitous, with many billboards advertising equipment and technology to improve and enhance oil recovery.

On our way into town, we stopped at the Permian Basin Petroleum Museum, which outlines the geology of the Permian Basin, its history from the wildcatters to today, and the role of petroleum in our daily lives.

The 1923 discovery of oil in the Permian Basin shifted what was a small ranching and railroad community into a major hub for the US oil and gas industry. The Permian Basin Petroleum Museum even hosts a Chevron-sponsored exhibit called “Chevron Energy City” that teaches children about various forms of energy.

Dr. Wendy Davis, Stephanie Thomas and Barbara Briggs at the Permian Basin pump jacks display in the Permian Basin Petroleum Museum, Midland, TX.

The Shareholder Meeting

The shareholder meeting itself was a brief affair. After passing through intense security (no purses, no electronics of any kind, no notebooks!), I made it into the building and took my place in a seat in the hall.

Some have asked me why I decided to go to the meeting. I happen to be a Chevron shareholder and a former Chevron employee. I have a background in Earth Science and have both studied ancient climate change and worked as a petroleum geologist.

I currently work with Public Citizen, a nonprofit organization that focuses on protecting health, safety, and democracy. When I heard that UCS was planning to attend the meeting, I jumped at the opportunity to join them. UCS brings a strong, clear voice for science, and I deeply respect their work on climate and beyond.

The week before the shareholder meeting, UCS organized a panel discussion on climate change and risk that highlighted some of the major risks corporations and communities face with regards to climate change.  That discussion confirmed for me that collectively, we need to act quickly.

Initially, there had been two items on the shareholder ballot dealing with climate change.

The first, a proposal for Chevron to report on company plans to deal with climate change-related risks, was withdrawn after Chevron published a report in March entitled “Managing Climate Risk: A Perspective for Investors.”

The second proposal, calling on management to report on the transition to a low carbon economy, got 27% of the shareholder vote—a healthy showing that will have the board’s attention.

Another shareholder resolution called upon Chevron to disclose company spending on lobbying. Discussing management’s opposition, CEO John Watson commented: “We [Chevron] have the right and responsibility to represent our interests.”

When Barbara Briggs asked about Chevron’s relationship with ALEC (the American Legislative Exchange Council), an organization that creates shadow legislation and has actively denied climate change, Watson upheld ALEC as a “leader” that played a “constructive role” in climate change policy discussions.

From what I heard at this once-a-year meeting with shareholders, it seems as though Chevron’s major plan for handling climate change is to focus more deeply on natural gas and efficiency. CEO Watson even commented that there will be “plenty of time” to respond to “risks” like production decline and environmental regulations. Physical risks, like the risk of storm surge, would be covered under the corporation’s comprehensive risk management plan.

Interestingly, at least half of the meeting was devoted to discussing climate change. And after the results of the ExxonMobil shareholder meeting, it seems like the movement pressing energy companies to plan seriously for a low carbon future is gaining traction.

As the meeting came to a close and people took to the exits, I overheard another shareholder say to a company employee, “With all this talk about climate change, has Chevron looked into hydrogen?”

 

Stephanie Thomas, Ph.D. is an earth scientist, researcher, and organizer with Public Citizen and an advocate for clean energy. She holds a Ph.D., M.S. and B.S. in Earth Sciences from Southern Methodist University, University of Nebraska-Lincoln, and Tulane University, respectively. Follow her on Twitter at @theHouston13.

 

When the Floods are Certain But (NOAA’s) Funds Are Not

Did you know? Beginning tomorrow and continuing through the weekend, our coasts can expect some of the highest tides and most extensive flooding of the year (we’ll see these King Tides again in July and August). The number of days with coastal flooding is expected to be 35% above normal this year, and “normal” already includes a steep increase over historic levels, courtesy of sea level rise. Well underway is an extra-active Atlantic hurricane season.

There are reasons for all of these things, some of which we’ll get to. And the fundamental reasons that we know about these reasons are (a) science and (b) NOAA—our National Oceanic and Atmospheric Administration and one of the nation’s most vital scientific agencies.

Did you know that just fourteen percent of US coastal counties (not states!) produce 45 percent of the nation’s gross domestic product (GDP)? NOAA also finds that close to three million jobs (that’s one in 45) are directly dependent on the ocean economy. And the ocean economy (using 2014 numbers) accounted for 149,000 business establishments and 3.1 million employees, totaling $123 billion in wages and contributing $352 billion in gross domestic product.

Hey, thanks NOAA, cool to know. And did you know that 180 million Americans use their hard-earned dollars annually to make 2 billion visits to beaches? In fact, coastal states receive about 85% of the tourist-related revenues in the US.

What happens on our coasts matters greatly to the US economy and, apparently, to our collective sanity.

Wait, there’s more. Did you know our coasts continue to see rapid population growth (39% of the US population) and development (1,355 building permits issued daily over a recent ten-year period), including in flood-prone areas, even as sea levels are projected to rise 4, 6, possibly 8 feet this century? (How do we know that? Science and NOAA.) Or that almost 1.9 million homes nationwide, worth $882 billion, are within reach of sea level rise by the end of this century?

The convergence of these trends creates colossal risks for our country’s economy, for businesses, and for people and their homes. These are tenuous circumstances along our coast that need to be managed carefully if our communities are to grow resilient in the face of such change. But many states and communities are trying to mitigate their risk and prepare for rising seas using data and tools, courtesy of NOAA. Its Digital Coast, for example, connects digital elevation models with sea level rise projections and enables sea level rise risk to be visualized and understood by residents and planners alike. And NOAA’s Coastal Zone Management Grants as well as the Sea Grant Program help provide funding resources to communities and state governments who match these funds to implement planning and projects that reduce flood risks and increase natural habitats.

What does a sensible federal administration do in light of all of the above?

Enhance NOAA’s ability to help coastal states and communities adapt to change, continue our nation’s thoughtful investment in this agency’s vital coastal resilience building resources, and double-down where risk is greatest.

What is this administration doing?

Proposing budget cuts that would gut NOAA’s capacity to do these things and outright eliminate some of these core taxpayer-built assets that provide vital data and build coastal communities’ resilience. Some of these NOAA programs date back to the 1970s and support communities’ ability to plan for, respond to, and mitigate coastal risk events. Here’s a deeper look at one of the many places that will experience sunny day flooding: South Carolina.

Since 1980, South Carolina has been hit by 51 billion-dollar coastal natural disasters, affecting the 1,241,048 people who call its 2,876 miles of coastline (including offshore islands, sounds, bays, rivers, and creeks) home. Approximately half (49%, or 175,613) of these people live in the floodplain (coastal and riverine), 54% of these residents are over the age of 65, and 56% are considered lower income or in poverty. In fact, a total of 83,833 coastal properties in South Carolina worth $45 billion could be under water due to sea level rise by the end of this century.

Recent studies indicate that we’ll see more coastal flooding due to sea level rise, which is happening faster than originally thought. At a minimum, we are likely to see a 10 cm rise in about 30 years’ time.

NOAA’s grants under National Ocean Service (NOS) and Oceanic and Atmospheric Research (OAR) have contributed greatly to helping make South Carolina’s coastal communities more resilient to these coastal hazards. For example, in one year (from FY 2015 – 2016) in South Carolina, the SC Sea Grant Consortium provided 25 coastal community resilience trainings, assisted 11 communities in implementing sustainable development practices and plans, enabled 6 mayors to develop regional plans for critical natural areas, and activated volunteers to restore 2,681 acres of beach habitat.

In 2017, South Carolina has received 12 grants from NOAA’s National Ocean Service (NOS) totaling over $25.5 million. Under the President’s FY18 budget, NOS funding could be cut by a quarter. South Carolina also received 9 grants from NOAA’s Office of Oceanic and Atmospheric Research (OAR) totaling almost $10.2 million; under the President’s budget, OAR funding could be cut by one-third.

Under the President’s FY18 budget, the NOS Competitive Grants under Coastal Science and Assessment, the Coastal Zone Management Grants, and the Regional Coastal Resilience Grants would all be zeroed out, as would the Sea Grant funding under OAR, including National Strategic Investments, Small Business Innovation Research, and state program funding, among other funding opportunities.

But what does that look like on the ground?

Here are a few snapshots of how NOAA is bringing South Carolina’s shorelines, and its communities, to life:

  • NOAA Sea Level Rise Viewer showing 4 feet of sea level rise (current projections estimate much higher sea level rise by the end of this century).

    NOAA’s Sea Level Rise Viewer, Regional Coastal Resilience Grants, and the City of Charleston, South Carolina: The City of Charleston is one of the hot spots for frequent tidal flooding, with a 409% increase in nuisance flooding since the 1960s. Based on NOAA projections, Charleston’s tidal flooding will increase from 2 days per year in 1970 to 180 days per year by 2045—the equivalent of experiencing flooding conditions every other day!

  • City officials are working to prepare for the next 50 years with a comprehensive, 5-phase sea level rise strategy with an estimated cost of $154 million and a completion target of 2020. The plan calls for hiring a chief resilience officer, as well as capital improvements that include better drainage systems, raising the elevation of streets, building and extending seawalls, and retrofitting public housing. With NOAA’s Sea Level Rise Viewer, city council members were able to use flood projection maps and realistic visualizations of sea level rise impacts on local landmarks to help inform their flood risk management strategies. Thanks to NOAA, the South Carolina Sea Grant Consortium will have $766,887 in funding through a Regional Coastal Resilience Grant to advance these critical sea level rise resilience and recovery efforts as described in its 2017 strategic plan.

Beachfront Vulnerability Index (BVI) for the southwestern end of Folly Beach, South Carolina. This map shows the ArcGIS Weighted Overlay assessment using data on elevation, long-term erosion rates, number of dunes present, wave height, tidal range, a habitable structure’s proximity to an inlet, and a habitable structure’s distance from the state’s lines of jurisdiction. The results of this assessment established a BVI score for each parcel.

  • Building Resilient Communities Using a Beachfront Vulnerability Index: In just the last two decades, South Carolina’s eight coastal counties have experienced rapid growth and erosion. In fact, NOAA’s data show 3,773 square miles of change (17 percent), including a 21 percent increase in developed areas from 1996 to 2010. To help improve resilience, South Carolina’s coastal zone management program used NOAA data on elevation, long-term erosion rates, number of dunes present, wave height, tidal range, and setback line and baseline locations to develop a beachfront vulnerability index. The index helps planners assess community exposure and susceptibility and provides a vulnerability score for each parcel along the South Carolina coast. It also informs mitigation and adaptation strategies in both local and state beachfront management plans.
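
    The weighted-overlay idea behind an index like this can be sketched in a few lines of code: each factor is reclassified onto a common vulnerability scale, then combined using weights. The factor names, weights, and scores below are illustrative assumptions for the sketch, not South Carolina’s actual BVI methodology.

    ```python
    # Illustrative weighted-overlay vulnerability index.
    # All factors, weights, and scores here are hypothetical examples,
    # not the values used in South Carolina's actual BVI.

    # Each factor is first reclassified to a common 1-5 scale
    # (5 = most vulnerable); the weights sum to 1.0.
    WEIGHTS = {
        "elevation": 0.25,
        "erosion_rate": 0.25,
        "dune_count": 0.15,
        "wave_height": 0.15,
        "tidal_range": 0.10,
        "inlet_proximity": 0.10,
    }

    def bvi_score(factor_scores: dict) -> float:
        """Weighted sum of reclassified factor scores for one parcel."""
        assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
        return sum(WEIGHTS[f] * factor_scores[f] for f in WEIGHTS)

    # Example parcel: low elevation and fast erosion push the score up.
    parcel = {
        "elevation": 5, "erosion_rate": 4, "dune_count": 3,
        "wave_height": 2, "tidal_range": 2, "inlet_proximity": 4,
    }
    print(round(bvi_score(parcel), 2))  # a score on the 1-5 scale
    ```

    In a GIS tool this same calculation is applied cell by cell across raster layers; the sketch just shows the scoring arithmetic for a single parcel.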

    Community-Based and Larger-Scale Oyster Restoration in ACE Basin NERR, South Carolina

  • The ACE Basin National Estuarine Research Reserve (NERR), named after the Ashepoo, Combahee, and Edisto Native American tribes (as well as the rivers that bear their names), is located in South Carolina’s St. Helena Sound and is one of 29 reserves that make up the National Estuarine Research Reserve System (NERRS). With funds from the NERR Science Collaborative, the reserve and the state worked together to engage more than 1,000 community volunteers to restore two miles of shoreline with vital oyster reefs, building coastal communities’ resilience to storms and sea level rise.

Each of these programs requires buy-in at the state and local levels with an in-kind match to the federal investment. The SC Sea Grant Consortium, for example, leveraged $2,649,008 in funds (FY 2015–2016), the equivalent of a 433 percent return on the state’s investment, not to mention $4.9 million in economic benefit and 167 jobs. Thanks NOAA!

As our friends in Charleston, SC and all of America’s coasts witness the sunny day flooding today and the next few days, we know that their minds won’t be on President Trump’s latest tweet or whether he will call them to discount the reality of sea level rise. Instead these communities will be wondering how to continue to survive and thrive as this administration guts the very resources they need to see tidal flooding and other coastal hazards coming, and determine how to respond.

Time to defend NOAA’s budget

Without NOAA’s Digital Coast, NOAA Sea Level Rise Viewer, Coastal Zone Resilience Grants, or the Sea Grant Program, among many others, these communities will be flying blind and patching together funding to keep their sea level rise strategy moving forward, even as they face faster rising seas and more frequent and intense hurricanes.

Now is a good time to reach out to Congress to ask them to keep these places in mind and to resist these cuts to NOAA’s budget. Here and here are more reasons why.


The Case Against Clovis: Why Trump’s USDA Chief Scientist Nominee Is the Wrong Choice

President Trump is dangerously close to violating the law (no, not what you’re thinking!). Recently, word began circulating that the President plans to fill the role of Chief Scientist at the Department of Agriculture (USDA) with…you guessed it, someone who has no scientific background. If the nomination of Sam Clovis—a conservative talk show radio host and former Trump campaign co-chair with a doctoral degree in public administration—moves forward, it would not only be in direct violation of the law, but would risk the safety of our food and water, and the well-being of thousands of American farmers and communities.

One scientist, two hats…

Of the thousands of scientists who work with USDA, many do so to advance agricultural research. USDA invests billions in agricultural research annually—$2.9 billion in FY2016—and that investment is overseen by the Under Secretary of Research, Extension, and Economics (REE).

The REE Under Secretary—the position for which Clovis’ name has been floated—is responsible for disbursing all of these funds through dozens of programs and entities, such as the Agriculture & Food Research Initiative (AFRI), the Sustainable Agriculture Research and Education Program (SARE), and the Organic Agriculture Research & Extension Initiative (OREI), all of which invest in research that supports farmers, rural communities, and consumers.

But that’s not all. The REE Under Secretary also fills the role of USDA’s Chief Scientist. The Chief Scientist is in charge of the Office of the Chief Scientist (OCS), which is tasked with identifying, prioritizing, and evaluating “Department-wide agricultural research, education, and extension needs.” A core component of this work is the responsibility to advance scientific integrity at USDA by “ensuring that research supported by and scientific advice provided to the Department and its stakeholders is held to the highest standards of intellectual rigor and scientific integrity.”

…and three reasons to say no

So, even though Clovis isn’t a scientist, does that make him unfit for the job? According to the U.S. Code, yes! But that’s not the only thing going against his potential nomination:

  1. It would violate the law. The REE Under Secretary is a tremendously important position, responsible for investing billions of dollars into agricultural research that should help U.S. farmers, communities, and consumers. Congress acknowledged this by cementing the following in statute: “The Under Secretary [of REE] shall be appointed by the President, by and with the advice and consent of the Senate, from among distinguished scientists with specialized training or significant experience in agricultural research, education, and economics.” (7 U.S.C. 6971). Yet, what’s known of Clovis’ background demonstrates virtually no “specialized training or significant experience” in any of the relevant fields.
  2. Functions & duties. As written by Congress, one of the primary duties of the REE Under Secretary is to “identify, address, and prioritize current and emerging agricultural research, education, and extension needs.” This task requires a sound understanding of the breadth of agricultural scientific literature, and furthermore, a belief in numbers and facts. Former Secretary of Agriculture Dan Glickman said recently that it would be “challenging” to have someone without a scientific background as REE Under Secretary, and former REE Under Secretary Catherine Wotecki said that the role should be filled by “a person who evaluates the scientific body of evidence and moves appropriately from there.” Yet, Clovis has called even the most basic scientific research into question. In 2014, while running unsuccessfully for an Iowa Senate seat, Clovis twice said he was “skeptical” of the science of climate change (here and here). If Clovis were to take the Under Secretary position at USDA, his skepticism would transform from an ignorant personal belief to an egregious affront to American farmers and rural communities. Because whether he believes it or not, farmers are experiencing the effects of a changing climate every day.  From hotter summers that hurt crop yields, to more extreme rains that wash soils away, to more erratic winters that threaten cold-requiring crops, the obstacles farmers are facing are real. They deserve the attention of someone who understands, rather than dismisses, their challenges. And if, like me, you’re not a farmer, the scientific research supported by the USDA impacts you too. From food safety, to basic nutrition, to water quality – no matter where you live, USDA supported research is finding answers which will lead to a safer, healthier life for millions of American families.
  3. Scientific integrity. The Chief Scientist is responsible for the advancement of scientific integrity at USDA, which recently improved its scientific integrity policy. In April 2017, the USDA Office of Inspector General released survey data in an attempt to quantify what USDA scientists thought of the Department’s scientific integrity policies. While the survey has recently been removed from the website (you can still find the full survey here), among the findings were 29 scientists (2 percent of those surveyed) who indicated that entities external to USDA had pressured them to alter their work, and 42 scientists (3 percent of those surveyed) who indicated that a Department official had pressured them to omit or significantly alter their research findings for reasons other than technical merit. For an individual with no scientific background or expertise, it can be next to impossible to oversee, let alone improve, an issue as complex and important as scientific integrity. This is particularly true when that individual has questioned even the most basic science (see #2).

On November 8, 2016, President Trump rode a wave of support from rural America into the Oval Office. Since then, his Administration has abandoned even the most elemental scientific facts. For the rural Americans who helped catapult him to the Presidency, this has become particularly poignant.

Unfortunately, the nomination of Sam Clovis isn’t a solution. It will only make the wound even deeper.

On the Business Case for Renewables (and Why It’s so Strong)

I’m not a CEO and you won’t find me hawking financial self-help books at the airport, but there are three core tenets of business success that I hope we can agree on: keep your product affordable, minimize your exposure to risk, and keep your customers happy.

If you’re an electric utility, sticking to these core tenets is particularly tricky. Fortunately, there’s one investment decision that can keep you on track: renewable energy.

Renewables are cheap today, tomorrow, and for the long haul

The cost of renewable energy resources has fallen dramatically in recent years, making them an attractive investment option for utilities.

The dramatic decline in costs for wind and solar resources is well documented and impossible to ignore. Costs for renewable resources have now fallen to the point that they’re the cheapest new-build generating resources out there, and utilities looking to maintain an affordable electricity supply are taking notice.

Take, for instance, the transition underway in two states here in the Midwest. In May, Michigan’s DTE Energy—the seventh largest utility in the US—announced a forward-looking plan that would eliminate its use of coal and move the utility to a mix of about 40 percent renewables, 40 percent natural gas, and 20 percent nuclear by 2050, cutting the utility’s carbon emissions by 80 percent. DTE CEO Gerry Anderson proclaimed: “Not only is the 80 percent reduction goal achievable—it is achievable in a way that keeps Michigan’s power affordable and reliable.”

In Minnesota, Otter Tail Power’s latest plan for meeting energy demand also calls for a shift to renewable energy—more than 30 percent by 2030. And this on the heels of the state’s largest utility, Xcel Energy, receiving approval late last year for its least-cost plan that includes 40 percent renewable energy in the same timeframe.

Both of these plans would take the utilities well beyond what’s required under Minnesota law; renewable energy investments are no longer about regulatory compliance, but about keeping electricity prices low in a rapidly changing electricity sector.

Renewables provide certainty in a world of uncertainty

Let’s put ourselves in the utility’s shoes for a moment. We’re trying to figure out how to keep the lights on, make smart investments for our shareholders and customers, and avoid a lot of political drama. And we’re trying to do that in a very uncertain world: how much will fuels cost five or ten years from now? What regulations will come and go? How much demand for electricity will there be?

Where can a utility invest to minimize the risks of an uncertain future?

Renewable energy.

Fossil fuel investments pose a number of risks for utilities because of uncertainty about future fuel costs, regulatory requirements, and energy demand. Renewable energy resources offer a low-risk option for investment, helping to protect investors and consumers.

Utilities are now realizing that renewable energy is a low-risk investment that can help maintain stable rates and avoid unexpected costs down the road. In fact, wind and solar resources are some of the lowest-risk options for meeting electricity demand. With renewable energy, there’s no risk of higher-than-expected fuel prices (because the wind and sun are free), little risk of costs to comply with unanticipated environmental regulations (because wind and solar power have relatively little environmental impact), and little risk of over-investing and being stuck with stranded assets (because wind and solar can be added in small increments and installed relatively quickly).

All of this makes renewable energy an attractive investment for utilities looking to minimize their exposure to risk.

The customers want renewables and the utilities want happy customers

Nearly half of Fortune 500 companies and the majority of Fortune 100 companies now have clean energy goals. In fact, corporate demand for renewable energy is growing rapidly not just to meet sustainability goals, but because companies are looking for the low, stable energy prices that renewable energy provides.

And if the utility can’t provide it, they go elsewhere—signing contracts directly with wind and solar power providers and cutting the utility out of the deal.

More and more large utility customers are seeking access to renewable energy, pushing utilities to ramp up investments in these resources.

Utilities are responding with new products to meet the growing corporate demand for renewable energy. In Michigan, the state’s largest utility, Consumers Energy, recently proposed a new large customer renewable energy pilot program that will allow corporate customers to power their companies with renewable energy.

The new program comes in direct response to a request from telecom company Switch, Ltd., whose new Michigan data center will be powered 100 percent from new wind projects. In its filing, Consumers Energy stated that in a survey of its large business customers, more than half expressed interest in greater access to renewable energy.

More than 15 states now have a way to facilitate corporate renewable energy purchases through their utilities. And it’s a smart move by utilities looking to retain customers and attract new business.

Overall, today’s renewable energy resources are an attractive investment option for utilities and renewable energy’s future in the US continues to be bright. Core business principles remain: provide affordable products, avoid unnecessary risk, and keep your customers happy.

Renewable energy does all that.


7 Things We Expect to See in Rick Perry’s Unnecessary and Biased Grid Study

On April 14, Energy Secretary Rick Perry requested a 60-day “Study examining electricity markets and reliability.” The study was scheduled to be released on June 26, but it now appears it will be delayed until July. Perry’s letter calling for the study is riddled with flawed assumptions and predetermined conclusions about the value so-called “baseload” coal and nuclear power plants provide to the grid and the impacts renewable energy has on reliability—conclusions that contradict overwhelming evidence from dozens of studies by DOE’s own national labs, regional grid operators, and even Perry’s home state of Texas.

Do you think we’re going to learn anything new in 60 days that these experts and real-world experience haven’t already answered over the past decade?

Secretary Perry’s biased and unnecessary study is yet another blatant attempt by the Trump Administration to prop up the ailing coal industry and undermine important renewable energy policies that are providing clean, reliable, and affordable power to consumers. Renewable energy business associations, Senate Democrats, and even prominent Senate Republican Chuck Grassley of Iowa have raised similar concerns about the study’s motivation and credibility.

What does a credible study look like?

Typically, important taxpayer-funded government studies are done in an open and transparent manner over a period of several months or years, with input and review from outside experts, key stakeholders, and the public. This approach helps balance varying viewpoints, avoid political interference, and ensure objectivity.

Here are two examples of DOE studies on renewable energy and reliability that were done the right way:

  • NREL’s 2012 Renewable Electricity Futures Study, a massive 850-page study developed by more than 100 experts from 35 diverse organizations and peer-reviewed by more than 140 experts. The study found that with a more flexible electricity system, grid operators would be able to balance electricity supply and demand and maintain reliability in every hour of the year with renewable energy providing 80 percent of US electricity by 2050.
  • DOE’s 2015 Wind Vision Study, a comprehensive analysis of the costs and benefits of producing 20 percent of US electricity from wind power by 2030 and 35 percent by 2050. More than 250 experts and 50 organizations—representing the wind industry, utilities, grid operators, non-governmental organizations, and four DOE national labs—contributed to the report.

Perry’s study doesn’t meet these standards

In addition to the absurd 60-day deadline, the study is being conducted behind closed doors with no input or review from outside experts or the public. And the research questions (if you want to call them that) have either been answered already or are clearly biased against renewable energy.

To make matters worse, the study is being directed by individuals who have been openly hostile to renewable energy and supported by the fossil fuel industry. Travis Fisher and his boss Daniel Simmons, appointed by President Trump to oversee DOE’s Office of Energy Efficiency and Renewable Energy (which they once recommended eliminating), are former employees of the Institute for Energy Research (IER), and its advocacy arm, the American Energy Alliance (AEA), which actively supports rolling back state and federal climate and clean energy policies. (For more details, see these blogs by Elliott Negin and Dave Anderson.)

In 2015, Fisher wrote a report for IER calling clean energy policies a greater threat to reliability than extreme weather, cyber attacks, or terrorism. To address this so-called threat, Fisher recommended repealing federal renewable energy tax credits, state renewable energy standards, state net metering policies, and the EPA’s Clean Power Plan and Mercury and Air Toxics Standards.

It’s no secret that the Trump Administration is targeting many of these policies. Perry also made a highly controversial comment at a Bloomberg New Energy Finance Conference in late April saying they were having “very classified” conversations about DOE potentially overturning state and local renewable policies in the name of national security.

What we would expect to see in a rigorous study

If Perry’s grid study is done right, here are 7 important things we would expect it to show based on current trends and recent credible studies:

  1. Renewables are diversifying the electricity mix (see pie charts), making the grid more reliable and resilient. Regional grid operators and utilities are already integrating wind and solar at levels of 50 to 60 percent or more of total electricity demand in some parts of the country, including Texas, while maintaining and even improving reliability.
  2. The national labs, regional grid operators, utilities and others have completed dozens of studies showing that the US can achieve even higher levels of renewable energy in the future, while producing reliable, affordable, and cleaner electricity, as explained in this letter signed by UCS.
  3. Baseload power plants pose their own reliability challenges because of their large size, limited flexibility, and vulnerability to extreme weather events such as the Polar Vortex, extreme heat and drought impacts on cooling water, and storm surge from hurricanes. This 2013 DOE report highlights numerous climate and extreme weather-related risks to our energy infrastructure.
  4. There is widespread agreement among energy experts that low natural gas prices and flat electricity demand—not renewable energy—are the main causes of recent coal and nuclear retirements, as highlighted in a new report by the Analysis Group.
  5. Fossil fuels and nuclear power have received far more subsidies than renewable energy historically, and are part of the permanent tax code while tax credits for renewables are set to phase out in a few years.
  6. The costs of utility scale wind and solar have fallen by more than two-thirds since 2009, which has made renewable energy more affordable to consumers.
  7. Federal tax credits and state renewable standards have been key drivers for the cost reductions and recent deployment of wind and solar that are creating new jobs and other economic benefits across America, particularly in states and rural areas that voted for President Trump.

Renewable energy and natural gas are diversifying the US electricity mix

Source: Energy Information Administration

The study, not renewables, is a waste of taxpayer money

If Perry’s study reaches different conclusions, or cherry picks information that supports the Trump Administration’s predetermined conclusions, it should raise a major red flag. Perhaps Republican Senator Chuck Grassley from Iowa (which gets 36 percent of its electricity from wind) said it best in his letter to Perry: “I’m concerned that a hastily developed study, which appears to pre-determine that variable, renewable resources such as wind have undermined grid reliability, will not be viewed as credible, relevant or worthy of valuable taxpayer resources.”

Science Needs to Learn Lessons from the LGBTQ Rights Movement

The recent March for Science did not help public support for science. That is what the majority of Americans told a recent Pew Research Center survey and what certain news outlets are quick to put in their headlines. My response: Who cares? If my years of organizing for LGBTQ rights taught me anything, it’s that the success of the march should not be measured by the day, but by the movement it creates.

I am a scientist by academic background. However, I spent more time organizing protests and rallies in support of LGBTQ rights than I ever did on my physics homework (and I have the grades to prove it). At one point, I even joined the board of a newly formed local grassroots LGBTQ rights organization. The group had a few very energetic members who were always looking for the next reason to hold a protest in downtown Boston, one of the most LGBTQ-friendly places in New England.

Events like these were incredibly important, but we were not able to single-handedly change the hearts and minds of the country on issues like marriage equality through the cunning use of protest signs. Despite the beautiful artwork and creative slogans, the only people who really saw them were people who agreed with us. Even worse, after spending some time looking at the communication of climate science, I’m fairly certain that our signs would only harden the opposition in their worldview.

Knowing that these protests would be a total waste of time unless it led to direct political action, I organized a volunteer team to go through the crowd at every rally armed with clipboards. Their instructions were to get the contact information for as many people who attended the rally as possible. We then recruited people from those lists to be volunteers on future actions that were focused on political impact.

In the months that followed, we put them to work making phone calls and knocking on doors all over New England. The goal of this effort was to identify registered voters from neighboring states who supported marriage equality and ask them to directly lobby their state representatives. It was part of a broad campaign to win marriage equality throughout all of New England and, five years later, we succeeded.

This amazing feat was not the direct result of any one of our marches or rallies. Those events were simply a catalyst used to build momentum for our cause. The real impact came from the hard work our rallied-up supporters took on in the years that followed.

Tens of thousands of science supporters braved the rain to support the March for Science in Washington, DC. Photo credit: D. Pomeroy

With this perspective, I think it’s fair to declare the March for Science a huge success. Tens of thousands of people braved the weather to show up to a sopping National Mall in Washington DC, stand in a downpour for four hours, and then march through the rain to Capitol Hill. There were also more than 600 satellite marches across the world. Thousands of people showed up in places like Boston, Los Angeles, New York and as far away as Sydney.

Whether or not the March will have impact on public support for science is now left up to what we do with the energy of the crowds we turned out. To be successful we will need to get people involved in every aspect of the movement. We will need scientists to speak out in their local communities to explain the importance of their research. We will need supporters to attend local school board meetings and ensure the next generation receives a science-based education. We will need everyone to go to their local, state, and national legislators and demand evidence-based policy. Some of us may even need to leave the lab and run for office.

Luckily, we are not starting from scratch in this endeavor. I am hopeful that long standing science advocacy organizations, like the Union of Concerned Scientists and the American Association for the Advancement of Science, will be able to team up with newly forming organizations, like the March for Science and 314 Action. Together we can take this momentum forward and make real change. However, it will take time and it will take a sustained effort.

In the meantime, if you’re able to make it to a Pride event this month make sure to sign a petition or two. If an organizer follows up with you, don’t be afraid to take the next step and become a volunteer. Your involvement will not only be good for the cause; it will teach you a bit about political organizing. And, if we’re going to turn the massive crowds at the March for Science into a movement, we’re going to need as many organizers as possible.


Dr. Dan Pomeroy received his Ph.D. in physics from Brandeis University in 2012, studying high energy physics as part of the ATLAS experiment at CERN. He then served as a postdoctoral fellow at the National Academy of Sciences and as an AAAS Science and Technology Policy fellow in the office of Senator Edward J. Markey. He also has extensive experience in grassroots political organizing, running LGBT rights campaigns as well as field offices during the 2008 elections.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone. The views and opinions expressed in this article do not necessarily reflect the official policy or position of the Massachusetts Institute of Technology.


Tropical Storm Cindy Brings Life-Threatening Flood Risks: Atlantic Hurricane Season Arrives Early and Strong

Several cities along the Gulf Coast have declared states of emergency in preparation for Tropical Storm Cindy, which will likely bring extreme precipitation and pose life-threatening flood risks. Here’s what to watch for to stay safe during this storm.

The greatest danger, according to the National Hurricane Center, is that Cindy could produce life-threatening flooding across portions of the Gulf Coast.

The Atlantic tropical storm and hurricane season started early in 2017 with Tropical Storm Arlene in April.

It is quite rare for a tropical storm to form in the tropical Atlantic Ocean east of the Antilles “Main Development Region” before July 1. Yet Tropical Storm Bret is only the third to do so in over a century and a half. Bret has already brought high winds and rain to Trinidad, Tobago, and Margarita, and dissipated near Bonaire, Curaçao, and Aruba.

Now, for the first time in nearly half a century, two named storms have occurred simultaneously in the Atlantic before July: Bret and Cindy.

Precipitation can become dangerous

Projected Tropical Storm Cindy precipitation totals issued June 21, 2017 at 7:48 AM Central Daylight Time. Source: National Weather Service New Orleans/Baton Rouge LA.

The National Hurricane Center declared, soon after Tropical Storm Cindy formed, that flooding is the primary hazard for the U.S. Gulf Coast.

In parts of Louisiana, Mississippi, and Alabama, maximum precipitation totals could reach eight inches in places, with torrential rain stretching from Texas to the Florida panhandle. This comes less than a year after the historic rainfall in Louisiana in August 2016, which dumped more than three times as much rain as Hurricane Katrina did in 2005, leading to record flooding in several rivers. Unfortunately, residents of South Louisiana are still digging out from that catastrophic flooding.

Watch the factors influencing coastal water level: wind, tide, waves, freshwater contribution

Look for the projected wind characteristics in places of most concern. Winds whip up the waves and can increase damage.

The direction and severity of the winds really matter in a spiraling tropical cyclone. Note that the highest potential for storm surge is typically away from the center of a tropical cyclone’s track. Winds that push water onshore can pile up water locally, while winds that push water offshore can keep high tide lower than projected.

Watch for the possibility that the local storm surge peak coincides with high tide. Check the phase of the moon and other conditions that can lead to a higher (or lower) than normal tide.

Know if a “King tide” – among the highest tides in a year – is expected during a storm. Gulf Coast tidal ranges are typically smaller than those of King tides along U.S. East Coast locations. Watch for local National Weather Service flood projections that account for the local wind, tide, waves, and freshwater contributions to local water level. For example, today water levels are far above normal near Houston, TX, at the tide station at Galveston Bay Entrance, North Jetty, TX.


High water conditions June 21, 2017 at Galveston Bay Entrance, North Jetty TX near Houston. Source: NOAA

Watch for improved tropical cyclone forecasts with new images coming from satellite GOES-16

The scientific community is gushing over the early images from the GOES-16 satellite (still in its experimental phase) of tropical storms Bret and Cindy. The most advanced weather satellite NOAA has developed, it is planned to be positioned as GOES-EAST after being declared operational in November. In that capacity it will cover the main development region for Atlantic tropical cyclones and the entire continental US.

#GOES16 captured this brilliant loop of Tropical Storm Cindy (formerly PTC3) in the Gulf this afternoon! Forecast @ https://t.co/LFPB6b4IJu pic.twitter.com/lwjwSnOWfO

— NOAA Satellites (@NOAASatellites) June 20, 2017

Watch the leadership and budgets of key agencies

GOES-16 satellite will be positioned as GOES-EAST after being declared operational. Source: NOAA

Investments in US science agencies' research, instruments, and operations have improved Atlantic tropical storm and hurricane track forecasts. Fewer people have to evacuate, mayors and governors have more advance notice to warn citizens, and emergency responders have more time to get ready.

Watch the budgets of NOAA, NASA, DOI, NSF, and FEMA, and if necessary, tell Congress that it must ensure these agencies have the resources and tools they need and can retain top talent, so that all Americans in the path of tropical storms and hurricanes receive the most advanced and accurate warnings to stay safe, and so that emergency preparedness and recovery efforts are well funded. Robust leadership in key agencies is also critical: FEMA administrator Brock Long was confirmed just yesterday, but NOAA and NASA still do not have administrators.

If you live in the path of Cindy, please take heed of local advisories and stay safe. I’ll be updating this blogpost in the days to come.


Want to Squash Science? Follow Pruitt’s Lead at the EPA

The Trump administration doubled down on its overhaul of the EPA's Board of Scientific Counselors (BOSC) this week, notifying members whose terms end in 2017 and early 2018 that they will not be renewed and cancelling subcommittee meetings for the rest of the year. That leaves just 3 of 18 executive committee members and 11 of 49 subcommittee members in place, and dismissed members have just 10 days to reapply for their positions.

This will stall the work of the committee, which was set to begin looking at the EPA’s Office of Research and Development’s (ORD) plans for the next five years. According to the Board’s chair, Deborah Swackhamer, it’s an inauspicious sign of what’s to come for other advisory committees with members whose terms are coming to an end, like many members of EPA’s Science Advisory Board.

Scientific advisory process is already open, inclusive, and diverse

Scientific advisory committees, like BOSC, provide independent advice to the federal government, allowing agencies to seek outside expertise on critical issues. The EPA has seven advisory committees that are scientific or technical in nature, including the Science Advisory Board (SAB), the FIFRA Scientific Advisory Panel (SAP), the Board of Scientific Counselors, the Clean Air Scientific Advisory Committee (CASAC), the Chemical Safety Advisory Committee, the Environmental Laboratory Advisory Board, and the Human Studies Review Board.

The most recent membership includes a wide range of expertise, with scientists from a variety of backgrounds whose affiliations range from academia to the private sector.

As appropriate, a diversity of expertise is represented on these advisory committees, the work of which ranges from reviewing epidemiological studies on a particular pesticide to advising the EPA on a contentious scientific review, like fracking impacts on drinking water.

Advisory committees should be made up of a balance of expertise, not just because it’s required under the Federal Advisory Committee Act but because it promotes a breadth of opinions, an opportunity for creative solutions to complex problems, and challenges members to come to a consensus on certain charge questions.

The EPA's advisory committees are also very transparent. All members serving on these committees must sign ethics forms, disclose funding from the past two years, and, if there is a clear conflict of interest, be prepared to submit a waiver to the administrator explaining how it will be mitigated.

This is about science advice, not politics

A representative from the American Chemistry Council told the Washington Post that the emptying out of BOSC was welcome and would address their “concerns in the past that EPA advisory boards did not include a diversity of views and therefore frequently presented a biased perspective on issues before them.” But this criticism doesn’t make sense for the BOSC.  Its executive committee is composed of scientists with diverse research expertise, equipped to give advice on ORD’s research agenda without making policy prescriptions where a bias would come into play.

I want to reiterate something that the BOSC's chair repeated several times in a hearing before the House Science Committee in May. She told members of Congress that "robust science, not politics, should form the bedrock" of environmental policies, and that all of the members who could have been renewed for a second term had been fully qualified, vetted, and recommended by ORD.

There was no justification for their dismissal. Swackhamer was rightfully offended by the implication that BOSC members would display allegiance to the administration under which they were appointed, telling the Star Tribune: "I resent the fact that we are considered biased because we were appointed during [President Barack] Obama's tenure. I would behave the same way if I were appointed by Pruitt."

Independent science advisors should be judged based on their qualifications to make scientific recommendations, which don’t change just because there’s a new EPA administrator. Now instead of working to help ORD decide how in the world they will deal with likely cuts to several of its research programs, BOSC will be incapacitated for at least the next year, which according to one of the subcommittee members who resigned last month, Peter B. Meyer, will mean that “guidance to shape the coming agenda will be lost so cost-effectiveness of research will suffer, as will science.”

Of all of the EPA’s advisory committees, BOSC is the last one I would expect to be caught up in a political fight. And I’m still worried about the future of EPA’s other important science advisory committees. Let’s remember that the SAB often works on policy-relevant issues, and its current nomination process is already stalled this year.

Every year since 2007, the EPA has put out a call for nominations around the same time in April. This year, that call never came, despite 15 slots opening up this September that need to be filled. Likewise, CASAC's chair's term ends in September, and the EPA has so far done nothing to recruit her replacement. EPA staff report that Pruitt has a draft notice for nominations on his desk, but he has not yet issued it. Without a chair, CASAC will be unable to function, and thus unable to issue the recommendations on soot and sulfur oxides that the EPA relies on for further air quality rulemaking.

The crucial role of independent science advice

Delaying the work of these advisory committees is another example of this administration's apparent philosophy of death by delay. The inability of CASAC to continue advising the EPA on its National Ambient Air Quality Standards means the EPA will go without independent scientific advice on the six criteria pollutants: ozone, particulates, carbon monoxide, sulfur dioxide, nitrogen oxides, and lead. Changes to BOSC mean that ORD will have less input on how to make cost-effective changes to its research programs while encouraging important scientific development. And if the SAB gets treatment similar to BOSC's, its ongoing work to review EPA's water quality criteria for protecting aquatic life and to peer review EPA's toxicological assessments of several chemicals, including ethyl tert-butyl ether, will languish.

We must protect federal advisory committees, for they are the objective body that every agency needs in order to take on some of the most difficult questions facing our country today. The work that these advisory committees do feeds into the crucial research conducted by and the safeguards issued by the EPA. Advisory committee members volunteering their valuable time to public service to advise agencies on how best to protect us and the environment are unsung heroes. We should acknowledge and praise them more often.

It’s a shame that Administrator Pruitt is doing the opposite: actively working to offend committee members by questioning their objectivity under a new administration and devaluing their work by shifting committee composition and cancelling committee meetings. Gutting our government’s capacity for independent science advice is an attack on science. We will be keeping a watchful eye on the appointment of new BOSC committee members to make sure that scientific expertise and research experience, not political views, are the basis for qualification.

Sea Level Rise and High-Tide Flooding Outlook Make It to NOAA’s Climate Update

On June 15, the National Oceanic and Atmospheric Administration (NOAA) held its Monthly Climate Update press conference, in which it releases the global temperature for the previous month. The big piece of information in this press conference usually comes on the very first slide of their presentation, which includes the measured global temperature for the month, and how much it deviates from the 20th century average of 58.7°F.

This month, however, something interesting happened: NOAA didn’t have the numbers, and mentioned right off the bat that they would be released the following Monday (June 19), hence this blog being published later than usual (for those of you who keep tally!).

Anyway, the month of May was the third warmest on record, at 1.49°F above average, and the global average temperature for January–May 2017 was 1.66°F above the 20th century average of 55.5°F, making it the second warmest January–May period in the 138-year record, behind only 2016 by 0.31°F.

Back to the press conference: it went through the usual motions of precipitation, temperature, and drought data, and future projections for those. But another thing was different: this month included a briefing on 2016 coastal flooding and sea level rise, and a future outlook for 2017.

NOAA 2017 high-tide flood "outlook"

Sea levels continue to rise

NOAA and the National Weather Service have long issued reports and alerts on flooding, especially flooding related to coastal storms. They publish these updates as needed, on their website and through other communication channels, and we rely on them. We can always count on NOAA to keep us informed of weather and flood dangers.

What I am talking about here is different. While this press conference has seasonally included information about snowpack, fires, or groundwater, this is, to my memory, the first time it has included a sea level rise report on flooding, specifically high-tide flooding, along with an outlook.

According to William Sweet, a NOAA oceanographer, flood frequencies are not just increasing but accelerating (the rate of increase is itself increasing) due to sea level rise, with increases of up to 1,000 percent since the 1960s in some places.

In the briefing, Sweet explained that "seasonal high tides and minor wind events now cause high tide flooding in many locations", and flooding in 2016 followed or surpassed the increasing trend, running 130 percent higher on average compared to 1995. Two locations broke their records for number of coastal flood days in 2016: Charleston and Savannah, with 50 and 38 days respectively. Key West tied its record with 14 flood days. And the 2017 outlook calls for as many as 30 days above the trend in some East Coast locations such as Lewes, DE, and Atlantic City, NJ; in other words, this year is expected (for various reasons) to see flood frequencies greater than the sea level rise trend alone would predict.
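Flood-day counts like these come from comparing observed water levels against a station's local minor-flooding threshold. A toy sketch of the bookkeeping, with made-up numbers (the threshold and readings below are illustrative, not Charleston's actual datum or record):

```python
# Count "flood days": days whose highest observed water level meets or
# exceeds the local minor-flooding threshold. All numbers are illustrative.
MINOR_FLOOD_THRESHOLD_FT = 2.6  # hypothetical threshold above the tidal datum

daily_max_water_level_ft = {
    "2016-10-07": 3.1,
    "2016-10-08": 2.8,
    "2016-10-09": 2.4,
    "2016-11-14": 2.7,
    "2016-12-13": 2.2,
}

flood_days = [day for day, level in daily_max_water_level_ft.items()
              if level >= MINOR_FLOOD_THRESHOLD_FT]
print(len(flood_days))  # 3 of the 5 days met or exceeded the threshold
```

Run over a full year of observations per station, a tally like this is what produces totals such as Charleston's 50 flood days.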

So, in addition to reporting on record temperatures, precipitation, and drought, and releasing its usual outlooks for those, NOAA briefed us on record flooding. Even if that does not become a regular feature, it ought to perk up one's ears.

Tidally driven coastal flooding is one of the most visible signs of sea level rise

As residents of Lewes, DE (and other locations along the US coast such as Miami Beach, FL and Norfolk, VA) say, “if you don’t think sea level rise exists, come visit us”.

Those are places that are currently dealing with sunny day flooding that disrupts their everyday lives to a point well beyond “nuisance” – which is what these types of floods used to be called.  People can’t get around whole neighborhoods, school buses cannot make their stops, and trash cannot be picked up when waters are high.

According to studies done by UCS (see US Military on the Frontlines, Encroaching Tides, Surviving and Thriving), the future, at least in terms of high-tide flooding, is not very bright for many locations along the East and Gulf coasts. Some areas that currently see about 10 flood events per year (such as Norfolk, VA) are projected to see as many as 280 by mid-century, and that is under a conservative sea level rise scenario. This does not bode well for the millions of people who live in coastal communities, to say the least.

Future sea level rise scenarios offer a range of risks we must prepare for

As Sweet mentioned, sea levels have been rising faster since the end of the last century, and NOAA recently released a new set of sea level rise projections that will be used in the next National Climate Assessment (for a full review of the current sea level rise science, go here).

The new projections account for faster, more extensive loss from both the Arctic and Antarctic ice sheets, which is becoming increasingly plausible. And while we do not know how much of that ice will melt in the coming years or decades, we know that some WILL melt. We must prepare for the full range of plausible future sea level rise.

Preparedness is at the heart of risk reduction. Lives and property can be saved with effective preparedness and pre-disaster mitigation measures. To prepare as best we can, we need the science coming from NOAA and the National Aeronautics and Space Administration (NASA), and we need the programs of the Federal Emergency Management Agency (FEMA) (more on these agencies, their programs, and their budgets here, here, and here). We cannot afford to lose funding for these agencies; they are the ones minding our safety when it comes to sea level rise and other climate-related impacts. Can you say hurricane season?

While we cannot stop sea level rise right now – we are committed to a certain amount of rise simply because of the emissions already in the atmosphere – we can slow down the rate of rising in future years by reducing emissions of global warming gases now.

Internationally, the Paris Agreement is our big hope.  Here in the US it’s a different story: as the Trump administration seeks to leave that agreement, actions by state and local governments speak even louder – and will have results faster – than federal intentions. Let’s keep the momentum and stay safe, my friends!

The Summer of Floods: King Tides in June, July, August…

Here in San Francisco, we're celebrating the 50th anniversary of the Summer of Love with art installations, walking tours, and magic buses. But artists in Charleston, South Carolina, are documenting a very different sort of season: a Summer of Floods.

The National Weather Service has issued ten coastal flood advisories for Charleston, South Carolina, in the last three months and is likely gearing up to issue another. Just a few weeks after king tides sloshed their way through coastal towns from Salem to Honolulu, NOAA is projecting more tidal flooding late this week for every coastal region with the exception of the Gulf. And South Carolina is expecting king tides for nine of this year’s twelve months.

Tidal flooding has become more frequent

Once a rarity, since the 1950s the frequency of tidal flooding has increased 10-fold in places such as Atlantic City, Annapolis, and Baltimore and 5-fold in Charleston, Norfolk, and Philadelphia.

"King tide" is a colloquial term that typically refers to the highest of high tides, which occur when the moon is at its closest point to Earth and the Earth, moon, and sun align. But given how frequently such tides are occurring these days, we may need to introduce some new terms.

What to expect during king tide events

If this year’s past tidal flooding events are any guide for what’s in store over the next few days, here’s what coastal residents can expect this week:

Flooded neighborhoods.

King tide flooding Harleston Village, Charleston, South Carolina, in May 2017.

Flooding like this is widespread in South Carolina’s Lowcountry. According to the state’s official , floods like these wrap their way around churches and parking lots and send employees wading through intersections on their way to work.

Less access to waterfront parks and attractions
In places like Miami Beach, restaurant owners report seawater bubbling up through drains and flooding their businesses. And while tourists head to Waikiki for prime ocean access, the photos above suggest they’re getting more than their money’s worth.

Closed roads.

Like the groups that still gather today in Golden Gate Park, we at UCS have been beating this drum for years as king tides have become a common occurrence for coastal residents. As one of the most visible signs that our climate is changing, this week’s floods are a reminder that sea level is rising.

More king tides expected throughout summer and fall

We’d love to see your photos of king tides this week or to hear about your experiences in the comments!

Oh, and mark your calendars because king tide flooding is expected again July 22-24 and August 20-22. And then in September, we enter the typical king tide season, which lasts through the end of the year. As The Washington Post's Alexandra Petri put it so brilliantly: "The beach is coming."


An Important Step to Clean Air and More Equitable Communities in Los Angeles

By Joel Espino and Jimmy O’Dea

Tomorrow, LA Metro, which operates the second largest transit fleet in the United States, will decide what types of buses to purchase through 2030. The decision will shape Los Angeles' efforts to clean the air, fight climate change, and expand economic opportunity. We applaud the proposal put forward by Metro staff last week to transition the entire fleet to zero-emission vehicles.

LA Metro can be a leader

Today, Metro's 2,200 buses operate entirely on natural gas. While natural gas was a better option than diesel when Metro began switching fuels more than 20 years ago, it no longer deserves the "clean" branding seen on Metro's buses. Advances in technology have made electric buses a cleaner, and now viable, option. It's time for Metro to continue its leadership in fighting pollution and transition to the cleanest technology available today: electric buses powered by renewable energy.

Earlier this year, a coalition of bus riders, labor groups, and public health groups launched a campaign urging Metro to be a leader and transition to an all-electric bus fleet powered by renewable energy. A central part of this campaign is that communities most affected by poverty and pollution should be first to reap the benefits of bus electrification, such as improved air quality and more high-quality, skilled jobs. Mayor Garcetti recently urged Metro to make this transition by 2030 and just yesterday, the Los Angeles Times expressed its support for Metro’s path to zero-emission buses.

Despite years of work and improvement, Los Angeles’ air still ranks among the worst in the country. Heavy-duty vehicles like buses are a major source of air pollution.  Today, residents of communities like Wilmington or Bell Gardens, who live near highly trafficked roads and freight corridors, suffer the consequences of air pollution like increased risks of lung and heart disease and premature death.

Last fall we found that electric buses result in far lower air pollution and global warming emissions than natural gas buses. Electric buses have zero tailpipe emissions, cut global warming pollution, and create new jobs. They are better for bus riders, bus drivers, and communities with heavy traffic and severe air pollution.

Our analysis found the potential for good jobs in manufacturing of electric buses, construction of charging infrastructure, and maintenance. With the right training and hiring practices, this industry could bring an economic boost to communities most in need.

Electric buses are the cleanest

There are two types of electric buses Metro could purchase; both have significant benefits. Battery electric buses have 70 percent lower global warming emissions than natural gas buses. Fuel cell electric buses have 50 percent lower global warming emissions than natural gas buses. That includes the emissions from producing electricity and hydrogen. Both types also cut smog-forming emissions in half compared to today’s natural gas buses. As we generate more of our electricity with clean sources like solar and wind, electric buses will be even cleaner.

Electric buses also have lower life cycle emissions than the newest “low-NOx” natural gas buses fueled with biomethane from waste sites such as landfills. Capturing fugitive methane emissions from sources of waste is an important strategy in reducing California’s global warming emissions and can help displace natural gas use in vehicles, yet the limited amount of biomethane available from sources of waste could meet just 3 percent of California’s natural gas demand.  This resource should be used prudently across California’s economy.

The technology is here and ready

Electric buses fueled with hydrogen have had ranges over 200 miles for many years and battery electric buses recently passed this mark. With fewer moving parts and durable electric motors, maintenance costs are lower for electric buses. Electric buses can also accelerate and climb hills as well or better than diesel or natural gas buses.

Metro’s bus investment would boost the regional economy, including at least eight electric bus and truck manufacturers in the LA region, and spur job training in underserved communities, creating a workforce capable of accelerating electrification in other areas of transportation.

Metro can't switch to electric buses overnight, but as it retires natural gas buses it should replace each one with a clean, quiet electric bus. Nearly 20 transit agencies across the state have stepped up to the plate and begun incorporating electric buses into their fleets, many with significant, if not full, commitments to zero-emission buses. California, and its poorest and most polluted communities in particular, depend on Metro doing the same.

This blog post originally appeared as a joint op-ed at https://laopinion.com/2017/06/20/tipo-de-buses-que-comprara-la-metro-afectara-los-angeles-por-anos/.

Joel Espino is Legal Counsel for Environmental Equity at The Greenlining Institute. Jimmy O’Dea is a Vehicles Analyst in the Clean Vehicles Program at the Union of Concerned Scientists.

Wind Yesterday, Today, and Tomorrow

Young by global standards, Boston is still one of the oldest cities in the United States. It has a fascinating and well-preserved history, with monuments, museums, and plaques everywhere you look. At the same time, it is a center of research and innovation, investigating the technologies that will shape our future. (Okay, I’m biased – I do love this city.) That dichotomy, respecting the past while looking towards the future, is also the story of wind power.

For Father’s Day, I went out to the Boston Harbor Islands with my family. We had a picnic on Spectacle Island, with a great view of Boston.  The weather was perfect.

As it happens, the Tall Ships were in town. While aboard the ferry, we could see a number of the sailing ships docked along the waterfront, and more of them going in and out of the harbor.

Tall Ships. Source: www.sailboston.com.

This brought to mind a paper I had written on energy transitions in the United States. One of my observations was that the United States in 2010 used six times as much wind power per capita as it did in the Golden Age of Sail. That was a few years ago, so the numbers have changed since then. Let's take another look.

Wind in the Golden Age of Sail

Through the late 19th century, wind was a significant energy resource for the United States. Sailing ships conveyed goods and people up and down the coast and across the Atlantic Ocean. Sailing vessels took fishermen out to sea and back home again. Mechanical windmills pumped water and ground grain. Massachusetts was a hub of the shipbuilding industry, constructing naval vessels like the frigate U.S.S. Constitution and clipper ships from Donald McKay’s shipyards, as well as the fishing boats that set out from Gloucester, New Bedford, and Cape Cod.

The first US steamship appeared in 1807, and steam gradually took over a larger share of nautical propulsion. Steamships accomplished this technological transition through diffusion, starting in specific high-value niches (such as river ferries) where their advantages justified their higher cost, then spreading to more applications as their performance improved and cost declined. We see the same pattern for the spread of electric lighting, or of solar power. Elon Musk explicitly invoked this pattern of technological diffusion with Tesla’s original Master Plan, beginning in small but high-value niches and branching out.

However, sailing ships did not disappear overnight; they continued in use for decades. Some of the ships you might see at a Tall Ships event are either replicas of or inspired by “clipper ships,” designed in the 1850s to operate in one of sail’s remaining niches, fast long-distance transport of high-value cargoes such as tea or spices. Prior to the resurgence of wind power in the 1990s, wind power reached its greatest utilization in the US around 1860 (in absolute terms) or 1810 (in per capita terms).

In 1860, the U.S. population was about 31 million. The nation had about 100,000 windmills and a sailing fleet of 4.5 million tons. I calculated that the energy harnessed from wind was around 5.65 petajoules; in the units of the day they might have noted it as 2 billion horsepower-hours.

On a per capita basis, wind power contributed 67 horsepower-hours (equal to 50 kilowatt-hours, although at the time the only use of electricity was in telegraph batteries). Compared to other sources, in 1860, wind power in total was greater than power from watermills; less than that obtained from draft animals; and roughly equal to the power output from human labor or to that of coal-fueled engines (in locomotives, steamships, and factories).

Output of Mechanical Work (Motive Power) by Resource, 1780-1880. Source: O’Connor and Cleveland (2014).

Wind was not the largest source of motive power, but still a significant one that accomplished tasks other energy resources could not.

Wind today

Steam engines continued to move into more applications, until diesel engines came to dominate marine transport in the 20th century. Sailing vessels became limited to small recreational craft. Windmills for water pumping peaked around 1920 or 1930, and declined after that, although small wind turbines for electricity generation appeared in some rural areas.

Wind power, though, has made an astounding comeback in recent years. Increased deployment supported by state and federal policies led to rapidly declining costs and improved performance. Wind turbines and solar panels together provided 0.07% of US electricity in March 1997, nearly 1% in March 2007, and over 10% of US electricity in March 2017, most of that from wind.

Wind turbines on a farm. Source: www.awea.com.

In 2016, wind power generated 226,872 million kilowatt-hours of electricity. The Census Bureau estimates that the population of the US on July 4, 2016 was 323,148,587. Therefore, wind power in 2016 provided about 700 kilowatt-hours per capita. Some wind energy is still harnessed directly—like by the Tall Ships and water-pumping windmills—but most of the wind energy we use today comes from wind turbines. The per-capita wind power contribution is now about 14 times what it was in 1860.

Wind Energy Inputs to U.S. Economy, 1790-2016. Source: Author’s calculations.

I find that pretty remarkable.

Wind tomorrow

What does the future hold for wind power? Well, it won’t grow its share tenfold in the next ten years, but its continued expansion seems likely.

Many regions have successfully integrated wind power into their electricity systems at relatively low cost, utilizing a combination of forecasting, turbine controls, geographic distribution, and grid flexibility. What were once considered difficult levels of wind to incorporate are now seen as simple. Taller turbines may enable wind power to spread in the Southeast.

Offshore wind, widely used in Europe, is now (finally) on the move in this country, too.  Although some construction costs are higher, the environment allows for installation of much larger turbines that would be difficult to transport to sites on land. Larger turbines can access winds that are both stronger and more constant at higher altitude. New Bedford, a hub of the old wind industry of sailing ships, might become a hub of the new wind industry, with potential jobs in offshore wind turbine construction  (subscription required).

A strong base, smart policies, technological advances, and a skilled workforce: wind will continue to provide clean energy, jobs, rural economic development—and power for sailing. Even if some of the new sails don’t quite fit in a Tall Ships event.

The “Skysail” system can offer annual fuel savings of 10-15% for freighters. Source: www.skysails.info.

 

How Many Rides Do Lyft and Uber Give Per Day? New Data Help Cities Plan for the Future

In the span of about 7 years, app-based ride-hailing (i.e. Lyft and Uber) has gone from non-existent to ubiquitous in major metro areas. But how are these services affecting important aspects of our transportation system like congestion, public transit, and vehicle emissions?

The San Francisco County Transportation Authority (SFCTA) made a big first step last week towards answering these questions. The agency released data showing when, where, and how many rides start and end within San Francisco.

These statistics are important because passenger vehicles are the largest source of climate emissions in California, a major source of air pollution, and play a central role in our transportation system, which greatly affects social equity. If ride-hailing continues to grow, it has the potential to positively or negatively impact many aspects of transportation, including the reliability of public transit; costs of travel; extent of air pollution and climate change; safety of pedestrian and vehicular travel; and accessibility, type, and quality of jobs.

Lyft’s recent commitment to provide 1 billion miles of travel in autonomous electric vehicles powered by 100 percent renewable energy by 2025 is an encouraging step towards a positive future of app-based travel.

Some of the report’s findings are what you’d expect

Not surprisingly, the number of rides within San Francisco peaks in the heart of downtown on Friday and Saturday nights. During the week, ride-requests are at their highest during the morning and evening commutes. More rides are requested after work than before work. Interestingly, more rides are also requested as the work week progresses, #fatigue?

SFCTA developed a website to visualize when and where rides are starting and ending in San Francisco. It’s pretty cool, especially if you’re familiar with the city.

Switching from pick-up to drop-off location (see gifs) gives a rough sense of where people are traveling to and from, e.g. commuting into downtown in the morning and out of downtown in the evening. SFCTA's data doesn't link the pick-up and drop-off locations of individual rides, but the aggregate data still suggests these trends.

Other findings are less expected

The most surprising numbers from SFCTA’s report are the sheer volume of rides being given by Uber and Lyft: more than 150,000 intra-San Francisco trips per day, which is roughly 15 percent of all vehicle trips taken within the city and more than ten times the number of taxi trips.

The SFCTA study only considered trips originating and ending within San Francisco. So, there are actually many more Uber and Lyft trips being taken to or from the city.

Another interesting finding: approximately 20 percent of the miles traveled by Uber and Lyft drivers in San Francisco are without a passenger. These out-of-service miles (also known as “deadheading”) are actually lower for Uber and Lyft than taxis, which drive 40 percent of their miles without a customer. More Ubers and Lyfts on the road compared to taxis mean less distance is traveled between drop-offs and pickups.
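To make those percentages concrete, here is a minimal sketch of what they imply per passenger mile. Only the 20% and 40% out-of-service shares come from the SFCTA report; the per-passenger-mile framing is our own illustration.

```python
def deadhead_per_passenger_mile(deadhead_share):
    """Out-of-service miles driven for each mile carrying a passenger."""
    return deadhead_share / (1 - deadhead_share)

# Shares reported by SFCTA: ~20% for Uber/Lyft, ~40% for taxis.
tnc = deadhead_per_passenger_mile(0.20)   # 0.25 deadhead miles per passenger mile
taxi = deadhead_per_passenger_mile(0.40)  # ~0.67 deadhead miles per passenger mile

print(f"TNC:  {tnc:.2f} deadhead miles per passenger mile")
print(f"Taxi: {taxi:.2f} deadhead miles per passenger mile")
```

In other words, a taxi drives roughly two-thirds of a mile empty for every passenger mile, versus about a quarter of a mile for Uber and Lyft.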

What’s the big deal?

If you asked, “Don’t Uber and Lyft already have this data?” you’d be right. They do. So does the California Public Utilities Commission (PUC), which oversees transportation network companies (TNCs) – the policy term given to Uber and Lyft.

But the TNCs and PUC denied requests for data, so SFCTA partnered with Northeastern University to indirectly measure it themselves. Uber and Lyft oppose sharing data that could reveal aspects of their market share, such as where they dispatch drivers and pick up riders. Because there are only two main ride-hailing companies, either company could simply subtract its own numbers from an aggregate data set to get a sense of what the other is doing.

The companies have a competitive history, but the need for this type of data will only increase as they provide larger fractions of vehicle trips, especially if projections materialize for ride-hailing with self-driving cars. Without data, it will be difficult to justify the potential safety, mobility, and emissions benefits (or consequences) of self-driving vehicles.

It’s fair to ask whether Uber and Lyft should have to share data that isn’t necessarily required of other fleets. A notable exception is the New York City Taxi and Limousine Commission, which approved standards earlier this year requiring TNCs to report the trip information taxis were already required to share.

Even simple metrics such as the types of vehicles in a fleet (electric, hybrid, conventional), as reported by taxis in San Francisco, are important pieces of information for local governments to address the climate and air quality aspects of transportation. As the saying goes, you can’t improve something that you don’t measure.

What’s next?

SFCTA’s findings raise many questions about what types of trips TNCs are replacing. Are they getting people out of personal cars or turning pedestrians into ride-hailers? Are they eroding public transportation or making it easier for people to get to the bus, MUNI, or BART? Are people taking solo rides or sharing trips via uberPOOL or Lyft Line?

Previous studies and those underway are attempting to answer these questions. But ultimately, data like those from SFCTA are critical for transportation planners and researchers to understand the impact of ride-hailing services today and how they can be used to improve, and not hinder, how we get around in the future. Decisions like expanding roads vs. setting aside land for public spaces or how to better serve a community with public transportation all depend on knowing when, where, and how many trips we’re taking, whether by foot, bike, car, bus, or train.

Note to the Department of Energy: The Grid Has Changed

The electric grid is steadily evolving to incorporate growing levels of renewable energy, saving consumers money while maintaining reliability. However, an April memo from US Energy Secretary Rick Perry questions this progress and seeks to protect old plants from closing due to competition.

Grid reliability depends on continued investment, innovation, and modernized practices. New renewable solar and wind power plants, and the new practices that replace the old ones, are meeting the public’s electricity needs. Delaying the retirement of old plants costs consumers money. And through all these changes to the grid, the vigilant attention to reliability has not changed.

When the DOE releases the study and policy recommendations requested by Secretary Perry, it should show that the US electric system has not been diminished by change. The study should show that reliability is defined across many criteria and time windows, and that a growing diversity of resources is capable of providing reliability services. If done well, the report will reflect the industry practice of relying on a mix of generation, and acknowledge that no single metric describes the reliability of the power supply.

Renewable energy from wind and solar has grown to supply 50% or more of electricity on particular days over large areas of the United States. Individual companies and cities have set and met goals of procuring 100% of their electricity from renewable energy. These early indications of change in the electric grid demonstrate that the technology, investment, and coordinated operations needed to run the grid successfully on new terms are already in place.

The challenge for the Department of Energy is to keep up with the changes that make the grid more reliable, despite the decline in coal. Utility engineers have taken up the challenge to use existing tools, such as power system forecasting and coordinated economic scheduling of power plants, and modernize them.

In addition, rapid advances in renewable generator technology are providing surprising capabilities. A recent report by California ISO staff to its board, describing the accuracy and speed with which solar farms can provide reliability services, was filled with positive findings. This is just the latest technical assessment to demonstrate that making greater use of the reliability services available from renewable energy improves both the economics and the reliability of the grid.

How this happened and how the energy system stays reliable

Dynamic innovation and capital investment push the evolution of energy sources, especially in generating electricity. Competition between coal and hydropower factored into the famous rivalry between inventors Thomas Edison and Nikola Tesla at the start of the electricity era (circa 1890). Today, market competition from lower-priced gas, efficiency, wind, and solar is driving the decline of coal in the US and a corresponding increase in the use of these supplies. Texas leads the US in wind installations through a combination of competition and cost-effective infrastructure investment.

Growth of wind & solar in the US in recent years is greater than other technologies or fuels.

Lower prices for these energy sources, combined with technical innovations, ensure the continued reliable operation of the electric system through this growth and change. As falling costs from new competing sources of energy attract investors, construction of new power plants using new technology has been dramatic. The majority of generating capacity added in the US in each of 2014, 2015, and 2016 used either wind or solar. See the chart to the right for the rise in wind and solar installations.

The organizations responsible for reliability are fully engaged in this evolution

Grid operators, also known as “power pools,” are responsible for reliably managing this growth across multi-state regions, with some seeing these changes faster than others. The graphs below show the growth of wind in the major electricity markets from 2008 (when nationwide wind power, or wind and solar combined, surpassed 20,000 MW) to 2016.

Independent system operators that run these markets, as well as plan and operate the grid, are kept informed of these changes, and of the retirements of old plants, through various study obligations that supplement fundamental reliability standards. The Midcontinent Independent System Operator (MISO) estimates over $350 million in annual savings from planning for wind. The benefits of $3.4 billion in transmission construction in the Southwest Power Pool (SPP) from 2012 to 2014 are $240 million per year, and are expected to exceed $10 billion over 40 years.

Power pools serve two-thirds of US consumers as system operators independent of the local utilities, fostering reliability, innovation and competition. The agreement to form the PJM power pool to save money and increase reliability was front page news in 1927. Energy needs for wartime aluminum production drove the formation of SPP eight days after the United States entered World War II. Today MISO estimates annual savings of $1.8 billion from reducing needed reserves.

Reliability oversight and sharing of best practices come from national and continental-scale organizations. US reliability and interconnection standards are developed by the Federal Energy Regulatory Commission (FERC) and the North American Electric Reliability Corporation (NERC). FERC first adopted a requirement for wind contributions to grid reliability in 2003, setting a standard for “riding through” (i.e. staying connected during) disturbances that was more stringent than the one applied to nuclear plants. Since then, NERC has provided a series of reports and recommendations to guide the industry in the safe and reliable integration of renewable energy. With this oversight on reliability, the growth of renewable energy has reached some impressive records.

Records for supply from renewable energy

Regional records for use of renewable energy in a single hour. Chart: UCS.

Wind farms in the Great Plains, and all renewable generation in California, are setting new records for serving regional electricity needs. Adaptations by grid operators over the years have allowed a steady increase in renewable energy on the grid. The numbers shown here illustrate how wind and solar technology, combined with grid operators’ tools, have at times run the electric supply with 50% wind in the Great Plains and with 80% from the combination of renewable sources (including wind, solar, hydro, biopower, and geothermal) in California.

 

These records occur during hours when renewable production is high and demand is relatively low, but they provide experience for routine operations with ever-higher levels of renewables.

Grid practices keeping pace make these records possible

The industry continues to expand the innovations and tools for reliable operations with higher levels of renewable energy and lower levels of fossil fuel.

When wind farms and solar generation are distributed across large areas, the energy produced is both more predictable and steadier. This geographic diversity allows regional power pools to integrate the supply of renewable energy into the larger supply mix as weather patterns move across their region. The effect of pooling wind across a large area was noted by ERCOT’s official market monitor, who observed that wind production in June 2016 was at all times at least 3,500 MW.

Because they’re so large, power pools also create cost savings by reducing the need for generators held in reserve (used to balance supply and demand). Smoothing out these changes, adjusting for weather forecasts, and now incorporating centralized wind forecasts for the region are all best practices. The California ISO implemented wind forecasting in 2004. The regional grid operators of Texas, New York ISO (NYISO), and the Midcontinent ISO (MISO) implemented wind forecasting in 2008 and PJM did so in 2009.

State-of-the-art wind forecasting predicts the output of individual wind farms and allows grid operators to include wind farm operations in their day-ahead preparations and real-time generator dispatch systems. Grid operators continue to improve technical forecast tools, including visualization of wind conditions to improve system operators’ situational awareness and better forecasting of electrical system needs and capabilities based on forecasted wind output.

Once forecasting demonstrated significant cost savings and reliability benefits, grid operators and the wind industry adopted the best practice of expanding market-system control of dispatch and implementing wind dispatch (i.e., wind farms respond economically to instructions). In 2009, NYISO was the first to include wind offer prices and dispatch in the market system. By 2011 the markets run by MISO and PJM included similar price-based wind integration. This innovation allows ISOs to determine the most cost-effective way to address reliability issues, ensuring better utilization of wind plant output while maintaining a secure, reliable system.

In addition to new power plants and grid practices, other investment categories contribute to greater reliability and more use of renewable energy. Increased transmission allows greater sharing of energy resources within and among power pools. Utilities, independent developers, and wind farm companies continue the expansion of this most fundamental electricity infrastructure. Coordination of demand response, electric vehicle charging, and simple upgrades such as thermostats and efficient lighting reduce the stress on the grid, directly and immediately improving reliability.

The utility industry has great potential to improve this sort of interaction with consumers, as well as the game-changing possibilities of battery energy storage.

Methods need to continue to evolve and be adopted

The evolution of modern electric grids has reached the point where old coal plants are retiring and grid management proceeds without any coal generation; both New England and Britain (old England) have reached this point. Continued progress with new technologies requires support for research, demonstration, and deployment. New methods of understanding requirements, and the solutions to them, come from open and honest dialogue.

Competition for the future cannot succeed when regulatory agencies champion a backward-looking approach. The recommendations anticipated shortly from the DOE should recognize the realities of changes already made in electric grid operations, as well as the capacity to make greater use of new technologies already demonstrated.

Heat Waves and Wildfires Signal Warnings about Climate Change (and Budget Cuts)

Southern California and the Southwest US are experiencing a significant heat wave this week. More than 29 million people in California alone are under an excessive heat warning or heat advisory.

If you live in areas affected by this heat wave, please follow health advisories to stay cool, stay hydrated, and stay safe. And watch for wildfire advisories while you’re at it.

Heat waves are dangerous

Extreme heat can cause heat exhaustion, heat stroke, or even death. Symptoms to watch for include dizziness, headaches, nausea, muscle cramps, and loss of consciousness. Be especially vigilant for children, the elderly, those with pre-existing health conditions, those who work or play outdoors, and your pets. (For more on how to stay safe in extreme heat, refer to guidance from the CDC.)

Unfortunately, climate change is increasing the frequency and severity of heat waves. According to the EPA:

“Nationwide, unusually hot summer days (highs) have become more common over the last few decades. The occurrence of unusually hot summer nights (lows) has increased at an even faster rate. This trend indicates less “cooling off” at night.”

Furthermore, heat waves that arrive earlier in the summer can have worse health impacts because people’s bodies have had less time to adjust to the warm weather. And the longer a heat wave lasts, the more severe the cumulative effects can be.

On the other side of the world, India has already experienced a serious early heat wave in April, and recent research shows that even a small increase in global average temperature (which is very likely with climate change) is projected to cause a huge increase in heat-related deaths there.

Hotter, drier conditions also raise risks of wildfires

The wildfire season in the Southwest is also underway. Many of the same areas experiencing this week’s heat wave—including parts of Arizona, New Mexico, and California—are also forecast to have an above-normal wildfire risk this month (see map).

That’s no coincidence: in many parts of the world hotter, drier conditions are also contributing to growing risks of wildfires.

Arizona currently has more than 12 active wildfires and the state has already seen dozens of fires this year. California has also seen a number of wildfires over the past month; officials warn that the risk continues to be high. Ironically, winter precipitation in these states has helped provide more fuel for fires, stimulating the growth of brush and other vegetation that is now drying out in the hot temperatures.

Halfway across the world, Portugal is experiencing terrible wildfires, where more than 60 people tragically lost their lives this past weekend after getting trapped by raging fires. The country is, of course, focused on the emergency response and is in a state of mourning. Unfortunately, Portugal has been experiencing bad wildfire seasons year after year. Earlier this year, Chile also experienced devastating wildfires.

Drought and extreme heat are important contributing factors in all these cases, and faulty forest and land management policies are frequently implicated as well.

Managing the risks of wildfires

Wildfires are inevitably a consequence of several factors, including the weather, winds, and the condition of forests and underbrush, plus proximate causes such as lightning or human activities. Here in the US and in many parts of the world, climate change is making hotter, drier conditions more likely and worsening the risks of wildfire.

Development in wildfire-prone areas also exposes more people and property to harm. The smoke from wildfires can also damage the health of people living hundreds of miles away—recent research shows that the air pollution from wildfires is significantly higher than previously understood.

To manage wildfire risks and impacts, we will have to work on solutions on all these fronts.

Cutting the Forest Service budget is a bad idea

Given what we know about these growing wildfire risks and the need to take robust action to protect people and healthy forests, the Trump administration’s proposed cuts to the US Forest Service budget are a particularly bad idea. For instance, the president’s FY 2018 budget proposes to cut funding for forest health management by about $9 million relative to FY2017 (more specifically, relative to the FY 2017 annualized Continuing Resolution level), which would reduce the resources available to cope with disease and pest outbreaks that kill trees. The hazardous fuels management budget would take a hit of $20 million—meaning that there would be less money to manage or thin forests to reduce wildfire risks near where people live. The budget also proposes to cut funding for volunteer fire departments.

Last week Tom Tidwell, Chief of the USDA Forest Service, testified about the budget before the Senate Committee on Energy and Natural Resources. At the hearing, there was bipartisan push-back to these cuts. Senator Murkowski (R-Alaska) said:

“While some of the agency’s recommended budget cuts are worth considering, others, like the proposed cuts to recreation programs, are concerning. Some could impact critical forest management activities, like firefighting and hazardous fuels reduction. And some appear to contradict other proposals in the budget, so we will need to review all of these very carefully, as we work on our budget for the next fiscal year.”

And Senator Cantwell (D-WA) said:

“President Trump’s proposal reduces funding for fighting wildfires. This budget proposes a decrease of almost $300 million for fighting wildfires and another decrease of $50 million for preventing wildfires.”

A way forward on wildfire and climate policy?

Senators Murkowski and Cantwell have a long history of working together to find solutions for improving forest management and fixing wildfire budgeting.

I hope Congress will reject the harmful budget cuts proposed by the Trump administration, and step up and pass legislation to address these critical issues as soon as possible. People who live in wildfire-prone areas—whether in California, Arizona, Alaska, or Georgia—cannot afford further delays or back-sliding.

We also have to continue to work with the global community to limit the heat-trapping emissions that are driving climate change and worsening the risks of deadly heat waves and wildfires worldwide—despite the Trump administration’s stance on the Paris Climate Agreement.

 

A Priority List for Trump’s New FEMA Chief

This evening the Senate is expected to confirm the new Federal Emergency Management Agency (FEMA) Administrator, Brock Long, who will serve as the principal advisor on emergency management under President Trump and John F. Kelly, the newly appointed Secretary of the Department of Homeland Security. In my previous blog, I mentioned 5 reasons why this was welcome news.

Once Mr. Long is in place, here’s a priority list to ensure he hits the ground running.

#1 Commit to Incorporating the Latest Climate Change and Future Conditions into All of FEMA’s Actions

Losses from natural disasters (hurricanes, wildfires, flooding, earthquakes, etc.) are on the rise due to growing populations and urban development in high-hazard areas, as well as climate change, which is increasing the frequency and intensity of extreme weather events. Thanks to NOAA, we know that in the first quarter of 2017 the U.S. had 5 weather and climate disasters with over $1 billion in damages, while 2016 had 15 “billion-dollar” natural disaster events (each event either reached or exceeded $1 billion in losses) and was a record year for inland flooding disasters.

While President Trump signaled to the world his disregard for science, climate change, and making our planet a cleaner and safer place by pulling out of the Paris Agreement on June 1, there is still a lot of work our federal agencies can do to reduce the impacts of natural disasters, including those worsened by climate change.

  • At the national level, Mr. Long should implement the Technical Mapping Advisory Committee’s (TMAC) Future Conditions Risk Assessment and Modeling recommendations to FEMA on how to utilize and incorporate the best available climate science and methodology to assess possible future flood risk. Mr. Long must also defend FEMA’s budget to Congress to make this possible.
  • At the state level, Mr. Long ought to play a leadership role in ensuring that states comply with FEMA’s policy to update their State Hazard Mitigation Plans (SHMPs) to consider future conditions, including the projected effects of climate change, in addition to substantially improving these plans. States must update their hazard mitigation plans every 5 years to be eligible to receive federal funding for pre-disaster mitigation (PDM). Recent reviews of the quality of these plans (here and here) find that most states need to substantially improve their plans, particularly landlocked states. FEMA is responsible for providing both “technical assistance and reviewing state activities, plans and programs to ensure mitigation commitments are fulfilled” and does so on an annual basis (for more information see the State Mitigation Plan Review Guide). Mr. Long should work with FEMA staff to encourage states to substantially improve their SHMPs, which will help to better coordinate state-level hazard mitigation actions and safeguard communities.
  • At the community level, Mr. Long can continue FEMA’s leadership on promoting community resilience through building on and expanding upon FEMA’s smart climate adaptation efforts like using flood-resilient design building codes, maintaining natural and beneficial functions of floodplains, investing in more resilient infrastructure, engaging in mitigation planning and strategies to increase community resilience, and including environmental analysis in the Benefit-Cost Analysis (BCA) Tool when considering mitigation activities.
#2 Defend FEMA’s budget

Jamestown, CO, October 9, 2013 – Deane Criswell, FEMA Deputy FCO, and Dan Alexander, FEMA Region 8 Deputy Regional Coordinator, look at a map showing the course of the flood waters in and around the Jamestown, CO area. FEMA is working with local, state, volunteer and other federal agencies to provide assistance to residents affected by flooding. Photo by Patsy Lynch/FEMA

Mr. Long should defend FEMA’s budget and champion the need for robust funding for FEMA’s PDM grant program as well as flood risk mapping. The Trump administration’s FY18 budget proposal would obliterate both programs. As my colleague said in her blog, these cuts would seriously undermine our nation’s ability to prepare for and recover from disasters, and put the safety of Americans at risk.

It remains unclear whether Mr. Long will adequately defend the FEMA budget.

During the June 7 Senate Homeland Security & Governmental Affairs Committee hearing, Senator McCaskill (MO) spoke to the impacts of the recent flooding on many counties in Missouri and then asked Mr. Long whether he “was concerned about the $600 million cut in FEMA’s budget?” Long replied that he supports the President’s budget. However, he also said he would work with FEMA to make sure the agency can meet its needs.

While Mr. Long’s remarks may leave the impression that he will have an open dialog on what those needs may be, families, communities, and a broad range of sectors will want a more definitive answer that he will defend FEMA’s budget, particularly the mapping and pre-disaster mitigation programs. Funds in the pre-disaster mitigation program can, for example, go toward helping communities buy out houses that have been repeatedly flooded in high-risk areas and preserve the land as permanent open space, reducing future costs.

The Pre-disaster Mitigation (PDM) Grant Program, authorized under the Stafford Act, is vital to communities and state and local governments as it provides them with funding to implement measures before a disaster strikes instead of afterwards.  A Congressional Budget Office (CBO) study found that the majority of grants and funds go towards flood-risk mitigation measures (vs. planning or other risks such as earthquakes, wind, etc.) and that investing in mitigation before a disaster event helps to reduce future losses. Under President Trump’s budget proposal, this program would be funded at $39,016,000, a 61% cut compared to the FY17 continuing resolution levels.
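As a back-of-the-envelope check of that 61% figure: the FY17 continuing-resolution baseline of roughly $100 million below is an assumption inferred from the article's numbers, not stated in the text.

```python
# Assumed FY17 continuing-resolution level for the PDM program
# (inferred from the 61% figure; not stated in the article).
fy17_pdm = 100_000_000
fy18_proposed = 39_016_000  # figure quoted from the president's budget proposal

cut = 1 - fy18_proposed / fy17_pdm
print(f"Proposed cut: {cut:.0%}")  # consistent with the 61% cut quoted above
```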

What is particularly disheartening about this budget cut is that the PDM grant program is already underfunded: more communities apply for funding than can be funded under the allocated amounts. The Government Accountability Office (GAO) found that post-disaster assistance can be a reactionary and fragmented approach; from fiscal years 2011-2014, FEMA allocated a whopping $3.2 billion for post-disaster hazard mitigation under the Hazard Mitigation Grant Program (HMGP) and approximately $222 million for the PDM grant program. A more recent review of pre- and post-disaster funding by Kousky and Shabman finds that almost 90% of FEMA funding on flood risk reduction comes in the aftermath of a big flood, and recommends an increase in funding for pre-flood risk reduction program budgets.

Mr. Long should also make the case for investing in the flood hazard mapping and risk analysis program, which the Trump administration’s FY18 budget proposal zeros out. Accurate flood risk maps are vital for communities to assess their risks and take action to reduce them. For FY 2017, Congress appropriated $178 million for flood mapping, compared to the $400 million per year authorized in 2012 by the Biggert-Waters legislation. The Association of State Floodplain Managers (ASFPM), in its Flood Mapping for the Nation report, estimated the cost of remaining unmet mapping needs at $4.5 billion to $7.5 billion; at a spending level of $400 million per year, FEMA could update the maps in 10-11 years.

 #3 Champion existing pre-disaster mitigation policies and robust coordination across federal agencies and state, local and tribal governments: FEMA’s coordination role is a critical one.

Belle Harbor, N.Y., Aug. 17, 2015 – John Covell, NY Sandy Recovery Office Director (3rd from left), and NYC DOT Commissioner Polly Trottenberg (2nd from right), inspect the sand dune and baffle walls built to protect the neighborhood from future storms. They were there with local politicians and neighborhood groups for the groundbreaking ceremony of the FEMA-funded street reconstruction project that will address damage caused by Hurricane Sandy. K.C. Wilsey/FEMA

Mr. Long should champion commonsense pre-disaster mitigation strategies and collaborations that are already underway by:

  • Ensuring that the commonsense Federal Flood Risk Management Standard (FFRMS), which requires all federally funded infrastructure (such as hospitals, roads, and transit systems) in flood-prone areas to be constructed to better withstand the impacts of flooding, moves forward. A Pew poll found that a whopping 82% of Americans polled support such a requirement for the construction of new infrastructure, as well as for repairing and rebuilding structures damaged by flooding. This support shouldn’t really be a surprise, as it aligns well with what states and local governments are already doing; here’s a list of participating communities by state.
  • Moving rulemaking on the Public Assistance (PA) Deductible swiftly forward to give states incentives to increase their investments in resilience to natural disasters and to reduce the burden on federal taxpayers after a natural disaster. The PA program provides funding for local, state, and tribal governments to help communities recover from major disasters. We provided extensive comments in support of, and to strengthen, FEMA’s proposal for such a deductible. The reason this policy is innovative is that it gives states the option to reduce their deductible through credits earned for qualifying statewide mitigation measures that increase resilience and ultimately lower the costs of future disasters.
  • Furthering FEMA’s progress, in concert with the federal inter-agency Mitigation Flood Leadership Advisory Group (MitFLG), on a National Mitigation Investment Strategy (NMIS). After Hurricane Sandy, the GAO in its 2015 report identified a need for a coordinated, federal government-wide investment strategy for resilience and mitigation that reduces the nation’s exposure to future losses from disasters. The NMIS is a great opportunity to help the federal family plan and justify budgets and resources that invest in mitigation measures before a disaster happens.
#4 Advocate for substantial NFIP reform to increase climate resilience and safeguard communities

This is a busy time for Congress, as each side of the aisle is working on how best to reauthorize the National Flood Insurance Program (NFIP) before its expiration in September 2017. Congressional leaders are working in both the House (see the House Financial Services Committee memorandum that includes 7 different draft bills and would reauthorize the NFIP for 5 years) and the Senate (see the “SAFE NFIP Act” of 2017, which would reauthorize the NFIP for 6 years, and the Cassidy-Gillibrand FIAS Act of 2017, which would reauthorize the NFIP for 10 years).

Previous NFIP reforms occurred in 2014, 2012, and 2004 and yet many fixes and improvements are very much needed.

Mr. Long must play a leadership role in helping Congressional leaders draft an NFIP reauthorization bill that is substantially improved to increase climate resilience and safeguard communities.

Time to hit the ground running…

We are hopeful that Mr. Long will commit to strong leadership on these funding and policy initiatives and that he will demonstrate results. As hurricane season gets into full swing and the costs of more frequent and intense natural disasters grow, we look forward to working with Mr. Long to move this priority list forward, checking each of the “to do” boxes and ultimately creating a more climate-ready, resilient nation.

 
