Combined UCS Blogs

Oil and Gaslighting: The American Petroleum Institute Misses the Mark on Environmental Justice

UCS Blog - The Equation (text only) -

Last month, the American Petroleum Institute (API) made a feeble attempt at refuting the findings of the latest report from the National Association for the Advancement of Colored People (NAACP) and the Clean Air Task Force, “Fumes Across the Fence-Line: The Health Impacts of Air Pollution from Oil & Gas Facilities on African American Communities.” The report highlights the disproportionate risk of health problems facing Black communities in proximity to oil and gas facilities.

Specifically, the study found that more than one million Black people live within half a mile of natural gas facilities and in counties where the cancer risks from natural gas facility emissions are higher than the Environmental Protection Agency’s level of concern. The analysis considers risks associated with oil and gas facilities, while recognizing the various other sources of pollution with which Black communities are burdened. In a post on the API site, Uni Blake, a toxicologist by training and current scientific adviser to API, attempted to discredit the work of environmental justice groups and to diminish the environmental health issues facing communities of color.

Objectivity, that old chestnut

Ms. Blake used the age-old qualifier of “I am an objective scientist” to denigrate the very real problems facing Black communities. She claims that “…the paper fails to demonstrate a causal relationship between natural gas activity and the health disparities, reported or predicted [my emphasis], within the African American community.” In other words, objectivity for API means refusing to acknowledge the importance of traditional knowledge from local communities in developing research questions. “Objectivity” to API means instilling doubt about links between pollution from industry operations and negative health impacts. This is not new for API, which has for decades been in the business of spreading disinformation on science, but it is still appalling.

Racism in environmental health and scientific methodology

Racism in scientific research has always been an issue. Prejudices abound in research questions and methodology, and even the way results are interpreted. As the NAACP astutely captured in their report, “The nature of the vulnerability of African American and other person of color fence-line communities is intersectional—subject to connected systems of discrimination based on social categorizations such as race, gender, class, etc.”

Ms. Blake is clearly not coming from a place of understanding of the historical and current inequities placed on Black Americans, which is inexcusable since environmental injustices have long been documented. An extensive and expanding body of scientific evidence finds that people of color and those living in poverty are located more often in communities—termed environmental justice communities or overburdened communities—that are exposed to disproportionately higher levels of environmental pollution than Whites or people not living in poverty (See here, here, here, here, *pauses to take a breath* here, here, here, here, here, and even here).

Ms. Blake’s piece lacks actual evidence to back her claims. Here are a couple of her arguments, followed by my comments:

  • “…attacking our industry is the wrong approach and detracts from the real work that should be done to reduce disparately high rates of disease among African Americans.”

Leslie Fleischman, a Clean Air Task Force analyst and study co-author, clarified the goal of the report: “The data in our report looks at the cancer risk and health impacts of ozone smog among the population and so, if that population is more vulnerable because of these factors, then it is even more important to address aggravating factors that are easily avoidable like controlling unnecessary leaks from oil and gas infrastructure.” It would seem that API is detracting from the purpose of the study, mischaracterizing what was demonstrated by the NAACP study.

  • “Rather, scholarly research attributes those health disparities to other factors that have nothing to do with natural gas and oil operations—such as genetics, indoor allergens and unequal access to preventative care.”

All I hear is: “Well, if you changed your lifestyle, had more money, and didn’t have such weak genes, you wouldn’t have health problems! Our operation has nothing to do with your inability to handle the hazardous air pollutants.” The report she linked to maintains that “Asthma has a strong genetic component, although for this to be manifest interaction with environmental factors [emphasis mine] must occur.” Somehow, she interpreted this to mean “your house is too dusty and you have mold, these health problems are your fault alone,” thereby ignoring the oil and gas industry’s contributions to the substandard environment people are then forced to endure, and putting the onus on the very communities she recognizes as having “socio-economic factors that contribute to the disparities” (e.g. lower income, less access to health care).

As for preventative care, here’s a thought—oil and gas facilities need to quit finding loopholes and touting their successes meeting the bare minimum standards (not to mention constantly pushing for less stringent rules or delaying them). From that same study, “Exposure to airborne allergens and other irritants [emphasis mine] both triggers asthma attacks and is associated with the development of chronic asthma in infants.” Why not investigate some of those other irritants, API?

Same old stuff, different day

This tactic of diverting attention from the real issue is not new to us in environmental justice research and advocacy. We’ve even seen it recently with pushback from the Delaware Department of Health on our recent collaborative report. In October the Union of Concerned Scientists—working closely with the Environmental Justice Health Alliance, Delaware Concerned Residents for Environmental Justice, Community Housing and Empowerment Connections Inc., and Coming Clean—released a report that demonstrated the potential link between environmental pollution and health impacts in Delaware’s industrial corridor. We analyzed Environmental Protection Agency (EPA) data for risk of cancer and respiratory illnesses stemming from toxic outdoor air pollution, as well as proximity of communities to industrial facilities that pose a high risk of a major chemical release or catastrophic incident.

In November, Karyl Rattay from the Delaware Division of Public Health wrote an op-ed personally attacking UCS and the report findings. She said, “The most common risk factors for cancers are related to lifestyle behaviors (e.g. tobacco use) and genetics.” So, lifestyle choices carry more weight than environmental factors when determining cancer risks? Not only is this argument a baseless racist victim-blaming line, but it shows that Dr. Rattay failed to even understand our study.  The study, like the NAACP study, focused solely on environmental risks to communities. Of course, we all know that our health is determined by a wide range of factors. That’s why it is so important to study the contribution of environmental exposures. Otherwise, such risks are often (and have historically been) overlooked.

I urge Ms. Blake and API to stop blaming the communities living in the danger zone. It’s time for the oil and gas industry to take accountability for their polluting ways and honor their commitment to protect the health and safety of the communities where they operate.

A Toxic Nomination Hangs in the Balance: Who Will Stand Up to Finally Topple Michael Dourson?

UCS Blog - The Equation (text only) -


Three weeks ago, North Carolina’s Republican senators, Richard Burr and Thom Tillis, announced their opposition to the nomination of Michael Dourson to run the office of chemical safety in the Environmental Protection Agency. Only one more vote is needed to doom his nomination, assuming unified opposition from all 48 Democrats and Independents.

The question is, who will have the courage to step forward next?

It should take no courage at all, if science and public health matter. Dourson is already in the EPA, serving as an adviser to Administrator Scott Pruitt. But, given Dourson’s outrageous record of working to undermine science-based standards for toxic chemicals on behalf of the chemical industry, he is clearly unfit to lead the office overseeing chemical safety at the federal level.

Belittling the health effects of dangerous chemicals

Dourson’s private research firm has represented companies such as Dow, Monsanto, and PPG Industries, and has had some research funded by Koch Industries.

Michael Dourson helped set a state safety standard for the chemical PFOA 2,000 times less strict than the level deemed safe by the EPA. Photo: pfoaprojectny

He and his firm have routinely judged chemicals to be safe at levels hundreds of times greater than the current standards issued by the EPA. Among those chemicals whose health effects he has tried to belittle is perfluorooctanoic acid (PFOA), which is used in the manufacture of nonstick cookware such as Teflon and stain-resistant household products such as carpets. Dourson helped the state of West Virginia set a safety standard for the chemical 2,000 times less strict than the level deemed safe by the EPA.

That decision alone threatens the health of many Americans. In 2012, research by scientists at Emory University found workers at a West Virginia DuPont PFOA plant were at roughly three times the risk of dying from mesothelioma or chronic kidney disease as other DuPont workers, and faced similarly elevated risks for kidney cancer and other non-cancer kidney diseases. A more recent study, published in the International Journal of Hygiene and Environmental Health linked reductions in exposure to PFOA across the country to a sharp decline in pregnancy-related problems including low-birth-weight babies.

In North Carolina, an as-yet-unregulated chemical meant to replace PFOA as a non-sticking agent, Gen X, has already been found at significant levels in the Cape Fear River. And the state is still reeling from nearly 1 million people being exposed to drinking water at Camp Lejeune that was contaminated with chemicals such as benzene, vinyl chloride, and trichloroethylene (TCE) from the 1950s through the 1980s. The Obama administration established a $2.2 billion disability compensation program for Camp Lejeune veterans suffering from cancer.

Serious concerns from North Carolina

Especially troubling: if confirmed, Dourson would be responsible for overseeing implementation of the Toxic Substances Control Act as updated in 2016. In its final months, the Obama administration selected the first 10 chemicals to be reviewed under the new act for their “potential for high hazard.” Of the 10, Dourson has claimed in research that several were safe at levels far exceeding the science-based standards currently established by the EPA. They include solvents linked to cancer such as 1,4-dioxane, 1-bromopropane, and TCE, the last of which was found in the water contamination at Camp Lejeune.

The Senate Environment and Public Works Committee advanced Dourson’s nomination to the full Senate in late October on a party-line 11-10 vote. But the nominee’s record was too troubling for Burr and Tillis, despite the fact that both voted to confirm EPA Administrator Scott Pruitt. Burr said of Dourson in a statement, “With his record and our state’s history of contamination at Camp Lejeune as well as the current Gen X water issues in Wilmington, I am not confident he is the best choice for our country.”

Tillis’s office seconded that with a statement saying, “Senator Tillis still has serious concerns about his record and cannot support his nomination.”

Issues of great importance in Maine

In the immediate aftermath of Burr’s and Tillis’s rejection of Dourson, it seemed that Maine Senator Susan Collins might quickly follow suit. She said, “I certainly share the concerns that have been raised by Senator Burr and Senator Tillis. I think it’s safe to say that I am leaning against him.”

Collins has said nothing since then. Her office did not respond to repeated requests this week from the Union of Concerned Scientists on her latest position. And Dourson’s nomination stands in limbo, presumably as the Republican leadership worries that they may not have the votes in the full Senate to confirm him. In theory, Collins’ concerns should mirror those of Burr and Tillis because Maine has dealt with its share of water and soil pollution at military bases such as the former Brunswick Naval Station and Loring Air Force Base, both Superfund sites. She has also been active in bipartisan efforts to deal with cross-state air pollution.

Collins was the only Republican to vote against Pruitt’s nomination to run the EPA. Pruitt, who repeatedly sued the EPA on behalf of industry as attorney general of Oklahoma, is aggressively attempting to relax chemical regulations and reverse Obama-era rules such as the Clean Power Plan. The EPA has moved to exclude products containing PFOA from being studied for their lasting impacts in the environment, and it has refused to ban the pesticide chlorpyrifos, which has been linked to damage to the developing brains of fetuses and young children.

When she announced her opposition to Pruitt, Collins said, “I have significant concerns that Mr. Pruitt has actively opposed and sued EPA on numerous issues that are of great importance to the state of Maine, including mercury controls for coal-fired power plants and efforts to reduce cross-state air pollution and greenhouse gas emissions. His actions leave me with considerable doubts about whether his vision for the EPA is consistent with the Agency’s critical mission to protect human health and the environment.”

If Collins truly maintains those concerns, she surely would not want to augment the problems of Pruitt’s already disgraceful tenure by supporting Dourson. But even if she for some reason shies away from a no vote, there are many other Republican senators whose states also have military installations with rampant pollution affecting adjacent communities.

Many more Republican senators should be unnerved

With Camp Lejeune as a haunting example of military pollution of its own soldiers and adjacent communities, the US armed forces are in the midst of investigating potential water contamination at nearly 400 such active and shuttered sites. That fact should unnerve many more Republicans, even those who generally support Pruitt’s actions. According to a Politico report three weeks ago, Senators Jeff Flake and John McCain of Arizona, Pat Toomey of Pennsylvania, and Bob Corker of Tennessee were noncommittal about supporting Dourson’s nomination.

Toomey’s office released a statement also reported by the Bucks County Courier Times saying he “remains concerned about the PFOA issue” in towns next to closed military bases in the Philadelphia area, where compounds from firefighting foams may have leached into drinking water sources. Elevated levels of pancreatic cancer have been found in the area.

With so much concern about elevated levels of cancer around the nation linked to water pollution, this is not the time to put someone in charge who made a career out of downplaying the risks of chemicals. It is bad enough that Dourson is already at EPA, advising Pruitt. But that remains a long way from actually having his hand on the pen that can help sign away people’s safety.

He should never hold that pen.

Concerned? 

Call your senator today and ask him or her to oppose the confirmation of Michael Dourson!

The Future of Solar is in the President’s Hands. It *Should* Be an Easy Call

UCS Blog - The Equation (text only) -

Installing solar panels in PA. Photo: used with permission from publicsource.org

The saga of the would-be solar tariffs that just about nobody wants is continuing, and I can’t help but be struck by the disconnect between some of the possible outcomes and the administration’s purported interest in rational energy development for America. If President Trump believes what he says, deciding not to impose major tariffs shouldn’t be a tough decision.

Here’s the thing: in March 2017, the president issued an executive order about “undue burdens on energy development,” which said (emphasis added) that it was:


…in the national interest to promote clean and safe development of our Nation’s vast energy resources, while at the same time avoiding regulatory burdens that unnecessarily encumber energy production, constrain economic growth, and prevent job creation.

Encumbering, constraining, preventing. Remember those verbs as we go through some of the key facts of this case.

The players

The trade case, brought by two US solar panel manufacturers that are on the rocks, or whose foreign parents are, involves a little-used (and failure-prone) provision of US trade law. And it has met with almost universal rejection, from a whole host of industry, political, security, and conservative and really conservative voices (Sean Hannity, anyone?).

Even the US International Trade Commission (USITC) tasked with making recommendations in response to the petition couldn’t agree, with the four commissioners coming up with three different proposals.

As we said at the time, on the one hand it was good that the USITC recommendations weren’t as drastic as what the petitioners had asked for. On the other hand, anything that slows down our solar progress is bad news for America.

The (pre-Trump) progress

Solar has been on an incredible trajectory for years now, producing energy, cutting pollution, increasing energy security, and helping homes and businesses. The first nine months of 2017, for example, saw solar producing 47% more electricity than in the same period of 2016, with the biggest gains among the top 10 states for solar generation being in Georgia, Texas, and Utah.

Solar has also been an incredible job-creating machine. Some 260,000 people worked in the solar industry by the end of 2016, almost 2.5 times 2011’s solar job count. One in every 50 new American jobs last year was created by the solar industry. And those have been in different pieces of the industry—R&D, manufacturing, sales, project development, finance, installation—and all across the country.

The problem and presaging


Some of those gains have taken place during the Trump presidency, and maybe he can rationalize taking credit for them by pointing out the fact that he at least didn’t stop those good things from happening.

That benign neglect may be about to change, though, and we’re already seeing the effects of the uncertainty that the president’s rhetoric around issues of solar and trade has created.

The trade case has continued. While not part of the specified process for this type of proceeding, the White House invited the public to submit comments to the US trade representative, and recently held a public hearing.

The next deadline is January 26, the end of the period for President Trump to make up his mind about the USITC recommendations—accepting one of the sets of proposals, doing something else, or rejecting the idea of tariffs and quotas.

In the meantime, the effects are already hitting: Utility-scale solar costs had dropped below $1 per watt for the first time in history earlier this year. Now those costs have climbed back above that mark as developers have scrambled to get their hands on modules ahead of whatever’s coming.

Large-scale solar projects are faltering (as in Texas) because of the inability of developers and customers to absorb the risk of substantially higher solar costs. That’s investment in projects on American soil, on hold.

But those setbacks could be just a taste of what’s to come.

The point: Encumbering, constraining, preventing

That brings us back to the March executive order, which boldly professed an intention to do away with burdens holding back US industry, and was decidedly anti-interventionist (in the regulatory sense).

And yet here we are, a few short months later, talking about doing that exact thing—messing with the market, and going against our national interests. Encumbering energy production by driving up the costs of the cells and modules that have powered so much growth. Constraining economic growth by making it harder for American homes and businesses and utilities to say yes to solar. Preventing job creation—even causing job losses—by shrinking the market for what our nation’s vibrant solar industry has been offering so successfully.


The pain

While provisions in the tax bill being worked out in Congress would do no good for renewables, the president’s actions could have much more direct impacts on American pricing and competitiveness. A lot of smart people are pointing out that any bump-up in US solar module manufacturing jobs will be way more than offset by job losses elsewhere in the industry, including elsewhere in solar manufacturing.

If the president chooses to ignore the many voices clamoring for rational policy on this, if he chooses—and remember he alone can fix this—to impose major tariffs or quotas, he’s going to own their impacts.

Every net American job lost because of higher module prices will have his name on it.

Every US solar panel manufacturer that doesn’t magically take off behind his wall of protectionism will be evidence of the misguidedness of his approach.

Every small or large US solar project cancelled—jobs, investments, and all—because of the speedbumps, roadblocks, and hairpin turns on his energy vision-to-nowhere will be a Trump-branded monument to his lack of foresight and unwillingness to accept the changing realities of energy, innovation, and ingenuity.

The path

The solar industry, though, has offered President Trump a way out. They’ve proposed an import licensing fee approach that would support expanded US manufacturing while letting solar continue to soar (all else being equal).

That’s fortunate for the president, and for just about all of the rest of us. Because if he’s truly about unencumbering energy production, about removing constraints to economic growth, and stopping the prevention of job creation, killing American solar jobs would be a funny way to show it.


New Transmission Projects Will Unleash Midwestern Wind Power—And Save Billions

UCS Blog - The Equation (text only) -

As we look ahead to our clean energy future, a key piece of the puzzle is building the transmission system that will carry utility-scale renewable energy from where it’s generated to where it’s consumed. A recent study from the Mid-Continent Independent System Operator (MISO) shows that, when done right, transmission projects integrated with renewable energy can pay huge dividends. They decarbonize our electricity supply, improve efficiency, and lower costs to the tune of billions of dollars in benefits to electricity customers.

A long journey to get it right

Transmission projects can cost-effectively accelerate our clean energy transition. But they must be done right, with proper planning, stakeholder engagement, and diligent analytics.

Ensuring that long-term investments in our transmission system provide benefits to customers is a lengthy process. In 2003, MISO—which operates the electricity transmission system and wholesale electricity markets across much of the central US—began to explore a regional planning process that would complement the local planning and activities of the utilities, states, and other stakeholders operating in its territory.

After several years of scoping, planning, analysis, and legal wrangling, a set of 17 “multi-value” transmission projects (MVPs) were approved in 2011 based on their projected ability to (1) provide benefits in excess of costs, (2) improve system reliability, and (3) provide access to renewable energy to help meet state renewable energy standards.

Even six-plus years after approval, most of these projects are still under construction, since transmission projects typically take several years to move through approval, permitting, siting, and construction. But even as these projects are being developed, MISO has continued to evaluate them based on the most recent information available—making sure that they are still expected to deliver the benefits originally projected.

The most recent review, fortunately, shows that they are truly living up to their “multi-value” moniker. And like a fine wine, they seem to be getting better with time.

Latest review shows benefits increasing compared to original projections

Overall, the latest review shows a benefit-to-cost ratio ranging from 2.2 to 3.4—meaning these projects are expected to deliver economic benefits on the order of $2.20 to $3.40 for every dollar in cost. This is an increase over the original projection of a benefit-to-cost ratio of 1.8 to 3.0. The latest cost/benefit analysis equates to total net economic benefits between $12.1 and $52.6 billion over the next 20 to 40 years. The figure below shows how the multiple values projected from these projects add up.

The chart above shows the categories – and projected values – of benefits (columns one through six) that MISO considers in identifying and approving projects. When stacked up, the total benefits range from $22.1 to $74.8 billion. When total costs are also considered, net benefits (the last column on the right) to the MISO System and customers that rely on it drop to between $12 and $52.6 billion. Source: MISO
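Those two sets of numbers hang together. As a back-of-the-envelope check (my own arithmetic, not MISO’s, and assuming net benefits are simply total benefits minus total costs), you can back the implied portfolio costs out of the benefit-to-cost ratios and the net benefit range:

```python
# Back-of-the-envelope check of the MISO figures quoted above (billions of dollars).
# If ratio = benefits / costs and net = benefits - costs, then costs = net / (ratio - 1).
def implied_costs_and_benefits(net_benefits, bc_ratio):
    costs = net_benefits / (bc_ratio - 1)
    return costs, costs * bc_ratio

for net, ratio in [(12.1, 2.2), (52.6, 3.4)]:
    costs, benefits = implied_costs_and_benefits(net, ratio)
    print(f"net ${net}B at {ratio}:1 -> implied costs ~${costs:.0f}B, total benefits ~${benefits:.0f}B")
# Prints implied costs of roughly $10B and $22B, and total benefits of roughly
# $22B and $75B, in line with the $22.1 to $74.8 billion range shown in the figure.
```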

As shown in the figure, the bulk of economic benefits flowing from the MVPs are from relieving congestion and saving on fuel costs (shown in column 1). These are typically characterized as increasing “market efficiency” by opening up wholesale electricity markets to more robust competition and spreading the benefits of low-cost generation throughout the region—essentially allowing cheap energy to flow where there’s demand. Because renewable energy has zero fuel cost, enabling more of it onto the grid allows the overall system to operate more cheaply. These savings ultimately flow to ratepayers that are typically on the hook for fuel costs incurred by their utility.

And the amount of wind energy that is being brought onto the system because of these MVPs is significant. This latest review by MISO estimates that the portfolio of projects, once completed, will enable nearly 53 million megawatt-hours of renewable energy to access the system through 2031. To put that in perspective, a typical home uses about 10 megawatt-hours per year. So that’s enough energy to power 100,000 households for more than 50 years!
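If you want to check that arithmetic yourself, here it is spelled out, using the round figure of 10 megawatt-hours per household per year cited above:

```python
# The arithmetic behind the "100,000 households for more than 50 years" comparison.
renewable_mwh = 53_000_000      # MWh of renewable energy enabled through 2031 (MISO estimate)
mwh_per_home_per_year = 10      # typical household use, per the text above

home_years = renewable_mwh / mwh_per_home_per_year   # 5.3 million home-years of electricity
households = 100_000
print(f"{home_years:,.0f} home-years, or {households:,} homes for {home_years / households:.0f} years")
# -> 5,300,000 home-years, or 100,000 homes for 53 years
```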

A lot more than just electricity

When put together, the combination of well-thought-out transmission investments and renewable energy development in the Midwest also provides a host of additional social benefits, including:

  • Enhancing the diversity of resources supplying electricity to the system
  • Improving the robustness of the transmission system that decreases the likelihood of blackouts
  • Increasing the geographic diversity of wind resources, thereby improving average wind output to the system at any given time
  • Supporting the creation of thousands of jobs and billions of dollars in local investment
  • Reducing carbon emissions by 13 to 21 million tons annually

Let’s think about this for one second more…

Through proper planning, stakeholder engagement, and diligent analytics, here in the Midwest we are building a portfolio of transmission projects that will significantly lower carbon emissions, enable billions of dollars in investment and thousands of new jobs, make our electricity supply more reliable, and provide billions in economic benefits to ratepayers.

Maybe we should think about it for one more second. Or maybe we should start thinking about what’s next?


Reentry of North Korea’s Hwasong-15 Missile

UCS Blog - All Things Nuclear (text only) -

Photos of the Hwasong-15 missile North Korea launched on its November 29 test suggest it is considerably more capable than the long-range missiles it tested in July. This missile’s length and diameter appear to be about 10 percent larger than those of July’s Hwasong-14. It has a significantly larger second stage and a new engine in the first stage that appears to be much more powerful.

While we are still working through the details, this strongly implies that North Korea could use this missile to carry a nuclear warhead to cities throughout the United States. A final possible barrier people are discussing is whether Pyongyang has been able to develop a reentry vehicle that can successfully carry a warhead through the atmosphere to its target, while protecting the warhead from the very high stresses and heat of reentry.

Here are my general conclusions, which I discuss below:

  1. North Korea has not yet demonstrated a working reentry vehicle (RV) on a trajectory that its missiles would fly if used against the United States.
  2. However, there doesn’t appear to be a technical barrier to building a working RV, and doing so is not likely to be a significant challenge compared to what North Korea has already accomplished in its missile program.
  3. From its lofted tests, North Korea can learn significant information needed for this development, if it is able to collect this information.
  4. While the United States put very significant resources into developing sophisticated RVs and heatshields, as well as extensive monitoring equipment to test them, that effort was to develop highly accurate missiles, and is not indicative of the effort required by North Korea to develop an adequate RV to deliver a nuclear weapon to a city.

The Hwasong-15 RV

When the photos appeared after North Korea’s November 29 missile launch, I was particularly interested to see the reentry vehicle (RV) on the top of this missile. The RV contains the warhead and protects it on its way to the ground. It is significant that the Hwasong-15 RV is considerably wider and blunter than that on the Hwasong-14 (Fig. 1).

Fig. 1. The RVs for the Hwasong-14 (left) and Hwasong-15 (right), roughly to scale. (Source: KCNA)

This fact has several implications. The new RV can clearly accommodate a larger diameter warhead, and the warhead can sit farther forward toward the nose of the RV. This moves the center of mass forward and makes the RV more stable during reentry. (This drawing shows how the cylindrical nuclear weapon sits in the US Trident II RV, which was roughly the same size and shape as, although much heavier than, the Hwasong-15 RV.)

But the blunter nose on the Hwasong-15 RV also helps protect it from high atmospheric forces and heating during reentry. Here’s why:

As the RV enters the atmosphere, drag due to the air acts as a braking force to slow it down, and that braking force puts stress on the warhead. At the same time, much of the kinetic energy the RV loses as it slows down shows up as heating of the air around the RV. Some of that heat is transferred from the air to the RV, and therefore heats up the warhead. If the stress and/or heating are too great they can damage the RV and the warhead inside it.

A blunter RV has higher drag and slows down in the thin upper parts of the atmosphere more than does a slender RV, which continues at high speed into the thick lower parts of the atmosphere. This results in significantly less intense stress and heating on the blunter RV. In addition to that, a blunt nose creates a broad shock wave in front of the RV that also helps keep the hot air from transferring its heat to the RV.

Fig. 2. This shows two low-drag RVs being placed on a Minuteman III missile, which can carry three RVs. (Source: US Air Force).

A rough estimate shows that if the RVs had the same mass and flew on the same trajectory, the peak atmospheric forces and heating experienced by the Hwasong-14 RV in Fig. 1 would be roughly four or more times as great as that experienced by the Hwasong-15 RV; those on a modern US RV, like that on the Minuteman III missile (Fig. 2), might be 20 times as large as on the Hwasong-15 RV.
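You can see the basic scaling behind that estimate with a very simple model. The sketch below is a toy calculation, not a model of any particular missile: it assumes a point-mass RV descending on a straight line through an exponential atmosphere, with illustrative values for entry speed, angle, and scale height. The only thing that differs between the two cases is the ballistic coefficient β (defined in the “Some details” section below), standing in for a blunt, high-drag RV versus a slender, low-drag one. It tracks the peak values of ρV² and ρV³, the same proxies for drag loading and heating rate used in the figures below.

```python
import math

# Toy reentry sketch: point-mass RV, straight-line descent at a fixed angle,
# exponential atmosphere, gravity and lift ignored. All inputs are illustrative
# assumptions except the ~5 kN/m^2 ballistic coefficient, which matches the
# value assumed in this post.
G = 9.81                        # m/s^2
RHO0, H_SCALE = 1.225, 7000.0   # sea-level density (kg/m^3), atmospheric scale height (m)
V_ENTRY = 7000.0                # entry speed (m/s), illustrative of an ICBM-class reentry
GAMMA = math.radians(30)        # reentry angle below horizontal, illustrative

def peak_loads(beta, dt=0.01):
    """beta is RV weight per drag area (N/m^2); returns peak rho*V^2 and rho*V^3."""
    h, v = 120_000.0, V_ENTRY
    peak_drag = peak_heat = 0.0
    while h > 0 and v > 100:
        rho = RHO0 * math.exp(-h / H_SCALE)
        peak_drag = max(peak_drag, rho * v * v)     # proxy for drag loading
        peak_heat = max(peak_heat, rho * v ** 3)    # proxy for heating rate
        v -= rho * v * v * G / (2 * beta) * dt      # deceleration from drag
        h -= v * math.sin(GAMMA) * dt
    return peak_drag, peak_heat

blunt = peak_loads(beta=5_000)      # ~100 lb/ft^2: a blunt, high-drag RV
slender = peak_loads(beta=50_000)   # 10x higher beta: a slender, low-drag RV
print("peak drag-loading ratio (slender/blunt): %.0f" % (slender[0] / blunt[0]))
print("peak heating-rate ratio (slender/blunt): %.0f" % (slender[1] / blunt[1]))
```

In this simple model both peaks grow roughly in proportion to β, which is consistent with a slender RV like the Hwasong-14’s experiencing several times the stress and heating of the blunter Hwasong-15 design, and a modern low-drag US RV experiencing far more still.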

The tradeoff of having a blunt warhead is that when the RV travels more slowly through the atmosphere it reduces its accuracy. In order to get very high accuracy with its missiles, the United States spent a tremendous amount of effort developing highly sophisticated heatshields that could withstand the heating experienced by a slender, low-drag RV.

For North Korea, the decrease in accuracy due to a blunt RV is not particularly important. The accuracy of its long-range missiles will likely be tens of kilometers. That means that it would not use its missiles to strike small military targets, but would instead strike large targets like cities. For a large target like that, the reduction in accuracy due to a blunt RV is not significant.

What could North Korea learn from its recent test?

Press stories report US officials as saying that the reentry vehicle on North Korea’s November 29 test “had problems” and “likely broke up” during reentry. If true, this implies that the RV used on this flight could not withstand the strong drag forces as the RV reached low altitudes.

It’s worth noting that the drag forces on the RV during reentry on the lofted trajectory would be more than twice as great as they would be on a standard trajectory of 13,000 km range flown by the same missile (Fig. 3). This is because on the flatter trajectory, the RV flies through a longer path of thin air and therefore slows down more gently than on the lofted trajectory. It is therefore possible the RV might survive if flown on a standard trajectory, but North Korea has not yet demonstrated that it would.

However, given the estimated capability of the Hwasong-15 missile, North Korea appears to have the option of strengthening the RV, which would increase its mass somewhat, and still be able to deliver a warhead to long distances.

Fig. 3. This figure shows the atmospheric forces on the RV with altitude as it reenters, for the highly lofted test on November 29 (black curve) compared to the same missile flying a 13,000 km standard  trajectory (a minimum-energy trajectory, MET). The horizontal axis plots the product of the atmospheric density and square of the RV speed along its trajectory, which is proportional to the drag force on the RV. The calculations in all these figures assume a ballistic coefficient of the RV of 100 lb/ft2 (5 kN/m2). Increasing the ballistic coefficient will increase the magnitude of the forces and move the peaks to somewhat lower altitudes, but the comparative size of the curves will remain similar.

The situation is similar with heating of the RV. The last three columns of Fig. 4 compare several measures of the heating experienced by the RV on the lofted November 29 test to what would be experienced by the same RV on a 13,000 km-range missile on a standard trajectory (MET).

Fig. 4. A comparison of RV forces and heating on the November 29 test and on a 13,000 km-range trajectory, assuming both missiles have the same RV and payload. A discussion of these quantities is given in the “Details” section below.

These estimates show that the maximum heating experienced on the lofted trajectory would be about twice that on a standard trajectory, but that total heat absorbed by the RV on the two trajectories would be roughly the same. Because the heating occurs earlier on the RV on the standard trajectory than on the lofted trajectory, that heat has about 130 seconds to diffuse through the insulation of the RV to the warhead, while the heat on the lofted trajectory diffuses for about 80 seconds (Fig. 5). This somewhat longer time for “heat soak” can increase the amount of heat reaching the warhead, but North Korea would put insulation around the warhead inside the RV, and the heat transfer through insulators that North Korea should have access to is low enough that this time difference is probably not significant.

Fig. 5: This figure shows how the heating rate of the RV surface varies with time before impact on the lofted and standard trajectory. The areas under the curves are proportional to the total heat absorbed by the RV, which is only about 20% larger for the MET. The vertical axis plots the product of the atmospheric density and the cube of the RV speed along its trajectory, which is proportional to the heating rate on the RV.

Fig. 6 shows heating on the two trajectories with altitude.

Fig. 6. This figure shows the heating of the RV with altitude as it reenters.

These results show that if North Korea were able to demonstrate that its RV could survive the peak drag forces and heating on a lofted trajectory, it should also be able to survive those on a standard trajectory. As noted above, the estimated capability of the Hwasong-15 missile suggests North Korea would be able to increase the structural strength of the RV and its heat shielding and still be able to deliver a warhead to long distances.

There is still some question about what information North Korea may actually be getting from its tests. One advantage of testing on highly lofted trajectories that fall in the Sea of Japan is that the RV can presumably radio back data to antennae in North Korea for most of the flight. However, because of the curvature of the Earth, an antenna on the ground in North Korea would not be able to receive signals once the RV dropped below about 80 km altitude at a distance of 1000 km. To be able to track the missile down to low altitudes it would likely need a boat or plane in the vicinity of the reentry point.

Some details

The rate of heat transfer per area (q) is roughly proportional to ρV³, where ρ is the atmospheric density and V is the velocity of the RV through the atmosphere. Since longer range missiles reenter at higher speeds, the heating rate increases rapidly with missile range. The total heat absorbed (Q) is the integral of q over time during reentry. Similarly, forces due to atmospheric drag are proportional to ρV², and also increase rapidly with missile range.

The calculations above assume a ballistic coefficient of the RV equal to 100 lb/ft2 (5 kN/m2). The ballistic coefficient β = W/CdA (where W is the weight of the RV, Cd is its drag coefficient, and A is its cross-sectional area perpendicular to the air flow) is the combination of parameters that determines how atmospheric drag reduces the RV’s speed during reentry. The drag and heating values in the tables roughly scale with β. A large value of β means less atmospheric drag so the RV travels through the atmosphere at higher speed. That increases the accuracy of the missile but also increases the heating. The United States worked for many years to develop RVs with special coatings that allowed them to have high β and therefore high accuracy, but could also withstand the heating under these conditions.

Based on the shape of the Hwasong-15 RV, I estimate that its drag coefficient Cd is 0.35-0.4. That value gives β in the range of 100-150 lb/ft2 (5-7 kN/m2) for an RV mass of 500-750 kg. The drag coefficient of the Hwasong-14 RV is roughly 0.15.
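For readers who want to reproduce that arithmetic, here is a minimal sketch of the β = W/CdA calculation. The masses and drag coefficients are the ranges quoted above; the RV base diameter is my own rough guess for illustration, not a measured value.

```python
import math

# Ballistic coefficient beta = W / (Cd * A): W = RV weight, Cd = drag coefficient,
# A = frontal (cross-sectional) area. Unit conversion: 1 lb/ft^2 = 4.448 N / (0.3048 m)^2.
G = 9.81
LB_FT2_PER_N_M2 = 1 / (4.448 / 0.3048**2)      # ~1/47.9

def beta(mass_kg, cd, diameter_m):
    area = math.pi * (diameter_m / 2) ** 2
    return mass_kg * G / (cd * area)            # weight per drag area, in N/m^2

for mass in (500, 750):
    for cd in (0.35, 0.40):
        b = beta(mass, cd, diameter_m=1.9)      # 1.9 m base diameter is an assumption
        print(f"m={mass} kg, Cd={cd}: beta = {b/1000:.1f} kN/m^2 ({b * LB_FT2_PER_N_M2:.0f} lb/ft^2)")
```

Depending on the diameter assumed, the values land in the general neighborhood of the 5-7 kN/m2 (100-150 lb/ft2) range quoted above.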

The Penn State Science Policy Society: Filling the Gap Between Science and Community

UCS Blog - The Equation (text only) -

Graduate school. It’s where generations of scientists have been trained to become independent scientists. More than 60 hours per week spent in lab, countless group meetings, innumerable hours spent crunching data and writing manuscripts and proposals that are filled with scientific jargon.

Unfortunately, it’s this jargon that prevents scientists from effectively communicating their science to the non-technical audiences that need it. Penn State’s Science Policy Society aims to bridge this gap by helping current graduate students and post-doctoral fellows learn how to bring their research into the community.

We occupy an important niche at Penn State as we continue to educate members of the Penn State community about the connection between our research and public policy, with a dedicated focus on science advocacy. We are helping our future scientists translate their stories and make connections with community members and policy makers.

Identifying a gap between science and community

Penn State researcher Dr. Michael Mann discussing the science behind climate change at Liberty Craft House in downtown State College.

Early on, we recognized a growing disconnect between the local State College community and the groundbreaking research occurring at Penn State. It soon became apparent that our members wanted to help our fellow community members, but we didn’t have the skills or the relationships within the community. We began to plan events to address this problem, looking to others who have fostered strong community ties as guides.

We began our relationship with the Union of Concerned Scientists (UCS) in March 2016 when Liz Schmitt and Dr. Jeremy Richardson came to Penn State to discuss UCS’s efforts to promote science-community partnerships. In May 2016, SPS members traveled to Washington D.C. to meet with UCS staff for science advocacy training. With the help of UCS, we have been able to begin to build our own community relationships. We started with Science on Tap, a monthly public outreach event designed to showcase Penn State science in a casual downtown bar setting. By having leaders in science-community partnerships to guide us, we have been able to begin our own journey into outreach.

Science & Community: A panel event

While our Science on Tap events were successful, we still felt there was a gnawing gap between Penn State science and our local community. The local news was filled with science-related issues in State College and the surrounding central Pennsylvania region, but it wasn’t obvious how science was being used to help decision makers. We recognized an urgent need to learn how other scientists use their science to help, or even become, activists who fight for their local community.

The Science Policy Society panel discussion on Science & Community. From left to right: Dr. David Hughes, Dr. Maggie Douglas, and Dr. Thomas Beatty.

On September 14, 2017, the Science Policy Society partnered with the Union of Concerned Scientists to organize an event called “Science & Community.” Taking place at the Schlow Centre Region Library, the event was a panel discussion focused on how scientists and community activists can work together. The event featured three Penn State researchers: Dr. Maggie Douglas and Dr. David Hughes from the Department of Entomology, and Dr. Thomas Beatty from the Department of Astronomy and Astrophysics. Dr. Douglas works closely with local beekeepers and farmers to promote pollinator success, while Dr. Hughes is a leading member of the Nittany Valley Water Coalition, an organization that aims to protect the water of State College and the farmland it flows under. Dr. Beatty is a member of Fair Districts PA and speaks across central Pennsylvania about gerrymandering.

All three of these scientists saw problems in their community and decided to take action. Even more remarkable, most of these issues are outside their areas of scientific expertise. Astronomers typically aren’t trained in political science, but that did not stop Dr. Thomas Beatty from applying his statistical toolset to impartial voter redistricting. Same with Drs. Hughes and Douglas, who took their expertise into the community to help farmers and beekeepers protect their livelihoods.

Lessons learned

Easily the most important lesson that we learned from this Science & Community panel event was how hard it is for scientists to move into the local community and begin these conversations and partnerships. There was an overwhelming sense that the majority of the scientists in attendance did not feel comfortable using their scientific expertise to engage on local community issues. The reasons were numerous, but seemed to focus on (1) not knowing how to translate their science so that it is useful for non-specialists and (2) not having enough room in their schedule.

Moving forward, the Science Policy Society is aiming to address these concerns as we work towards filling the void between Penn State science and the surrounding communities. For example, we will be hosting science communication workshops to train scientists on how to strip jargon from their story of scientific discovery. Additionally, a panel event currently being planned for Spring 2018 aims to discuss how science and religion are not mutually exclusive, and will show how scientists can work with religious organizations and leaders to promote evidence-based decision-making.

Graduate students looking to help their community are not given the tools needed to do so. Hours spent in lab and at conferences talking only in scientific jargon leave many unable to talk about their science to the general public. The Science Policy Society is filling this need by providing an outlet for scientists to learn communication and advocacy skills and begin to build relationships with community members and policy makers. With help from scientists and science outreach professionals, we are fostering science and community partnerships in State College and throughout central Pennsylvania.

 

Jared Mondschein is a Ph.D. Candidate in the Department of Chemistry at Pennsylvania State University, where he studies materials that convert sunlight into fuels and value-added chemical feedstocks. He was born and raised near New York City and earned a B.S. in chemistry from Union College in 2014. You can find him on Twitter @JSMondschein.

Theresa Kucinski is a Ph.D. Candidate in the Department of Chemistry at Pennsylvania State University, where she studies atmospheric chemistry. She was born and raised in northern New Jersey, earning her A.S. in chemistry at Sussex County Community College in 2014 and her B.A. in chemistry from Drew University in 2016.

Grayson Doucette is a Ph.D. Candidate in the Department of Materials Science and Engineering at Pennsylvania State University. He was born into a military family, growing up in a new part of the globe every few years. He earned his B.S. in Materials Science and Engineering at Virginia Tech in 2014, continuing on to Penn State’s graduate program. At PSU, his research has focused on photovoltaic materials capable of pairing with current solar technologies to improve overall solar cell efficiency. You can find him on Twitter @GS_Doucette.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

 

Why You Shouldn’t Feel Bad About Recycling Old Appliances

UCS Blog - The Equation (text only) -

Let’s face it: Deep down inside you, or maybe much closer to the surface, you’ve been wanting a new refrigerator, dishwasher, washer, or dryer. You’ve had your eye on that sweet little white/black/stainless beauty of a machine, and you’ve seen the holiday sales (pick a holiday, any holiday) come and go, with their “Save $200!… Free delivery!… Act now!” enticements… And yet you’ve stayed on the sidelines.

If what’s been holding you back is concern about what happens to old appliances, landfills and all, I’ve got great news for you: Chances are good that you’re better off if you upgrade, because energy efficiency progress means you can save plenty of money—and that all of us are better off too, because your upgrade also cuts emissions, even when you take the bigger picture into account.

New appliances make financial sense

It should be really clear that new appliances can save you a bunch of money by saving energy (and more). Federal efficiency standards for fridges that came into place in 2014 meant electricity savings of 20-25% for most, and units qualified under the ENERGY STAR program offer at least another 9% savings.

For washing machines, ENERGY STAR-rated ones use 25% less energy and 45% less water than their conventional brethren, which means less money spent on both energy and water. Upgrading from a standard washing machine that’s 10 years old can actually save you more than $200 a year.

New appliances make environmental sense, too

So that’s the financial side of things. And we both know that’s important.

But we also both know that you’re about much more than that. You’re thinking about how that dishwasher doesn’t just magically appear, about how the old one doesn’t just vanish. You’re thinking about the implications from each stage of its life. So what about the carbon emissions, you say.

Thinking about what goes into producing and disposing of something makes a lot of sense, as long as you’re thinking about what goes into operating that same something during that long period between production and disposal (the life of the product).

And it makes even more sense to use data to help that thinking. (You’re a Union of Concerned Scientists type of person, after all; you just can’t help it.)

Fortunately, we’ve got that. Cooler Smarter, UCS’s book on where the carbon emissions come from in our lives—which of our consumer decisions have the most impact on how much CO2 we emit—has just the data you need. (In the appendices; we didn’t want to scare off other people.)

And what Cooler Smarter’s data tables show is that the emissions associated with producing and disposing of a range of appliances add up to less than the emissions associated with their use. A lot less actually: Using them can take 10-25 times as much energy as getting them there and getting rid of ‘em.
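To make that concrete, here is the arithmetic implied by that 10-25 times figure (a minimal sketch using only the ratio quoted above):

```python
# Share of an appliance's lifecycle energy that comes from the use phase,
# given that use takes 10-25 times the energy of production plus disposal combined.
for ratio in (10, 25):
    use_share = ratio / (ratio + 1)
    print(f"use = {ratio}x production+disposal -> use is {use_share:.0%} of the lifecycle total")
# -> roughly 91% to 96% of lifecycle energy is in the use phase, which is why
#    a more efficient replacement can outweigh the embodied energy of making
#    the new unit and recycling the old one.
```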

Getting Cooler Smarter about where the emissions come from, with data. Turns out that “Use Emissions” are usually the big piece. (Source: Cooler Smarter)

What that means is that if you can upgrade an appliance to one that’s more efficient, and particularly if your existing helper is more than a few years old, it’s probably really worth it not just from a financial perspective, but also in terms of carbon pollution.

That same principle, by the way, holds true for other energy users around your house: think lighting, for instance, where CFLs (compact fluorescent lights) or even newer LEDs (light-emitting diodes) in place of incandescent light bulbs can really quickly save you a bundle and pay back the emissions that went into making them. Or think vehicles, where recent years’ efficiency gains have been really impressive.

As it says in Cooler Smarter:

When there are highly efficient options for appliances, equipment, and vehicles, for instance, it almost always makes sense to junk energy hogs in favor of the most efficient models you can afford.

Four decades of progress in a box: Bigger fridges, more features, a lot less energy. (Source: ACEEE)

Old appliances can be reborn

For the disposal piece of the carbon equation, one key to making the math work for an appliance’s afterlife is to dispose of it the right way. While photos of piles of old appliances might be eye-catching—and disheartening—your old faithful dishwasher, washing machine, dryer, or fridge doesn’t have to suffer that ignominious end.

In fact, it’s a whole lot better if it doesn’t, and there are lots of ways to make it so. ENERGY STAR has a useful set of webpages on recycling old appliances: refrigerators, clothes washers, other appliances, and more. It suggests, for example, that recycling can be through the store you’re buying the new appliance from, through your local utility, through your city or town, or via a scrap dealer.

How your old fridge gets new life (with the help of a Hammond B3 organ soundtrack) (Source: ENERGY STAR)

As for where the old appliance goes/how the materials find new life: Fridges are a useful, complex array of materials that provide useful insights (and fodder for graphics). ENERGY STAR has a handy video about all the pieces and how they get reborn. (The shredding part about two-thirds of the way through isn’t for the faint of heart, particularly the appliance-loving heart, but just remember that it’s all for the greater good.) And the efficiency program in top-ranked Massachusetts not only gives the lowdown on fridge recycling (and a cool infographic), but offers free removal and $50 to boot.

That new-life-for-old idea can work for other things, too. If it’s lights you’re swapping out, here are a few ideas on what to do with old incandescent light bulbs (sock-darning, for example). For vehicles, check out UCS’s cradle-to-grave analysis.

Don’t you deserve lower costs, more comfort, less pollution, more…?

A new washer and dryer set might not fit under the Christmas tree, but that shouldn’t keep you from upgrading. Neither should concerns about what happens to the old one, or where the new one comes from.

As Cooler Smarter‘s section on “stuff we buy” lays out, there’s a lot to be said for buying less, and buying smart. But efficiency gains change the equation for some things.

If you feel you deserve new appliances, you just might be right. And if you think that upgrading to much higher efficiency ones and recycling the old might be a good move, you’d definitely be right.

Energy efficiency truly is the gift that keeps on giving, for both the wallet and the planet.

So act now—retailers are standing by!

Automakers’ Long List of Fights Against Progress, and Why We Must Demand Better

UCS Blog - The Equation (text only) -

Vehicle pollution is a major issue for human health and the environment.

Today, we are releasing a report documenting the long, sordid past of the auto industry, which has fought regulation tooth and nail at every turn. From pollution control to seatbelts and air bags to fuel economy, the industry has spent the vast majority of the past seven decades doing whatever it can to wriggle out of government regulations, at the expense of the American public.

Cars have drastically improved, but not without a fight

Time for a U-turn looks at the tactics that automakers consistently deploy to fight against federal rules and standards that deliver better cars to the nation, tactics like exaggeration, misinformation, and influence. It also outlines concrete actions that automakers can take to leave behind their history of intransigence, and ensure that their industry rises to the challenges of the 21st century.

There is no doubt that the cars built today are significantly improved over the vehicles from the 1950s:

  • today’s safety standards require not just airbags and seatbelts but also features like crumple zones which help to minimize occupant injury;
  • tailpipe pollution standards have dramatically reduced the emissions of soot and smog-forming pollutants like volatile organic compounds and nitrogen oxides; and
  • fuel economy and global warming emissions standards have saved consumers about $4 TRILLION in fuel, dramatically reducing both the demand for oil and the impact on climate change.

It’s clear that, when put to the task, automotive engineers have been more than capable of meeting whatever challenge is laid in front of them, resulting in a tremendous positive impact for the public.

Unfortunately, the industry has a long history of putting its lobbyists to work instead, promoting misleading claims and interfering politically to weaken or delay the standards that protect the public.

Automotive Chicken Little and a “Can’t Do” Attitude

One of the most frustrating aspects of the volumes of research I did for this report was the sheer repetition of the arguments.  According to the auto industry, any type of regulation would force them out of business…and yet they are still here.  Here are a few examples:

“[I]f GM is forced to introduce catalytic converter systems across the board on 1975 models . . . it is conceivable that complete stoppage of the entire production could occur, with the obvious tremendous loss to the company, shareholders, employees, suppliers, and communities.” – Ernie Starkman (GM) in his push to weaken the 1975 tailpipe emissions standards put in place by the Clean Air Act.

Not only was Starkman wrong that catalytic converters would shut down GM, but they proved so popular that GM actually used them in its advertising in 1975!

“Many of the temporary standards are unreasonable, arbitrary, and technically infeasible. . . . [If] we can’t meet them when they are published we’ll have to close down.” – Henry Ford II (Ford), responding to the first motor vehicle safety standards.

Clearly, Ford did not have to close down.  In fact, Ford proved more than capable of meeting these “unreasonable” requirements by using features like safety glass and seat belts, which are commonplace today.

“We don’t even know how to reach [35 miles per gallon by 2020], not in a viable way.  [It] would break the industry.”  — Susan Cischke (Ford), discussing the requirements of the Energy Independence and Security Act (EISA) that have led to the strong standards we have today.

Not only have strong fuel economy standards not broken the industry, but today it is thriving, with three consecutive years of sales over 17 million, an historic first for automakers.  And because of standards that drive improvements across all types of vehicles, we are not only on track to meet the requirements of EISA but doing so in spite of a growing share of SUVs and pick-ups.

Fighting the Science

Of course, even worse than the repetitive “sky is falling” attitude that has proven false at every turn is the assault on science that automakers have used in the past, seeking to eliminate policy action by diminishing either the solution or the problem:

“We believe that the potential impact of [fuel economy standards] on the global issue of planetary warming are [sic] difficult to demonstrate.” – Robert Liberatore (Chrysler)

Believe it or not, after James Hansen’s Congressional testimony in 1988, there was bipartisan support on the Hill to address climate change, including from transportation-related emissions.  Mr. Liberatore used an argument straight out of today’s Heritage Foundation claiming that fuel economy standards in the United States won’t have an impact on a global problem.  This flew in the face of science then, just as it does now.

“The effects of ozone are not that serious . . . what we’re talking about is a temporary loss in lung function of 20 to 30 percent.  That’s not really a health effect.” – Richard Klimisch (American Automobile Manufacturers Association).

In 1996, the EPA was moving forward to strengthen air quality standards for ozone (related to smog) and soot (particulate matter).  In order to push back on this solution, automakers campaigned against there even being a problem to address, claiming that a little loss in lung function wasn’t a big deal.  Needless to say, the EPA ignored this ridiculousness and implemented stronger standards. However, even these stronger standards did not fully address the problem, pushing the Obama administration to move forward on strengthening the standards further still.

Breaking the Cycle?

After the Great Recession, automakers seemed to turn over a new leaf, working closely with the Obama administration to craft stringent fuel economy and emissions standards that would drive efficiency improvements across all types of vehicles, including SUVs and pick-up trucks.

“[The industry has] had a change of heart, but it’s fairly recent. We had data about consumers’ preferences about fuel economy, but we chose to ignore it; we thought it was an anomaly. But it’s by having a bias against fuel economy that we’ve put ourselves in the pickle we’re in now.”  — Walter McManus (ex-GM), speaking about a shift in automaker thinking.

Unfortunately, this awakening seems to have been short-lived, as automakers are now urging the current administration to weaken the standards with the same types of tactics we’ve seen before:

  • Automakers are using direct political influence, sending a letter to the Trump administration to withdraw EPA’s determination that the strong 2025 standards remain appropriate.
  • Automakers are again exaggerating the facts, claiming widespread catastrophe if the EPA does not alter the standards based on a widely debunked study and ignoring the findings of a more thorough (albeit still conservative) report they themselves funded because it doesn’t fit their messaging.
  • Industry is pushing to expand the midterm review to include lowering the 2021 standards, while acknowledging that doing so would have no impact on product offerings and is simply a form of regulatory relief “any way we can get it” (Chris Nevers, Alliance of Automobile Manufacturers).

Despite talking a good game about being “absolutely committed to improving fuel efficiency and reducing emissions for our customers” (Bill Ford, 2017), Ford and other automakers are engaging in the same intransigence we’ve seen over the past seven decades.

It’s time for automakers to end this multidecadal war against regulation and start siding with progress.  To build back trust and leave this history behind, automakers must seize this opportunity and:

  • support strong safety and emissions standards and keep the promises they made to the American people to build cleaner cars;
  • distance themselves from trade groups that seek to undermine today’s standards, and make it clear that these groups do not speak for all automakers on issues of safety and the environment; and
  • cease spreading disinformation about the standards and their impacts.

Supermoons, King Tides, and Global Warming

UCS Blog - The Equation (text only) -

The moon rises behind the US Capitol. Photo by: Jim Lo Scalzo, EPA

Were you, like me, dazzled by the supermoon this weekend? Did you also stare in a state of wonder at the bright and shiny orb of color illuminating the night? Supermoons happen when a full or new moon is at its closest point to Earth. While we can’t see them during the new moon, supermoons that occur during a full moon are indeed something to behold. They bring thoughts of the universe, of space, stars and planets.

Flooding in Boston wharf.
Photo by: MyCoast, Christian Merfeld

But while we are turning our heads to the sky, we may not realize what’s happening at our feet. The moon might be out in space, but its movement has real impacts here on Earth, specifically on the oceans. I am talking about tides.

Tides come down to gravity: the Moon and Sun pulling on the Earth and its oceans. Tides are always higher at full and new moons — when the Moon, Earth, and Sun are aligned — and the pull is strongest when the Moon is at its closest to Earth, as it is during a supermoon. That’s why we saw some unusually high tides, called king tides, across the country (and beyond) at the same time that we experienced the supermoon.
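For readers who like to see the numbers, here is a rough back-of-the-envelope illustration (my own arithmetic, using typical textbook values for the Moon's average and perigee distances, not figures from this post) of why a perigee moon raises noticeably bigger tides: the tide-raising force falls off with the cube of the Moon's distance.

```python
# Rough illustration: the Moon's tide-raising force scales as 1/d^3, so a
# perigee full moon pulls noticeably harder on the oceans than an average one.
# Distances below are typical textbook values, used here only as assumptions.

MEAN_DISTANCE_KM = 384_400      # average Earth-Moon distance
PERIGEE_DISTANCE_KM = 357_000   # approximate distance during a close supermoon

# Ratio of tidal forcing at perigee to tidal forcing at the average distance
tidal_boost = (MEAN_DISTANCE_KM / PERIGEE_DISTANCE_KM) ** 3
print(f"Tide-raising force at perigee vs. average: {tidal_boost:.2f}x")
```

That works out to roughly 25 percent stronger lunar tidal forcing during a close supermoon than at the Moon's average distance.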

So, while we may not realize it when looking at the supersized moon, it is causing a great deal of disruption to people’s lives in the form of tidal flooding, also called “nuisance flooding.” As stated in one of my colleague’s earlier blogs, this localized tidal flooding has been steadily increasing due to sea level rise. And climate change is behind the sea level rise rates being observed.

The recently released Climate Science Special Report (CSSR) states with very high confidence that “global mean sea level (GMSL) has risen by about 7–8 inches (about 16–21 cm) since 1900, with about 3 of those inches (about 7 cm) occurring since 1993”, and that rise will continue at accelerated rates throughout the rest of the century. Rates of sea level rise in many locations along the coast of the U.S. have been higher than the global average, and nuisance flooding is now 300% to more than 900% more frequent than it was 50 years ago in many of those locations.
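To see the acceleration hiding in those numbers, a quick bit of arithmetic (mine, not the CSSR's) converts the quoted totals into average rates:

```python
# Back-of-the-envelope check: convert the CSSR's quoted totals into
# average rates of rise to show how much the pace has picked up.

rise_since_1900_cm = (16, 21)   # CSSR range since 1900
years_since_1900 = 2017 - 1900  # 117 years

rise_since_1993_cm = 7          # CSSR figure since 1993
years_since_1993 = 2017 - 1993  # 24 years

rate_full_record = [10 * cm / years_since_1900 for cm in rise_since_1900_cm]  # mm/yr
rate_recent = 10 * rise_since_1993_cm / years_since_1993                      # mm/yr

print(f"Average rate since 1900: {rate_full_record[0]:.1f}-{rate_full_record[1]:.1f} mm/yr")
print(f"Average rate since 1993: {rate_recent:.1f} mm/yr")
# Roughly 1.4-1.8 mm/yr over the full record vs. ~2.9 mm/yr in recent decades.
```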

Many cities have initiatives to track tides and some are specifically geared toward monitoring king tides. Volunteers with “Catch the King,” an initiative by the Virginia Institute of Marine Science, can use a smartphone app to map flooded areas in Hampton Roads in real time. The group then uses the collected data to improve predictions and forecasts, and to better understand the risks from tidal flooding.

Similarly, the “My Coast” project asks Massachusetts residents to submit pictures of areas inundated by king tides to catalogue the effects of these events on the state’s coastal areas. Ultimately, these types of initiatives are geared towards improving resilience and preparedness, informing residents of impassable areas and floodwater reach.

The amount of emissions currently released into the atmosphere has already committed us to a certain amount of sea level rise through midcentury, simply because these warming gases remain in the atmosphere for a long time. However, decisions made in the next few years will determine how much the sea will rise in the second half of the century – reducing emissions can reduce the rates of rise and potentially save hundreds of coastal communities from tidal flooding.

So next time you look up at a supermoon (in January 2018), while still marveling at the incredible phenomenon you are witnessing, remember to also look down. It may just make you think about the moon in a completely different way – and how as a nation, we need to do more to reduce emissions and prepare for coastal flooding.

It’s World Soils Day: Celebrate Soil, Carbon, and the Opportunities Right Under Our Feet

UCS Blog - The Equation (text only) -

These days, stories about soil health and regenerative farming seem to be catching on, so much so that it’s almost hard to keep up, at least for the avid soil geek.  The New York Times and the Huffington Post both featured op-eds just last week explaining why soil is worth getting excited about, while tales of soil health and science from North Dakota to New England were recently shared by other sources.  Yesterday, NPR hosted an hour-long panel on soil health. And that’s just a short list.

Maybe the rush of soil-slanted stories has something to do with today being World Soils Day. Or maybe it’s because soils and agriculture finally got some love at the latest climate convention.  Or perhaps it has to do with the growing list of states that are working towards healthy soils policies, or that the conversation surrounding the next Farm Bill has actually included soil health.

Or, just maybe, it’s because people are figuring out that the soils beneath our feet, and the farmers and ranchers that tend to them, need more of our attention.  After all, healthy soils are the living, breathing ecosystems that help grow our food, clean our water, store carbon, and reduce risks of droughts and floods.  Together, soils and their stewards can produce food while making agriculture part of the solution to several challenges (including climate change). Let me explain.

Soils stash carbon and deliver services

Some of the amazing features of soils that are finally being celebrated are not new. For some time, scientists have known that soils store a lot of carbon (about three times more than the atmosphere), and that carbon-rich soils tend to hold more water.  They have also known that soil varies a lot, even across small distances, that it changes over time, and that it is affected by management practices.  But we also know that there’s a lot we don’t know.  Thankfully, that’s starting to change.

Getting the numbers right on how soil can fight a changing climate (because we can’t afford not to)

Even just in the past year, soil science – including soil carbon science –  has advanced, pushed along by new tools, interests, and urgency.  A lot of the urgency has come as climate change picks up the pace. Today, scientists say that we can’t afford to choose between reducing emissions and sequestering carbon – we must do both.  That puts a spotlight (and pressure) on soils.

Fortunately, new science is rapidly uncovering more details about soils.  For example, pivotal papers have discussed how specific soil-based management practices could help mitigate climate change, and how soil carbon sequestration could be scaled up in the US and around the globe to achieve significant outcomes. Within the past months, key papers demonstrated that the majority (75%) of the organic carbon in the top meter of soil is directly impacted by management and that croplands may hold particular potential to be managed for carbon sequestration, but that soils continue to be at risk.

It’s important to note that while many studies have stressed opportunities in soils, others have questioned them.  For example, some research has suggested that soils may not be able to hold as much carbon as some scientists think, while other research has indicated that links between soil carbon and water are not as strong as previously thought.  Other research has questioned whether certain practices (e.g., abandoning cropland) can bring expected benefits.

In my opinion, all these studies just make further research all the more important.  Getting the numbers right will help us to find, and fine-tune, the best solutions for healthier, more resilient soil. But as we work out these details, we also need to act – and fast.

The role of farmers and ranchers in bringing out the best in soils, for better farms and futures

Fortunately, many farmers and ranchers already know how to build soil health (and carbon) on their land – and they are taking action (lucky for us, because the health of the soil is in their hands). Farmers and ranchers like Gabe Brown (ND), David Brandt (OH), Will Harris (GA), Ted Alexander (KS), and Seth Watkins (IA), just to name a few, have been experimenting for years with ways to build soil health for more resilient land.  New research from South Dakota shows that farmers are adopting cover crops and other practices in large part to build soil health.  And a growing list of companies and non-profits have supported a standardized definition of regenerative agriculture, suggesting that these healthy soils practices are gaining even more traction.

Recognizing the soils and stewardship beneath food “footprints”

As important as soil carbon, health, and stewardship are to ensuring farms are functioning at their best, it’s surprising that we think so little about them.  There is a larger discussion going on around sustainable diets and the notion that food has an environmental “footprint,” but the fact is that most of the studies that seek to quantify the carbon (or water, or land) footprints of food items haven’t accounted for the role of soil management and stewardship. Therefore, while the conversation about the impact of consumers’ food choices has been an important starting point, we also need to understand how the decisions made by farmers affect the world around us. That means bringing soil carbon to the table, and the sooner the better. With the growing appreciation for soil health science, practice, and story-telling, I think we might be getting somewhere.

P.S.  Prefer a little video inspiration? There’s plenty to choose from if you want to learn the basics of soil organic carbon, how “dead stuff” is key to the food chain, how healthy soils reduce flood risk, or more about the 4 per mille campaign, which puts soils at the forefront of climate change solutions.

Did Pilots See North Korea’s Missile Fail during Reentry?

UCS Blog - All Things Nuclear (text only) -

News reports say that a Cathay Pacific flight crew on November 29 reported seeing North Korea’s missile “blow up and fall apart” during its recent flight test. Since reports also refer to this as happening during “reentry,” they have suggested problems with North Korea’s reentry technology.

But the details suggest the crew instead saw the missile early in flight, and probably did not see an explosion.

One report of the sighting by the Cathay CX893 crew gives the time as about 2:18 am Hong Kong time, which is 3:18 am Japan time (18:18 UTC). According to the Pentagon, the launch occurred at 3:17 am Japanese time (18:17 UTC), which would put the Cathay sighting shortly after the launch of the missile from a location near Pyongyang, North Korea.

Since the missile flew for more than 50 minutes, it would not have reentered until after 4 am Japanese time. Given the timing, the crew likely saw the first stage burn out and separate from the rest of the missile. This would have happened a few minutes after launch, so it is roughly consistent with the 3:18 time.

The New York Times posted a map that shows the track of flight CX893. It shows that the flight was over northern Japan at 6:18 pm UTC (Fig. 1) and the pilots would have had a good view of the launch. By the time reentry occurred around 7:11 pm UTC, the plane would have been over mid-Japan and reentry would have occurred somewhat behind them (Fig. 1).

Fig. 1. Maps showing the location of flight CX893 shortly after launch of North Korea’s missile near the red dot on the left map, and at the time of reentry of North Korea’s missile, which took place near the red dot on the right map. (Source: NYT with UCS addition)

Burnout of the first stage would have taken place at an altitude about 100 km higher than the plane, but at a lateral distance of some 1,600 km from the plane. As a result, it would have only been about 4 degrees above horizontal to their view—so it would not have appeared particularly high to them. Ignition of the second stage rocket engine and separation of the first stage may have looked like an explosion that caused the missile to fall apart.
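As a quick sanity check on that “about 4 degrees” figure, here is the simple flat-geometry approximation (my own arithmetic, ignoring Earth's curvature, which would push the apparent elevation even lower):

```python
# Apparent elevation of the first-stage burnout as seen from the aircraft:
# roughly 100 km above the plane's altitude at a lateral distance of ~1,600 km.
import math

altitude_above_plane_km = 100.0
lateral_distance_km = 1_600.0

elevation_deg = math.degrees(math.atan2(altitude_above_plane_km, lateral_distance_km))
print(f"Apparent elevation above horizontal: {elevation_deg:.1f} degrees")  # ~3.6 degrees
```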

There are also reports of two Korean pilots apparently seeing a “flash” about an hour after the missile’s launch, which would be consistent with the warhead heating up during reentry, since the missile flew for 53-54 minutes. Neither reported seeing an explosion, according to the stories.

Pruitt’s War on the Planet and the EPA—and What Congress Can Do About It

UCS Blog - The Equation (text only) -

We have now endured almost a year with Scott Pruitt as the head of the Environmental Protection Agency (EPA). His tenure is unprecedented—a full frontal assault on the agency he heads, and a retreat from the mission he is charged by law to advance. And thus far, Administrator Pruitt has not had to account for his actions.

But an accountability moment is nearing: for the first time since his nomination, Mr. Pruitt will appear before Congress to offer an update on the status of work at the agency—first before the House Energy and Commerce Committee on December 7, and next before the Senate Environment and Public Works Committee on January 31. These oversight hearings offer a critical opportunity for leaders on both sides of the aisle to ask tough questions, demand responsive information rather than platitudes, and voice their disapproval about how Administrator Pruitt has run the EPA.

Here are key topics for our elected representatives to focus on:

Mr. Pruitt’s empty “back to basics” promise

During his nomination hearing last January, Administrator Pruitt knew he would be questioned about his commitment to EPA’s mission and his repeated lawsuits against EPA when he served as Oklahoma’s attorney general. He came equipped with a clever counter-narrative. He claimed that he would make EPA a more effective agency by de-emphasizing “electives” such as climate change. He promised to steer the agency “back to basics” by focusing on core responsibilities such as enforcing clean air and water laws and cleaning hazardous waste sites.

Members of Congress should compare that promise to Administrator Pruitt’s actions over the past year. Almost immediately after taking office, he signed off on a budget that would cut EPA by 31 percent, despite the absence of any financial exigency requiring such draconian action. A few weeks later, he approved plans to lay off 25 percent of the agency’s employees and eliminate 56 programs. The proposed budget cuts target not only items Pruitt may think of as electives, but also basic bread-and-butter functions. For example, he proposed to strip $330 million from the $1.1 billion Superfund program and cut funding for the Justice Department to enforce cases.

And, in a clear contradiction of his testimony that he would work more cooperatively and effectively with state environmental protection agencies, he proposed to cut the grants that EPA gives to states for enforcement by 20 percent.

We are already starting to see the results of this effort to hollow EPA out from within. Experienced and talented career staff are leaving the agency in droves. The Chicago EPA office, for example, has already lost 61 employees “who account for more than 1,000 years of experience and represent nearly 6 percent of the EPA’s Region 5 staff, which coordinates the agency’s work in six states around the Great Lakes.” This means, among other things, a smaller number of inspectors and likely an increased number of businesses operating out of compliance with clean air and water laws.

With fewer staff overall and fewer experienced staff members, it is no surprise that EPA has seen a roughly 60 percent reduction in the penalties it has collected for environmental violations compared with the Obama, Bush, and Clinton administrations at comparable stages in their respective terms. And while the Obama administration cleaned up and de-listed 60 hazardous waste sites and added 142 sites over eight years, the EPA under Mr. Pruitt is so far well off that pace, deleting just two sites and adding only seven.

Perhaps most troubling, civil servants have been deeply demoralized by the combination of proposed cuts and constant statements by the president and Administrator Pruitt denigrating the agency as a job killer, which it is not. As one staffer said in a recent publication entitled EPA under Siege, “I think there’s a general consensus among the career people that, at bottom, they’re basically trying to destroy the place.”

Said another: “Quite honestly, the core values of this administration are so divergent from my own, I couldn’t pass up the opportunity [for retirement]….I found it difficult to work for an agency with someone who is so disrespectful of what we do and why we do it.”

Members of Congress should question Mr. Pruitt about his “back to basics” promise. They should ask why he advocated for such deep budget cuts, layoffs, and buyouts, and demand that he explain with specificity how the agency can possibly do better with such drastically reduced resources. Congress should also require Mr. Pruitt to provide clear, apples-to-apples comparisons of the record of environmental enforcement during his tenure with that of his predecessors, as measured by inspections, notices of violation, corrective actions, fines and litigation.

Administrator Pruitt’s “Law and Order” charade

Administrator Pruitt put forth a second narrative during his confirmation hearing. He promised  to restore “law and order” to EPA, claiming that the EPA had strayed beyond its statutory authority during President Obama’s tenure.

The record tells a very different story. In less than a year, Mr. Pruitt’s actions have repeatedly been found by courts to be “unlawful,” “arbitrary,” and “capricious.”

One example is particularly instructive. At the end of the Obama administration, the EPA issued a final rule requiring operators of new oil and gas wells to install controls to capture methane, a highly potent contributor to global warming. The rule was set to go into effect in early 2017. Administrator Pruitt unilaterally put the rule on hold for two years to allow EPA to conduct a sweeping reconsideration. This, the court found, was blatantly illegal, because it attempted to change the compliance date of a rule without going through the necessary rulemaking process.

Unfortunately, this tactic has become a pattern, as Mr. Pruitt has sought to put on hold many other regulations he doesn’t care for, including rules intended to reduce asthma-causing ozone pollution and toxic mercury contamination in water supplies, and a requirement that state transportation departments monitor greenhouse gas emission levels on national highways and set targets for reducing them. Environmental nonprofit organizations and state attorneys general have had to sue, or threaten to sue, to stop this illegal behavior.

The EPA’s lawlessness is not confined to official acts, but also concerns the administrator personally. In an obvious conflict of interest, Mr. Pruitt played a leading role in the EPA’s proposed repeal of the Clean Power Plan, the nation’s first-ever limit on carbon dioxide pollution from power plants. Yet, just a few months before taking over at the EPA, Mr. Pruitt had led the legal fight against the rule as Oklahoma’s attorney general.

In effect, he played the role of advocate, then judge and jury, and ultimately executioner, all in a matter of a few months.

In addition, Administrator Pruitt is under investigation for misusing taxpayer dollars for $58,000 worth of private chartered flights, and has wasted $25,000 of taxpayer money to build himself a secret phone booth in his office.

Congress needs to ask Mr. Pruitt how he can be said to have restored respect for the law at the EPA when the EPA (and perhaps Administrator Pruitt personally) has been flouting it. They need to ask him what role he played in the proposed repeal of the Clean Power Plan, and how he can square his conflicting loyalties to the state of Oklahoma (which he represented as an attorney) and to the American people (whom he is supposed to represent as head of the EPA). Congress should also investigate his personal use of taxpayer funds and his penchant for cutting corners on legally mandated processes.

An “Alice in Wonderland” approach to science

The EPA’s five decades of success rest on its longstanding commitment to the best available science, and to its well-trained professional scientists who deploy that science. Administrator Pruitt has taken a wrecking ball to this scientific foundation.

First, he ignores staff scientists when their conclusions do not support his deregulation agenda. On the crucial scientific question of our time—climate change and what is causing it—Mr. Pruitt says he does not believe carbon dioxide is a primary cause. Of course, this statement runs directly counter to the conclusions of EPA scientists (as well as those of the recently issued US Global Change Research Program Climate Science Special Report). And, in one of his first policy decisions, Administrator Pruitt overturned EPA scientists’ recommendation to ban a pesticide (chlorpyrifos) that presents a clear health risk to farmers, children, and rural families.

But Mr. Pruitt is not only ignoring staff scientists, he is also sidelining and suppressing advice from highly credentialed and respected scientists who advise the EPA. Last summer, he sacked most of the members of the Board of Scientific Counselors, a committee of leading scientific experts that advises the EPA about newly emerging environmental threats and the best use of federal research dollars. And he has used this as an excuse to suspend the board’s work indefinitely.

More recently, he issued a new policy which states that a key outside Science Advisory Board will no longer include academic scientists who have received EPA grants in the past, under the purported theory that the grants render them less objective. Yet, Administrator Pruitt will fill these posts with industry scientists who are paid exclusively by industry, and with scientists who work for state governments that receive grants from the EPA. This new policy has enabled Mr. Pruitt to fill these boards with scientists who are clearly aligned with industry, scientists such as Michael Honeycutt, who has railed against EPA limits on soot and even testified before Congress that “some studies even suggest PM [particulate matter] makes you live longer.”

Administrator Pruitt’s attack on science also includes the EPA deleting vital information from agency websites. For example, the EPA has deleted key information about the Clean Power Plan, even though the agency is in the middle of a public comment process on whether to repeal that rule, and what to replace it with. The EPA has also eliminated information on the “social cost of carbon” and the record of its finding that the emission of greenhouse gases endangers public health.

These deletions seem designed to make it more difficult for the scientific community, and members of the public, to access the scientific information that stands in the way of Mr. Pruitt’s agenda.

Congress needs to probe deeply on these multiple ways that Administrator Pruitt has diminished the role of science at EPA. Representatives and senators should make him explain why he thinks he knows more about climate science and the harms of pesticides than his scientists do. They should demand that he explain why it is a conflict of interest for academic scientists who receive EPA grants to advise the EPA, but not for state and tribal scientists who receive these grants, or for industry-paid scientists. And Congress must find out why so much valuable information about climate science, the social cost of carbon, and other matters has vanished from EPA websites.

Making the world safe for polluters

In December 2015, more than 190 countries, including the United States, approved an agreement in Paris to finally tackle the greatest challenge of our time—runaway climate change. Donald Trump pledged to pull the United States out of this agreement when he ran for office, but for the first six months of his term he did not act on the pledge, as an internal debate played out within his administration.

Mr. Pruitt led the charge for the US withdrawal from that agreement. He has followed up on this by going after almost every single rule the Obama administration had put in place to cut global warming emissions. This includes the proposed repeal of the Clean Power Plan, the “re-opening” of the current fuel economy standards that are now on target to roughly double cars’ fuel efficiency by 2025, the repeal of data gathering on methane emissions from oil and gas facilities, and tampering with how the EPA calculates the costs of carbon pollution, among many other actions.

But Administrator Pruitt’s rollback of safeguards is not limited to climate-related rules; it also includes cutting or undermining provisions that protect us all from more conventional pollutants. He has started the process of rescinding rules that limit power plants from discharging toxic metals such as arsenic, mercury and lead into public waterways; regulate the disposal of coal ash in waste pits near waterways; and improve safety at facilities housing dangerous chemicals.

The breadth and ferocity of these rollbacks is unprecedented. Congress needs to push back hard. For starters, representatives and senators need to demand that Mr. Pruitt explain how it fits within his job duties to lobby the president against one of the most important environmental protection agreements ever reached. Similarly, they need to highlight the impacts on human health and the environment from all of the rollbacks that Administrator Pruitt has initiated, and force him to explain how the EPA can be advancing its mission by lowering environmental standards.

Congressional oversight is needed now more than ever

Many aspects of Mr. Pruitt’s tenure are truly unprecedented. However, he’s not the first EPA administrator to display fundamental disrespect for the agency’s mission. As one legal scholar has noted, during the Reagan administration there were “pervasive” congressional concerns that former Administrator Anne Gorsuch and other political appointees at the agency “were entering into ‘sweetheart deals’ with industry, manipulating programs for partisan political ends, and crippling the agency through requests for budget reductions.”

Congressional oversight back then was potent: among other things, Congress demanded that the EPA hand over documents about the apparently lax enforcement of the Superfund law requiring cleanups of hazardous waste sites. When the EPA head refused to comply with those demands, Congress held Administrator Gorsuch in contempt. Senators, including Republicans such as Robert Stafford and John Chafee, publicly voiced their alarm. Eventually, President Reagan decided Ms. Gorsuch was a liability, and he replaced her with William Ruckelshaus, EPA’s first administrator under President Nixon and a well-respected moderate who stabilized the agency.

These oversight efforts were “the decisive factor in causing Ms. Gorsuch, as well as most of the other political appointees at the agency, to resign.”

It may be too much to expect that the current, polarized Congress will exhibit the same level of tough, bipartisan oversight it did in the Reagan era. Yet, bipartisan support for vigorous environmental protection remains strong today and some Republican leaders have already called upon Administrator Pruitt to step down. It is high time for Congress to do what it can to ensure that Mr. Pruitt’s EPA does not continue to put the interests of a few industries ahead of the clean air, water, and lands that the agency is mandated to protect.

Like Bonnie Tyler, NRC is Holding Out for a HERO

UCS Blog - All Things Nuclear (text only) -

In Nuclear Energy Activist Toolkit #47, I summarized the regulations and practices developed to handle emergencies at nuclear power plants. While that commentary primarily focused on the response at the stricken plant site, it did mention that nuclear workers are required to notify the Nuclear Regulatory Commission (NRC) promptly following any declaration of an emergency condition. The NRC staffs its Operations Center 24 hours a day, 365 days a year to receive and process emergency notifications.

In late September 2017, I was made aware that the NRC was not staffing its Operations Center with the number of qualified individuals as mandated by its procedures. Specifically, NRC Management Directive 8.2, “Incident Response Program,” dictates that the Operations Center be staffed with at least two individuals: one qualified as a Headquarters Operations Officer (HOO) and one qualified as a Headquarters Emergency Response Officer (HERO). The HOO is primarily responsible for responding to a nuclear plant emergency while the HERO provides administrative support such as interagency communications.

I learned that the NRC Operations Center was instead often being staffed with only one person qualified as a HOO and a second person tasked with a “life support” role. In other words, the “life support” person would summon help in case the HOO keeled over from a heart attack or spilt hot coffee on sensitive body parts.

Fig. 1 (Source: Joe Haupt Flickr photo)

I wrote to Bernard Stapleton, who heads the NRC’s incident response effort, on October 3, 2017, inquiring about the Operations Center staffing levels. The NRC’s response was both rapid and thorough.

A conference call was conducted on October 12, 2017, between me and Steve West, Acting Director of the NRC’s Office of Nuclear Security and Incident Response, and members of his staff, Bern Stapleton and Bo Pham. They informed me that it had been a challenge for the agency to staff the Operations Center in summer and fall 2017 with qualified HEROs due to several watch standers taking other positions within the NRC and a temporary hiring freeze imposed after the unanticipated termination of the construction of two new reactors at the Summer nuclear plant in South Carolina.

The former reason made sense, as individuals with these skills seek promotions. The latter reason made sense, as the NRC sought to find new positions for staff members formerly assigned to the Summer project. The one-two punch of qualified persons leaving and the replacement pipeline being temporarily shut off prevented the Operations Center from always being staffed with a HERO-qualified individual. The Operations Center always had a HOO; it sometimes lacked a HERO.

They told me that two persons had recently been hired to fill the empty positions on the Operations Center staffing chart and those new hires would be undergoing training to achieve HERO qualifications. In addition, they told me about initiatives to qualify NRC staff outside of the Operations Center section to provide a larger cushion against future staffing challenges. The larger pool of qualified watch standers would have the collateral benefit of expanding the skill sets of individuals not assigned full-time to the Operations Center.

The NRC followed up on the conference call by sending me a letter dated November 16, 2017, documenting our conversation.

UCS Perspective

It would be better for everyone if the NRC had always been able to staff its Operations Center with individuals qualified as HOOs and HEROs. But the downside of problem-free conditions is that it is hard to tell whether they owe more to luck than to skill. How an organization responds to problems often provides more meaningful insights than a period of problem-free performance. On the other hand, an organization really, really good at responding to problems might reflect way too much experience having problems.

In this case, the NRC did not attempt to downplay or excuse the Operations Center staffing problems. Instead, they explained how the problems came about, what measures were being taken in the interim period, and what steps were planned to resolve the matter in the long term.

In other words, the NRC skillfully responded to the bad luck that had left the Operations Center short-handed for a while.

The EPA Knows Glider Trucks Are Dangerously Dirty: It’s Time to Keep Them Off the Road

UCS Blog - The Equation (text only) -

That shiny new truck could have a 15-year-old engine that doesn’t meet today’s standards. Photo: Jeremy Rempel. CC-BY-ND 2.0 (Flickr)

Today, I am speaking at a public hearing at EPA to push back on the agency reopening a “zombie truck” loophole. I wrote about the political motivations behind the attack on public health previously, but we now have even more information about exactly how dirty these trucks are from an interesting source: the EPA itself.

A reminder about what is at stake

Glider vehicles are brand new trucks that are powered by a re-manufactured engine.  While they look like every other new truck on the outside, on the inside they have engines which were manufactured under weaker pollution standards than other new trucks. Because they are resurrecting these older, more highly polluting engines from the dead, they are sometimes referred to as “zombie trucks.”

While initially glider trucks were used to replace vehicles whose bodies had been damaged, more recently a cottage industry has sprung up selling about 20 times more trucks than historic levels solely to bypass pollution restrictions.

In the “Phase II” heavy-duty vehicle regulations, the EPA closed the loophole that allowed these awful pollution spewers to be manufactured in the first place. However, Scott Pruitt’s EPA has proposed repealing this action, reopening the loophole primarily to benefit a company with political ties.

Dirty science for dirty trucks

In support of this repeal, Fitzgerald Trucks (the manufacturer requesting the loophole be reopened) submitted the results of a slapdash series of tests it claimed were from independent researchers.  However, the tests were paid for by Fitzgerald and conducted using Fitzgerald’s equipment in Fitzgerald’s facilities.  The results of the tests were incomplete and indicated that the work was sub-standard. However, we didn’t know just how unscientific the research was until EPA technical staff posted a memo detailing a meeting with the researchers.  Here are just a few of the absurd shortcomings in the tests:

  • Researchers did not use the industry-standard test procedure, so the numerical results cannot be directly compared with regulatory requirements or with any other research in the technical literature.
  • Researchers did not actually take samples of soot during testing, despite the fact that soot is not just carcinogenic but one of the specific pollutants at issue with these engines, responsible for serious health impacts.  Instead, they “visibly inspected” the test probe. Yes, you read that right–they just looked at it to see if it was dirty.
  • Researchers did not test under any “cold start” conditions. As when you first turn on your car, this is when the engine emits elevated levels of pollution, which is why it is a standard part of regulatory tests for both cars and trucks.

Believe me when I tell you that I could not get my doctorate if my lab work were of that low quality.

Ignoring the EPA’s own technical data

While pointing to the subpar Fitzgerald / Tennessee Tech data, the EPA was aware of much higher quality testing underway at its own facilities.  Instead of waiting for these tests to be completed, the politicos at EPA moved forward with the proposed repeal anyway.

Well, the results from those tests are in, and they are at least as bad as the EPA’s technical staff feared.  In fact, they may be even worse:

  • According to the test results, it appears that these engines actually exceed the legal limits they were originally designed to meet.  This means that the “special programming” Fitzgerald claims to apply to the engines may yield greater fuel economy, but it results in greater pollution, too.
  • The amount of soot exhausted by these engines is so large that it caused a fault in the EPA’s equipment, after which the EPA had to adjust the throughput.  A good comparison is when you have the volume adjusted for a TV program you like and then suddenly a really loud commercial comes on…except now imagine that commercial just blew out your speakers.

  • The two collectors on the left of the image show what happened when the EPA first tried to collect the pollution from these vehicles; the two collectors on the right show what they looked like before the test.  Now imagine what that experience must be like for the lungs of a child with asthma.

The EPA had already projected that every year of production of glider vehicles at today’s levels would result in as many as 1600 premature deaths–this new data suggests that number could be even higher.

The science is clear, so closing this loophole should be the easy thing to do.

I am speaking at today’s hearing against this repeal because I want to make sure EPA listens to its own scientists and closes this loophole, abiding by its mission statement to protect human health and the environment.  And today I will be among a chorus of dedicated citizens reminding the agency of its mission.


Chinese Military Strategy: A Work in Progress

UCS Blog - All Things Nuclear (text only) -

Chinese President Xi Jinping, also general secretary of the Communist Party of China (CPC) Central Committee and chairman of the Central Military Commission (CMC), presents the heads of the People’s Liberation Army (PLA) Academy of Military Science with the military flag in Beijing, capital of China, July 19, 2017. (Xinhua/Li Gang)

Several years ago UCS reported China could put its nuclear weapons on high alert so they could be launched on warning of an incoming attack. Last week I had the opportunity to speak with some of the authors of The Science of Military Strategy: the authoritative Chinese military publication that was the source of the information in our report.

In a lively discussion, most of which took place between the authors themselves, I was able to confirm our original report is accurate. But I also learned more about how and why The Science of Military Strategy was written and what that can tell US observers about the broader context of how military thinking is evolving in China.

What it means to say China “can” launch on warning.

As of today, China keeps its nuclear forces off alert. The warheads and the missiles are separated and controlled by different commands. The operators are trained to bring them together and prepare them for launch after being attacked first.

China’s nuclear arsenal is small. Reliable estimates of the amount of weapons-grade plutonium China produced and the amount of plutonium China uses in its warheads tell us China has, at most, several hundred nuclear warheads. It has even fewer long-range missiles that could deliver those warheads to targets in the United States.

Because China’s nuclear arsenal is small and kept off alert some Chinese military strategists worry it could be completely wiped out in a single attack. Their US counterparts have told them, in person, that the United States will not rule out attempting a preemptive strike at the beginning of a war. The question for Chinese strategists is whether or not they should do something to mitigate this vulnerability. Many believe the risk of a major war with the United States is low and the risk of a nuclear war is even lower.

For Chinese strategists who don’t share that optimism, there are two basic ways to address their vulnerability. The first would be to significantly increase the size of China’s forces. Chinese nuclear weapons experts told me that would require a lot of time and considerable effort. They would need to resume producing plutonium for weapons and may also need to resume nuclear testing. The economic costs would be considerable. The diplomatic costs would be even greater.

The second way to avoid the risk of allowing an adversary to think they can wipe out China’s nuclear force with a preemptive strike is for China to put its forces on alert and enable them to be launched on warning of an incoming attack. That would require the development of an early warning system. It may also require upgrading China’s nuclear-capable missiles. One Chinese missile engineer explained that China’s existing missiles are not designed to be kept on continuous alert.

Either option would significantly alter China’s nuclear posture. But the latter may also require a consequential change in China’s nuclear doctrine.

China’s political leaders promised the world they would never, under any circumstances, be the first to use nuclear weapons. Wouldn’t launching on warning of attack, before any damage is done, violate that promise? The answer is not as obvious to Chinese policy-makers as it probably seems to their American counterparts, who don’t believe in the efficacy or credibility of a no first use pledge in the first place.

What I learned in my conversation with the authors of The Science of Military Strategy is that when they wrote that China “can” launch on warning of an incoming attack they were not saying China has the technical capability to do so,  nor were they announcing the intention to implement a launch on warning policy. They were simply declaring that, in their view, China could launch on warning—before their missiles were destroyed—without violating China’s no first use pledge.

Shouldn’t they have made that more explicit?

The authors told me, in response to a direct question, that they did not consider the impact of what they were writing on external audiences. That does not mean they were unaware non-Chinese might read it, just that they weren’t writing for them. The Science of Military Strategy is  an institutional assessment of China’s current strategic situation prepared for the consideration of the rest of China’s defense establishment and its political leadership. Those two audiences wouldn’t need to be told what the “can” in an Academy of Military Science (AMS) statement on launch on warning was referencing. They would already understand the context. As the authors explained, AMS is not responsible for making technical assessments of China’s capabilities, nor does it make public announcements about Chinese military policies or the intentions of China’s political leadership.

It’s difficult for many US observers to imagine that Chinese open source publications like The Science of Military Strategy aren’t just another form of Chinese Communist Party (CCP) propaganda. That’s understandable given Chinese government controls on speech and publication. But even in a relatively closed and tightly controlled polity like China’s, professionals still need to engage in meaningful discussion, including military professionals. Understanding that internal discussion from abroad requires more than parsing the language in Chinese publications. It also requires a sufficient degree of familiarity with the social, institutional and sometimes even the personal factors that define the context within which Chinese discussions of controversial topics – like nuclear weapons policy – take place.

Regular interaction with Chinese counterparts is the only way to acquire this familiarity. Unfortunately, both governments make that much more difficult than it needs to be. And language is still a significant barrier, especially on the US side.

Pessimism on US-China Relations

Most of my Chinese colleagues believe the intergovernmental relationship between China and the United States is deteriorating. The cooperative relationship of the 1980s and 1990s gradually gave way to an increasingly competitive relationship over the past two US administrations. The new edition of The Science of Military Strategy, composed over an 18-month period prior to its publication in 2013, addresses new issues that might emerge if this trend continues, and the relationship moves from competition toward conflict.

There is no fixed schedule for putting out a new edition. According to a general who was also involved in the production of two prior editions, the first addressed concerns related to China-USSR relations. The second responded to the so-called “revolution in military affairs” exemplified by the new technologies used in the 1991 Gulf War. The current edition had no equally specific point of origin. It was, in the Chinese general’s words, more “forward-looking.” And as the Chinese military looks forward, its relationship with the United States looms large on the horizon.

None of the authors felt China’s overall military capabilities were remotely comparable to those of the United States. One of the more interesting barometers they used was the average annual salary of an ordinary soldier. All of the authors agreed this gap is unlikely to be closed in the foreseeable future. China still needs to focus its military development in select areas. Having a clearer understanding of what China’s future military challenges might be—an understanding AMS is charged with articulating—can help Chinese decision-makers set priorities.

That one of those priorities is addressing the vulnerability of China’s nuclear forces to a US preemptive attack is a troubling indicator of deteriorating relations.

 

Vehicle Fuel Economy Standards—Under Fire?

UCS Blog - The Equation (text only) -

Photo: Staff Sgt. Jason Colbert, US Air Force

Last year, transportation became the sector with the largest CO2 emissions in the United States. While the electricity industry has experienced a decline in CO2 emissions since 2008 because of a shift from coal to natural gas and renewables, an equivalent turnaround has not yet occurred in transportation. Reducing emissions in this sector is critical to avoiding the effects of extreme climate change, and the Corporate Average Fuel Economy (CAFE) and Greenhouse Gas (GHG) emissions standards are an important mechanism to do so.

The most recent vehicle standards, which were issued in 2012, are currently undergoing a review. The Department of Transportation (DOT) is initiating a rulemaking process to set fuel economy standards for vehicle model years 2022-2025. At the same time, DOT is also taking comments on its entire policy roster to evaluate their continued necessity (including the CAFE standards).

A number of criticisms have been raised about fuel efficiency standards, some of which are based more in confusion and misinformation than fact. An intelligent debate about the policy depends on separating false criticisms from those that are uncertain and those that are justified.

In fact, as new research I did with Meredith Fowlie of UC Berkeley and Steven Skerlos of University of Michigan shows, the costs of the standards could actually be significantly lower than other policy analyses have found.

Costs and benefits of the regulations

What my co-authors and I have found is that automakers can respond to the standards in ways that lower the costs and increase the benefits.

Many policy analyses do not account for the tradeoffs that automakers can make between fuel economy and other aspects of vehicle performance, particularly acceleration. We studied the role that these tradeoffs play in automaker responses to the regulations and found that, once they are considered, the costs to consumers and producers were about 40% lower, and reductions in fuel use and GHG emissions were many times higher.

The study finds that the ability of automakers to trade off fuel economy and acceleration makes both consumers and producers better off. A large percentage of consumers care more about paying relatively lower prices for vehicles than about having faster acceleration. Selling relatively cheaper, more fuel-efficient vehicles with slightly lower acceleration rates to those consumers allows manufacturers to meet the standards with significantly lower profit losses. Consumers who are willing to pay for better acceleration can still buy fast cars.

Debunking some common criticisms

One common criticism is that the regulations mandate fuel economy levels that far exceed any vehicles today. This misconception stems from the frequently quoted figure when the regulations were first issued that they would require 54.5 mpg by 2025. But, the regulations do not actually mandate any fixed level of fuel economy in any year. The fuel-economy standards depend on the types of vehicles that are produced each year. If demand for large vehicles is up, the standards become more lenient; if more small vehicles are sold, they become more strict. The 54.5 mpg number was originally estimated by EPA and DOT in 2012 when gas prices were high. EPA has since revised it to 51.4 mpg to reflect lower gas prices and higher sales of large vehicles. Taking into account flexibilities provided in the regulations and the fact that this number is based on EPA’s lab tests, which yield higher fuel economy than drivers experience on the road, the average target for 2025 is equivalent to approximately 36 mpg on the road. Fueleconomy.gov lists 20 different vehicle models that get at least this fuel economy today.
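To make the sliding-scale point concrete, here is a minimal sketch of the idea (the sales volumes and mpg targets are made up for illustration, and the compliance math is simplified): each model gets a footprint-based target, and the fleet-wide requirement is a sales-weighted harmonic mean of those targets, so it moves with the sales mix.

```python
# Simplified sketch of how a fleet-wide fuel economy requirement shifts with
# the sales mix. Numbers are invented for illustration only.

def fleet_target(sales_and_targets):
    """Sales-weighted harmonic mean of per-model mpg targets."""
    total_sales = sum(sales for sales, _ in sales_and_targets)
    return total_sales / sum(sales / target for sales, target in sales_and_targets)

# Hypothetical fleets: (annual sales, footprint-based mpg target)
car_heavy_mix = [(600_000, 48.0), (400_000, 35.0)]    # more small cars sold
truck_heavy_mix = [(400_000, 48.0), (600_000, 35.0)]  # more large trucks/SUVs sold

print(f"Car-heavy mix target:   {fleet_target(car_heavy_mix):.1f} mpg")
print(f"Truck-heavy mix target: {fleet_target(truck_heavy_mix):.1f} mpg")
# The same per-model targets yield a lower fleet-wide requirement when larger
# vehicles make up more of the sales mix.
```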

Another common but unjustified criticism of the standards is that they push consumers into small vehicles. The regulations were specifically designed to reduce any incentive for automakers to make vehicles smaller. The standards are set on a sliding scale of targets for fuel economy and GHG emissions that depend on the sizes of the vehicles. As a result, an automaker that sells larger vehicles has less stringent fuel economy and emissions targets than one that sells smaller vehicles. Research has shown that the policy likely creates an incentive for automakers to produce bigger vehicles, not smaller.

Two easy ways to strengthen the fuel economy standards

There are, of course, advantages and drawbacks to any policy, including today’s vehicle standards, which focus entirely on improving the efficiency of new vehicles.  Fortunately, there are improvements that can be made to the CAFE and GHG regulations to increase their effectiveness and lower costs.

The first is ensuring that automakers that violate the standards pay very high penalties. Companies that cheat steal market share from those that follow the standards, effectively raising the regulatory costs for the automakers that are playing fair.

The second improvement involves the way automakers are able to trade “credits” with each other.  These credits were created to equalize regulatory costs across companies. So, if one automaker finds it relatively easy to reduce emissions, it can reduce more than its share and sell credits to another automaker having trouble reducing emissions. This trading is currently negotiated individually by each pair of automakers, which raises the costs of the transaction. Creating a transparent market to trade these credits would help to achieve the target emission reductions at lower costs.

The Department of Transportation (DOT), which implements the Corporate Average Fuel Economy (CAFE) standards, is currently soliciting comments on regulations “that are good candidates for repeal, replacement, suspension, or modification.” The comment period ends December 1.

 

Dr. Kate Whitefoot is an Assistant Professor of Mechanical Engineering and Engineering and Public Policy at Carnegie Mellon University. She is a member of the NextManufacturing Center for additive manufacturing research and a Faculty Affiliate at the Carnegie Mellon Scott Institute for Energy Innovation. Professor Whitefoot’s research bridges engineering design theory and analysis with that of economics to inform the design and manufacture of products and processes for improved adoption in the marketplace. Her research interests include sustainable transportation and manufacturing systems, the influence of innovation and technology policies on engineering design and production, product lifecycle systems optimization, and automation with human-machine teaming. Prior to her current position, she served as a Senior Program Officer and the Robert A. Pritzker fellow at the National Academy of Engineering where she directed the Academy’s Manufacturing, Design, and Innovation program.

 

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Abnormal and Catastrophic 2017 Hurricane Season Finally Over

UCS Blog - The Equation (text only) -

The official end of the 2017 North Atlantic hurricane season, November 30th, has finally arrived.  In the season’s wake many are mourning the loss of loved ones, repairing their homes, and still waiting for electricity to return.

Hurricane Tracks 2017

Figure 1. North Atlantic hurricane and tropical storm tracks of the 2017 season. Preliminary, as the November storm tracks are not yet updated.

The 2017 North Atlantic hurricane season was not normal

The first named storm of the 2017 Hurricane Season, tropical storm Arlene, began in early April.  Harvey, Irma, and Maria are the names communities will remember long after they became major hurricanes.

Six of the ten hurricanes were major (category 3 or greater).  Recalling the headlines and seeing the damages, the season was catastrophic (Figure 1).  Crunching the numbers on a measure of power, Accumulated Cyclone Energy (ACE), confirms that impression.  September 2017 ACE was more than three times the historical September ACE average over 1981-2000.  Scientists are piecing together the factors that contributed to such an intense hurricane season.  Attribution studies (studies that attribute the relative roles of human and natural factors in the occurrence of extreme weather) have already been published about a specific hurricane from 2017.
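For readers unfamiliar with ACE, the sketch below shows the standard calculation: square the maximum sustained wind (in knots) at each six-hourly advisory while a storm is at tropical-storm strength or greater (at least 34 knots), sum those squares, and divide by 10,000. The wind values are invented solely to illustrate the arithmetic, not taken from any 2017 storm.

```python
# Sketch of the Accumulated Cyclone Energy (ACE) calculation: sum of squared
# 6-hourly maximum sustained winds (knots) while the storm is at least
# tropical-storm strength (>= 34 kt), scaled by 1e-4. Winds below are invented.

def ace(six_hourly_winds_kt):
    return sum(w ** 2 for w in six_hourly_winds_kt if w >= 34) / 1e4

example_storm = [35, 45, 60, 80, 100, 115, 120, 110, 90, 70, 50, 30]
print(f"ACE contribution of this illustrative storm: {ace(example_storm):.1f}")
```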

Some extraordinary conditions of this hurricane season:

Hurricane Ophelia SST Anomalies Oct 2017

Figure 2. Warmer than 1985-2012 average sea surface temperatures (SSTs) during the time when tropical storm Ophelia transitioned into a hurricane south of the Azores Islands.

Warmer Seas – A big factor contributing to the intensification of Harvey, Irma, and Maria was warmer-than-average sea surface temperature (SST) conditions.  Another surprising consequence of the warmer-than-average SSTs was that the region favorable for hurricanes extended beyond the typical hurricane regions of the North Atlantic Ocean.  This allowed Hurricane Ophelia to thrive at highly unusual latitudes and longitudes, making it the easternmost hurricane to date (see storm track number 15 in Figure 1).  As an extratropical storm, Ophelia made landfall in Ireland and brought waves that battered the UK coast, rain that drenched northern Europe, and winds that fueled lethal wildfires in southern Europe.  Research suggests that heat-trapping emissions can extend the SST region favorable for hurricanes and increase the chances for these storms to head toward western Europe.

Figure 3. Record-breaking precipitation fell along the Texas and Louisiana coastal region.

Record-Breaking Precipitation – Hurricane Harvey dropped a whopping 60 inches of rain on Nederland, Texas, east of Houston, breaking the 1950-2017 record for state maximum precipitation from tropical cyclones and their remnants. Hurricane Harvey's average accumulated rainfall over Houston (840 mm, or 33 inches) was exceptional.  There was so much floodwater in Houston that it sank the landscape by 2 centimeters (~0.8 inch) in some places.  Assuming the precipitation area of individual hurricanes remains similar, Kerry Emanuel found that an event with more than 500 mm (19.7 inches) of average accumulated rainfall was roughly a once-in-100-years occurrence over 1981-2000.  By 2017 it had become a once-in-16-years event, and it becomes a once-in-5.5-years occurrence by the end of this century under an unabated emissions scenario.
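The return periods above can be translated into the chance of experiencing at least one such rainfall event over a planning horizon. The sketch below does that arithmetic, assuming a constant annual exceedance probability of one divided by the return period within each climate state; it is an illustration of the statistics, not part of Emanuel's analysis.

```python
# Sketch relating the quoted return periods to the chance of at least one such
# rainfall event over a planning horizon. Assumes a stationary annual exceedance
# probability of 1/return_period within each climate state (an illustration only).

def chance_within(years, return_period_years):
    p_annual = 1.0 / return_period_years
    return 1.0 - (1.0 - p_annual) ** years

for label, rp in [("1981-2000 climate", 100), ("2017 climate", 16), ("end of century", 5.5)]:
    print(f"{label}: {chance_within(30, rp):.0%} chance of at least one event in 30 years")
```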

Catastrophic Wind – A hurricane category is defined by sustained winds, and the associated consequences are described by phrases such as "devastating damage" for category 3 and "catastrophic damage" for categories 4 and 5.   Hurricanes Irma and Maria had unusually high peak 1-minute sustained winds, placing them among the North Atlantic hurricanes with the strongest winds in the historical record (see table).  Those on the ground during landfall withstood ferocious winds.  Hurricane Maria was the first category 5 (157 miles per hour or higher sustained winds) hurricane to make landfall in Dominica, a small Caribbean island southeast of Puerto Rico. It made landfall yet again, this time as a category 4 (130-156 miles per hour sustained winds), in Puerto Rico.  Similarly, hurricanes Harvey and Irma made landfall as category 4 storms in Texas and Florida, respectively.
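A short sketch of the Saffir-Simpson logic referenced here, using the wind ranges cited in the text (category 4: 130-156 mph sustained winds; category 5: 157 mph or higher); the sample wind speeds are arbitrary.

```python
# Sketch of the Saffir-Simpson scale: map 1-minute sustained winds (mph) to a
# hurricane category, using the thresholds cited in the text.

def saffir_simpson_category(sustained_wind_mph):
    thresholds = [(157, 5), (130, 4), (111, 3), (96, 2), (74, 1)]
    for cutoff, category in thresholds:
        if sustained_wind_mph >= cutoff:
            return category
    return None  # below hurricane strength

for wind in (165, 140, 120, 70):
    print(f"{wind} mph -> category {saffir_simpson_category(wind)}")
```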

 

How does an abnormal hurricane season become disastrous?

The Intergovernmental Panel on Climate Change has pointed to three major factors that combine to determine the risk of disaster from an extreme event: the weather or climate event itself, the exposure of communities to that event, and the social vulnerability of those communities.

Social vulnerability refers to the resilience of communities when confronted by external stresses.  A few examples follow of how exposure and social vulnerability intersected with hurricanes that are changing in a warming world.

Many Caribbean residents were among those exposed to these powerful hurricanes, which made repeated landfall on numerous islands this year.

For over three centuries people have lived on Barbuda, and for the first time the risk was so grave that the entire island population fled to avoid exposure to Hurricane Irma.  Residents now confront the task of rebuilding their community on Barbuda.

An estimated 3.7 million people, over a million households, and nearly 260 billion dollars in assets in Puerto Rico were exposed to wind impacts from Hurricane Maria. Houston, the fourth most populous U.S. city in 2017, was exposed to the precipitation deluge (see Figure 3) from Hurricane Harvey.

An entire metropolitan region, island or county might be exposed to an abnormal hurricane season, but not all of those in the path of a storm are equally vulnerable to its effects.

Differences in vulnerability have already emerged in the aftermath of the 2017 season that reflect in part the history of a place, people, and infrastructure.  Additional factors include communication about the hurricane risks, first response and long-term disaster management. For example, elderly people perished in a Florida nursing home after days of being stuck in sweltering heat following the power outage caused by Hurricane Irma.

The U.S. Health Assessment found that people with chronic medical conditions are more likely to have serious health problems during excessive heat than healthy people.   The elderly in this case depended on others for their care.  As the USA Today Editorial Board put it, "In such a climate, air conditioning is not a luxury for elderly people; it's a necessity."

The tragic loss of life from Hurricane Maria in Puerto Rico is estimated to be similar to that from Hurricane Katrina. This large toll is due in part to the vast numbers of U.S. citizens and residents in Puerto Rico who still lack safe drinking water or access to power.

Families are piecing their lives together after the devastating loss of a family member, or the absence of a child who had to evacuate to continue school in a safer place during a protracted recovery period.

2017 is the most expensive Atlantic hurricane season to date, with damages already topping $200 billion.   The epic disasters of the 2017 hurricane season hold important lessons, which should be taken into account when planning steps to better protect lives from hurricanes and their aftermath.  In turn, those recovering from this disastrous season can draw on the lessons already learned from Hurricanes Sandy, Katrina, and Andrew.

These lessons can help communities rebuild toward climate resilience with principles that are scientifically sound, socially just, fiscally sensible, and adequately ambitious.

Image credits: NOAA Climate.gov; NOAA National Weather Service; NOAA tweet http://bit.ly/2AkUySt; Brenda Ekwurzel, created with NASA, U.S. Air National Guard (photo by Staff Sgt. D.J. Martinez, U.S. Air Force), and U.S. Dept. of Homeland Security images.

Virginia’s Gerrymander Is Still Alive—and a Deadly Threat to Environmental Justice

UCS Blog - The Equation (text only) -

This week, Virginia's Board of Elections certified results from the November 7th elections, paving the way for three crucial recounts that will determine control of the Virginia House. The Democratic Party would need to take two of those seats for a majority, having already defeated more than a dozen incumbent Republicans and flipped three seats. If this wave is enough to push the Democratic Party over the 50-seat mark, many in the press will declare that the Virginia GOP's gerrymandered districting plan is no more. But they will be wrong. The value of some Virginians' votes is still diluted, as it was before the election. In turn, voting inequalities continue to bias the legislature's responsiveness to environmental and health threats.

Virginia's gerrymander has proven durable over the decade. Majorities of voters have supported the Democratic Party over the last four election cycles, only for the party to win about one third of legislative seats. This bulwark against majority rule was engineered after the 2010 Census by an incumbent party with absolute control over redistricting the assembly. Despite earning a substantial (nine-point) majority of votes over incumbent Republicans this year, Democrats still have less than a 50/50 chance of gaining majority control, and if they do it will be by one seat. The fact that there is any uncertainty over whether a party with a nearly 10-point majority of the vote will control the chamber is proof of just how durable the gerrymander is. What happened on November 7th in Virginia was near historic, but it did not breach the gerrymander.

2017 Democratic district vote shares (blue), sorted by 2015 Republican vote shares (red). Democratic vote shares in 2015 uncontested GOP districts are sorted by 2017 Democratic vote share.

Democratic voters wasted far more votes in uncontested safe districts (26 of them) than in the 11 overwhelmingly Republican districts where Democrats did not field candidates. This is illustrated in the graphic with full blue bars (left), indicating uncontested Democratic seats, and bars that are filled red with no blue, indicating uncontested Republican seats.  While Democrats tend to reside in higher-density, urban regions, one of the most powerful gerrymandering tactics is to pack opposition voters into districts so that their surplus votes (over 50%) are wasted. This year, extensive mobilization efforts, coupled with a gubernatorial campaign tainted with racist overtones, provided the bump that Democrats needed in the most competitive districts (around the 50% mark). The middle of the graph depicts the contests where Democrats reached 50% or higher, reaching into the competitive districts held by GOP incumbents (and several open seats).
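The packing tactic can be made concrete with the wasted-vote accounting used in the efficiency-gap literature (a framing added here only for illustration, not part of the original analysis): a winning party's votes beyond a bare majority are wasted, as are all of a losing party's votes. The district results below are invented.

```python
# Sketch of wasted-vote accounting behind "packing": a winner's votes beyond
# 50% + 1 are wasted, as are all of a loser's votes. District results are
# hypothetical, chosen to show how packed districts inflate one side's waste.

def wasted_votes(dem_votes, rep_votes):
    total = dem_votes + rep_votes
    needed = total // 2 + 1
    if dem_votes > rep_votes:
        return dem_votes - needed, rep_votes      # (Dem wasted, Rep wasted)
    return dem_votes, rep_votes - needed

districts = [(38_000, 12_000), (35_000, 15_000), (22_000, 28_000), (24_000, 26_000)]
dem_wasted = rep_wasted = 0
for d, r in districts:
    dw, rw = wasted_votes(d, r)
    dem_wasted += dw
    rep_wasted += rw
print(f"Democratic wasted votes: {dem_wasted:,}; Republican wasted votes: {rep_wasted:,}")
```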

In districts that were contested in both cycles, Democratic candidates gained an average of 9.6 points (with a 5-point standard deviation). Democrats also contested far more districts than in 2015 (the solid red areas with blue bars), picking off several seats from incumbents where they had not previously fielded candidates. Had the wave reached into districts where Republicans typically win by 15-20 points, we would have seen the type of gerrymander backfiring that occurred in Congress in the late 1800s. In 1894, for example, a vote shift of less than 10 points against the Democratic Party cost it more than 50% of its seats, the largest loss in congressional history.

The Democratic wave was enough to sweep away the GOP’s supermajority, but not enough to reverse the tides. Unless the Democratic Party can repeat their impressive turnout effort in 2019, it will be impossible to hold on to those marginal seats. Of course, under a fair system, a party with a nine-point statewide lead would have a cushion of several seats for close legislative votes. Even if Democrats do gain control, that one seat majority is vulnerable to being picked apart by the same powerful actors that helped engineer this electoral malpractice in the first place, at a great cost to Virginians.

Probably the single most powerful player is Dominion Energy. Consistently one of the largest donors to state election campaigns, Dominion greatly benefitted from a gerrymander engineered in large part by one of its biggest supporters, Appropriations Chair S. Chris Jones. Since 2011, Dominion has been remarkably successful at pushing through a rate freeze law that allowed it to hold on to over $100 million it would otherwise have paid back to customers, limiting the growth of clean energy technologies like solar power, and avoiding regulatory oversight of the toxic pollutants that it dumps into Virginia waterways. So remarkable, in fact, that several of the successful Democratic challengers in this election made Dominion's political influence central to their campaigns and refused to accept its contributions.

The Dominion rate freeze passed the Virginia House on a 72-24 vote, so it's not clear that even a fair districting plan would have stopped it, but it definitely would have changed the terms of negotiation. And because it still insulates the legislature from an accurate representation of public support, the Virginia gerrymander weakens voters' ability to protect themselves against current and impending health threats. For example, measured by the amount of toxic chemicals discharged into them, Virginia's waterways are among the worst in the nation. Hundreds of companies are allowed to legally discharge toxins into waters upstream from recreational places where people regularly swim and fish. Arsenic levels up to 400 times greater than what is safe for residential soil have been measured along the James River.

Dan River coal ash spill. Photo: Appalachian Voices

According to a University of Richmond study, eight coal ash disposal sites along major rivers are significant hazards to nearby communities. Yet Virginia’s legislative oversight and regulatory programs are “bare boned and fragmented”, with utilities failing to provide adequate information about the amount, condition and stability of toxic chemicals and containment.

Nor do Virginians bear this burden equally. Seventy-six percent of Virginia's coal-fired plants are located in low-income communities or communities of color, including Possum Point, Spruance Genco, and the Clover Power Station. Cumulative chemical exposure in such communities increases the risk of cancer and of lung and neurological diseases. The cancer rate in rural Appalachian Virginia is 15% higher than the national average, reflecting both environmental threats and lack of access to health care.  Earlier this year, an effort to expand Medicaid was killed on a party-line vote.

And as the impact of climate change becomes more pronounced, Virginia is on the front lines. A UCS analysis of the impact of tidal flooding showed that cities like Norfolk could see four times the frequency of flooding by 2030, while they already spend $6 million a year on road improvement, drainage and raising buildings. In places like Hampton Roads, sea level has already risen by more than a foot over the last 80 years. Yet members of the Virginia House, entrenched in power, continue to deny even the existence of sea level rise. Unfortunately, even a gerrymander as durable as Virginia’s cannot stop actual rising tides.

For their own safety, and the future of the Commonwealth, Virginians must continue the fight to have their full voting rights restored. Many are already suffering, and many more will pay a heavy price for policies that are unresponsive to public needs. Political equality and the integrity of the electoral process are prerequisites to evidence-based policy making that is in the public interest.

More Electric Vehicle Infrastructure Coming to Massachusetts

UCS Blog - The Equation (text only) -

The Massachusetts Department of Public Utilities (DPU) today approved a proposed $45 million investment in electric vehicle charging infrastructure.

The investments in electric vehicle infrastructure come as part of a complicated rate case that involves a number of important issues related to rate design, energy efficiency and solar energy. But at least on the electric vehicle part, the utilities and the DPU got it right.

Why do we need more investments in electric vehicle infrastructure?

Electric vehicles are a critical part of Massachusetts’ climate and transportation future. Under Massachusetts’ signature climate law, the Global Warming Solutions Act, the state is legally required to reduce our emissions of global warming pollution by 80 percent by 2050.

Transportation is the largest source of pollution in Massachusetts, and it’s the one area of our economy where emissions have actually grown since 1990. Achieving our climate limits will require the near-complete transition of our vehicle fleet to electric vehicles or other zero-emission vehicle technologies.

The good news is that electric vehicles are here, they are fun to drive and cheap to charge, and when plugged in to the relatively clean New England grid, they get the emissions equivalent of a 100 mpg conventional vehicle. EV drivers in the Boston area can save over $500 per year in reduced fuel costs. Electric vehicle technology has advanced to the point where mainstream automakers and countries like China and France are now openly talking about the end of the internal combustion engine.
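As a rough illustration of where a fuel-savings figure like that comes from, the sketch below compares annual fueling costs under assumed prices and efficiencies. The inputs are placeholders, not the numbers behind the $500 estimate cited above.

```python
# Rough sketch of annual EV fuel-cost savings. All inputs (miles driven, gas
# price, electricity price, efficiencies) are assumptions for illustration only.

miles_per_year = 12_000
gas_price = 2.70            # $/gallon (assumed)
gas_mpg = 26                # conventional vehicle efficiency (assumed)
electricity_price = 0.18    # $/kWh (assumed)
ev_kwh_per_mile = 0.30      # EV efficiency (assumed)

gas_cost = miles_per_year / gas_mpg * gas_price
ev_cost = miles_per_year * ev_kwh_per_mile * electricity_price
print(f"Gasoline: ${gas_cost:,.0f}/yr  EV: ${ev_cost:,.0f}/yr  Savings: ${gas_cost - ev_cost:,.0f}/yr")
```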

But while the future for EVs is bright, electric vehicles are still a very small share of the overall vehicle fleet. Nationally, EVs represent less than half of one percent of new vehicle sales. In 2012, Massachusetts committed to a goal of putting 300,000 electric vehicles on the road by 2025. Five years later, we are still about 288,000 EV sales short of that goal.

What investments are coming?

One of the biggest challenges facing the growth of electric vehicles is limited infrastructure. People are not going to buy an EV if they don’t know where to plug it in. A survey of Northeast residents conducted last year found that limited access to charging infrastructure is one of the biggest obstacles to EV purchases.

We have had over a hundred years – and billions in public subsidies – to build the infrastructure of refineries, pipelines, and gas stations that service the internal combustion engine. New investments in charging infrastructure are critical to making EVs as convenient as filling up at a gas station.

Today's decision will speed the transition to electric vehicles by making investments in charging infrastructure. These investments include more funding for charging infrastructure for people who live in apartment buildings, more fast-charging infrastructure along highways, more charging infrastructure in low-income communities, and greater access to workplace charging.

Overall, the proposal anticipates the construction of 72 fast-charging stations and 3,955 "Level-2" home and workplace charging ports over the next 5 years. Of those charging ports, 10 percent will be in low-income communities, where utilities will also provide consumers with a rebate for charging stations. These investments will provide thousands of Massachusetts residents with access to EV charging stations.

The DPU did deny Eversource the right to use ratepayer funds for education and outreach. This is unfortunate, as our survey also found that most Northeast residents are not aware of the many incentives available for EV customers, both here in the Northeast and at the federal level.

What more needs to be done?

One big question is left out of today's decision: how best to manage EV charging to maximize the potential benefits to the electric grid.

The key issue is when EV charging takes place. If most people charge their EVs at night, or during times of high production of renewable electricity, then the transition to electric vehicles can make our electric system more efficient and speed the transition to renewables. This will mean significant cost savings.

On the other hand, if EV charging mostly happens during “peak” hours (such as morning and early evening), then adding more EVs onto the grid could strain existing electricity infrastructure and require additional investments in pipelines and power plants. This would both raise emissions and cost ratepayers money.

There's a simple way to address this issue: provide a financial incentive for EV drivers to charge their vehicles during periods of low demand, a policy known as time-of-use (TOU) rates. The DPU decision today punts on this issue, accepting the utility position that it will take time and additional data to determine how best to implement TOU rates. While we agree with the DPU that the most important priority is to get the charging infrastructure installed, this is an issue that we and others in the clean transportation community will be watching closely over the next few years.
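A minimal sketch of the time-of-use incentive follows, with assumed peak and off-peak prices that are not actual Massachusetts tariff values; it shows only the shape of the incentive.

```python
# Sketch of how a time-of-use (TOU) rate rewards off-peak charging. Prices and
# the charging session size are assumptions chosen purely for illustration.

peak_price = 0.30       # $/kWh, e.g. early evening (assumed)
off_peak_price = 0.12   # $/kWh, e.g. overnight (assumed)
session_kwh = 40        # energy for a typical charging session (assumed)

peak_cost = session_kwh * peak_price
off_peak_cost = session_kwh * off_peak_price
print(f"Charging at peak: ${peak_cost:.2f}; overnight: ${off_peak_cost:.2f} "
      f"(saves ${peak_cost - off_peak_cost:.2f} per session)")
```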

Photo: Steve Fecht/General Motors

Great Lakes’ Great Changes: Temperatures Soar as the Climate Changes

UCS Blog - The Equation (text only) -

Grand Haven pier extends into Lake Michigan, where average summer surface temperatures have risen markedly over recent decades. Photo: Rachel Kramer/Flickr

Lake Michigan is not yet a hot tub, but the warming of this Great Lake gives you much to sweat about.

In his office at the University of Wisconsin Milwaukee, Paul Roebber, a Distinguished Professor in atmospheric sciences and a former editor of the journal Weather and Forecasting, showed me his most recent climate change lecture slides. The most arresting graphics compare current surface water temperatures of the Great Lakes with those three and a half decades ago. The average summer surface temperatures have risen 8 degrees Fahrenheit since 1980.

Particularly stark was Roebber's point about a monitoring buoy floating far out in the middle of 100-mile-wide Lake Michigan, at a latitude between Milwaukee and Chicago. Two decades ago, average mid-July to September surface water temperatures in southern Lake Michigan ranged between 61 and 71 degrees. In 2016, they ranged between 67 and 77 degrees. On three separate days in 2016, temperatures hit 80. Surface water temperature changes near Milwaukee and Chicago were just as remarkable. On August 1, 1992, surface water temperatures were 61 and 65 degrees, respectively. On August 1, 2010, both were in the mid-70s.

“We’re starting to talk bath tub water and that is saying something about the changes,” Roebber said.

The future is almost unthinkable

Roebber's comments certainly say something to me as a native of Milwaukee. I have vivid memories of childhood winters a half-century ago. We first- and second-graders were so acclimated to consecutive subzero days that when the high was 5 above, we'd walk to school with our coats unzipped and flying open.


Today, scientists predict a future climate unthinkable for a region where Green Bay Packers fans romanticize their home-team advantage in a stadium nicknamed the Frozen Tundra.

Roebber said that the modern lake warming has occurred with a rise of only a single degree in the air temperature over the Great Lakes over the last 30 years. But air temperatures are about to soar in scenarios where little or nothing is done to fight climate change. Researchers all around the Great Lakes and analysts at the Union of Concerned Scientists predict that the average summer highs of Milwaukee, currently about 80 degrees, could rise as high as 92 over this century.

The UCS analysis predicted that by 2100, Milwaukee would have nearly two months’ worth of days 90 degrees or higher, including three weeks’ worth of 100-degree scorchers. There would be at least one heat wave a summer with the sustained oppressive temperatures that killed hundreds of people in Chicago in 1995. Overall air quality would deteriorate as well, exacerbating asthma and other respiratory conditions.

In fact, the Upper Midwest region—including Milwaukee, Chicago, and Minneapolis—could collectively experience regular deadly heat waves with temperatures on the same scale that killed an estimated 70,000 people across Europe in 2003. “Under the higher-emissions scenario a heat wave of this magnitude would occur at least every fifth year by mid-century and every other year toward the end of the century,” the UCS analysis concluded.

 Under worst-case scenarios, northern Illinois will have the climate of Dallas and southern Illinois will have the temperatures of Houston by the end of this century. As for Illinois’ neighbor to the north, Roebber notes, “Our climate in Wisconsin will look like Arkansas.”

Change is underway in the world’s largest surface freshwater system

It’s scary to contemplate what Lake Michigan could be compared to a century from now. The five Great Lakes comprise the world’s largest surface freshwater system, in a basin serving 30 million people. While many long-range projections of climate change along America’s eastern seaboard focus on chronic inundation from rising ocean levels, the lakes offer a different set of perplexing dilemmas.

Perhaps most perplexing is the year-to-year unpredictability of conditions. The general scenario of recent decades has been less ice cover in winter, which has allowed more water to evaporate and resulted in unprecedented low lake levels. But there can also be years where that trend is punctuated by ice-choked Great Lakes as the warming Arctic ironically creates a wavier jet stream.

The overall long-term trends, according to the University of Wisconsin Sea Grant Institute, point to all the bodies of water in the state being at risk.

“Longer, hotter, drier summers and increasing evaporation will result in warmer and shallower rivers, shrinking wetlands, and dried-up streams, flowages and wild rice beds,” the institute says. “Algal blooms will create anoxic conditions for aquatic life in ponds and many lakes.”

“These conditions will reduce the amount of suitable habitat available for trout and other cold-water fishes, amphibians and waterfowl. A two-degree rise in temperature could wipe out half of Wisconsin’s 2,700 trout streams. Hot dry conditions, coupled with more frequent thunderstorms and lightning, will increase the chance of forest fires. Red pine, aspen and spruce trees will disappear from our northern forests.”

A joint report by the University of Wisconsin and the state's Department of Natural Resources predicts more climate-change losers than winners among fauna. As populations of European starlings, Canada geese, and gray squirrels grow, those of purple martins, black terns, American martens, common loons, and various species of salamanders, frogs, and prairie birds may decline or disappear.

“This will result in a net loss to the state’s biodiversity and a simplification of our ecological communities,” the report said.

As for commercial activities, Roebber said there may be more ice-free days to allow more winter shipping, but fluctuating lake levels may play havoc with lakeshore-dependent businesses during the rest of the year, from expensive marina dredging operations to beach erosion in resort communities. Water quality may be degraded if low lake levels expose harmful chemicals. An additional wild card is the prospect of Wisconsin facing more weather extremes with heavy rains and floods dancing with more frequent short-term droughts.

“It’s not clear how much lower the lake will go, but the levels will become more variable,” Roebber said.

Sitting on our hands

This month, 13 federal agencies released the government’s latest major assessment that human activities are “the dominant cause” of the warmest period “in the history of modern civilization.” That report predicts a 9.5-degree rise in average temperatures in the Midwest under continued high-emission scenarios, the greatest rise of any region in the contiguous United States.

But it is not clear how much researchers will be able to refine their predictions. The Trump administration, despite approving the release of the congressionally mandated report, is in the midst of an unprecedented attack on climate change research. Climate change experts in the Interior Department have been reassigned. The Environmental Protection Agency has banned some scientists from speaking at climate change conferences. The Trump administration has proposed hundreds of millions of dollars of cuts to NASA and NOAA planetary and weather research that relates to climate change.

The assault is also happening at the state level. Last year, Wisconsin governor Scott Walker ordered state agencies not to comply with President Obama's Clean Power Plan, and his DNR removed references from its website saying human activities are the root cause of climate change. Despite its prior partnering with university researchers, the DNR currently says, "The earth is going through a change. The reasons for this change at this particular time in the earth's long history are being debated and researched by academic entities outside the Wisconsin Department of Natural Resources."

In this environment, exacerbated by years of prior Congressional budget cuts that constrict the chances of winning federal research grants, Roebber fears for the further erosion of the nation’s ability to protect lives and livelihoods with science.

Destructive weather events are virtually certain to increase. A report this fall by the Universal Ecological Fund calculates that weather events that currently cost the US $240 billion a year will increase to $360 billion annually over the next decade, the latter cost being equal to 55 percent of the current growth of the US economy.

“Facts used to be something we used to solve difficult things and innovate,” Roebber said. “Why the political process is now so destructive to such an important function of society and why the (political) climate has almost become antagonistic toward education is troubling. We’re sitting on our hands instead of accelerating the things we need to do.”
