Combined UCS Blogs

Safer Blood Products: One Researcher’s Story on Why Federal Support Matters

UCS Blog - The Equation (text only) -

In 1982, a crisis was beginning to unfold. Gay men were dying of an unknown cause, which years later was shown to be the Human Immunodeficiency Virus (HIV).  At that time, I was not involved with the gay community, with acquired immunodeficiency syndrome (AIDS), or with HIV. But federal funding of my research on blood products helped us prevent the transmission of HIV and hepatitis to tens of thousands of Americans.

I led a small team of research scientists at the New York Blood Center (NYBC) interested in developing new therapeutic products from plasma, the fluid portion of blood. What was known in 1982 was that a plasma product called antihemophilic factor (AHF), used in the treatment of hemophilia, occasionally transmitted hepatitis B virus as well as another virus eventually to be known as hepatitis C. The risk of hepatitis C in this patient group was accepted because the infection was believed to be mild and the benefit of treating the patient with the plasma product was great.

The challenge

If we were going to succeed in developing new plasma products useful to large numbers of patients, such as ones that accelerate wound healing, we had to eliminate viral risk. The only way of doing this with certainty was the use of viral killing methods. The challenge was to find methods that would kill large quantities of virus without damaging the therapeutic protein.

Finding a solution

Supported by the virology laboratories and others at NYBC and based on preliminary studies demonstrating virus kill, in 1983 we received an award from the National Institutes of Health (NIH) totaling just over $750,000 for the “Detection and inactivation of non-A, non-B hepatitis agents in blood”. This award enabled us to greatly accelerate our work which, by that time, included exploring the use of organic solvents and detergents such as had been used in the preparation of viral vaccines. The idea was to disrupt viral structures by stripping away essential fatty acids with the hope that the proteins of interest would be unaffected. Our hopes were fully realized.

We showed that the method we developed, commonly referred to as solvent/detergent or SD treatment, completely inactivated hepatitis B and C viruses in a chimpanzee model, and, in collaboration with Dr. Gallo at the NIH, we showed that HIV was rapidly and completely inactivated. As importantly, the valuable proteins such as AHF appeared to be unaffected.

Based on these results, the Food and Drug Administration (FDA) licensed the NYBC’s plasma product for the treatment of hemophilia in 1985. More complete clinical studies run cooperatively by NYBC and the FDA showed that the AHF protein was undamaged and HIV and hepatitis viruses were not transmitted.

Impact

For the next fifteen years, over 60 organizations worldwide adopted SD technology and applied it to a wide variety of products including AHF, intravenous immune globulin used in the treatment of immune deficiency disorders, and even monoclonal antibodies and other recombinant technology derived proteins. Hundreds of millions of doses of SD-treated products have been infused in people; countless transmissions of hepatitis B, hepatitis C, and HIV were prevented; and the lives of tens of thousands of patients were saved or improved.

The importance of federal support

Success stories like these are not guaranteed. Without federal support, I am reasonably certain that our findings would have made for a nice publication or two and little else. Additional federal grant support that I received resulted in improving the consistency and viral safety of transfusion plasma, now available broadly, and spawned efforts leading to red cells and platelet products with enhanced viral and bacterial safety.

I am forever grateful for the grant support that I received, and the granting agencies and the nation should take pride in the initiatives they foster. My story, or really our story, demonstrates the impact of federal funding and the degree to which the scientific enterprise is a collaborative effort, bringing together many diligent minds from research institutes, private organizations, and multiple federal agencies. We should all hope that this continues unabated. Our population deserves it.

Dr. Bernard Horowitz is recognized internationally for his research on blood viral safety and the preparation and characterization of new therapeutics from blood. He has served on several company scientific advisory boards and as a director of Omrix Therapeutics, Biogentis, Inc., Dermacor, Inc., Protein Therapeutics, and V.I. Technologies, a company he co-founded. At the New York Blood Center, Dr. Horowitz was its Vice President for Commercial Development and a Laboratory Head in its Lindsley F. Kimball Research Institute. He has served as a scientific consultant to the National Institutes of Health, the Food and Drug Administration, the National Hemophilia Foundation, the International Association of Biological Standardization, and the World Health Organization. Dr. Horowitz is the recipient of several prestigious awards, including the Robert W. Reilly Leadership Award from the Plasma Protein Therapeutics Association, the Morton Grove Rasmussen Prize from the American Association of Blood Banks, and the 11th International Prix Henri Chaigneau from l’Association française des hémophiles. Dr. Horowitz received his B.S. in biology from the University of Chicago and his Ph.D. in biochemistry from Cornell University Medical College.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

An Innovative Way to Encourage Disaster Preparedness: FEMA’s Public Assistance Deductible

UCS Blog - The Equation (text only) -

The Federal Emergency Management Agency (FEMA) recently outlined a new framework for encouraging states to invest in disaster resilience and thus limit the growing costs of disasters.

Today is the comment deadline for the ‘Public Assistance deductible,’ a concept that can help protect communities and ensure federal taxpayer dollars are spent wisely. The Union of Concerned Scientists is filing comments supportive of this idea, with some important recommendations for improvements.

What is the Public Assistance program?

FEMA’s Public Assistance (PA) program provides funding for local, state, and tribal governments to help communities recover from major disasters. The PA program provides funding for debris removal; life-saving emergency protective measures; and the repair, replacement, or restoration of disaster-damaged publicly owned facilities and the facilities of certain private non-profit organizations.

Federal taxpayers pay for at least 75 percent of the eligible costs, and state and local entities pay the rest. According to FEMA, on average, the Public Assistance program has provided approximately $4.6 billion in grants each year over the past decade.

Ideally, program funds would be invested in ways that ensure communities build back in a stronger, more resilient way. But in practice this is not always the case.

After disasters, there is often a strong impetus to build back the same way, right where things were before—which may not be the best long-term choice. For example, in many places sea level rise is worsening the risk of coastal flooding. Storm surges, riding on higher sea levels, will be able to reach farther and higher inland, causing greater damage. Post-disaster recovery efforts need to take into account scientific data and projections of sea level rise, not perpetuate risky rebuilding in harm’s way.

An innovative model: The Public Assistance Deductible

Under FEMA’s proposed design for this deductible, states would first have to meet a minimum threshold of expenditures on post-disaster recovery before FEMA would provide federal assistance through the PA program. The state-specific deductibles would be calculated using a formula that starts with a common annual national base deductible, equivalent to the median amount of Public Assistance received annually across all 50 states over the past 17 years. This would then be adjusted to take into account a state’s fiscal capacity and its disaster risk relative to other states.

The truly innovative part of the proposal would be that FEMA would allow states to buy down their deductible through credits earned for state-wide measures that would help build resilience and lower the costs of future disasters.

FEMA’s framework would include $3 in deductible credit for every $1 in state spending on qualifying mitigation activities. These could include protective building codes, measures to safeguard or enhance wetlands and other nature-based flood protections, flood risk management standards (that require building at least 2-3 feet above base flood elevation), and state investments in mapping, tools, and enhanced hazard mitigation plans to help identify and reduce disaster risks.
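To make the crediting mechanism concrete, here is a minimal sketch of the buy-down arithmetic. The 3-to-1 credit ratio comes from FEMA’s framework; the dollar amounts and the function itself are purely illustrative and not part of the proposal.

```python
# Illustrative sketch of FEMA's proposed deductible buy-down. Only the 3:1
# credit ratio is drawn from the framework; all dollar figures are hypothetical.

CREDIT_RATIO = 3  # $3 of deductible credit per $1 of qualifying mitigation spending

def remaining_deductible(base_deductible, mitigation_spending):
    """Deductible a state would still need to meet after earning credits."""
    credit = mitigation_spending * CREDIT_RATIO
    return max(base_deductible - credit, 0.0)

# A state facing a $30M deductible that spends $6M on qualifying mitigation
# (e.g., stronger building codes, wetland protection) earns $18M in credit,
# leaving a $12M deductible to meet before PA funding begins.
print(remaining_deductible(30_000_000, 6_000_000))  # prints 12000000
```

In this sketch the deductible floors at zero: a state that spends a third of its deductible amount on qualifying mitigation would offset the deductible entirely.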

For further details on FEMA’s PA deductible framework, including the calculation of the state-specific deductibles and credits, see the Supplemental Advance Notice of Proposed Rulemaking (SANPRM).

Why UCS supports a well-designed Public Assistance Deductible

UCS supports the concept of a well-designed deductible for the PA program. We believe this could be an effective way of addressing a number of priorities for reform of federal disaster funding previously identified by the Government Accountability Office, Congress, and by the Department of Homeland Security’s Office of the Inspector General.

FEMA has been urged to find ways to reduce the federal costs of disasters. Rather than use blunt, inequitable methods that simply transfer costs from the federal government to state, local, and tribal governments, we are encouraged that FEMA is exploring ways to lower the costs of disasters through hazard mitigation measures. The PA deductible program would give states an incentive (through the crediting mechanism) to take pre-disaster protective actions that make communities more resilient to disasters.

With careful attention to the details of such a proposal, this type of concept could help advance community resilience and ensure that taxpayer dollars are wisely invested.

Goals of a Public Assistance Deductible

Studies show that investments in pre-disaster risk reduction measures pay back at least 4 to 1, and estimates of the benefits of investments in flood mitigation are higher, at 5 to 1. Studies from the Wharton School at the University of Pennsylvania and Swiss Re indicate that higher design standards have a payback far greater than 4 to 1. Yet our nation continues to under-invest in sensible pre-disaster hazard mitigation measures, even as more lives are lost and the costs of weather and climate disasters grow.

In our comments, we have urged FEMA to ensure that the PA deductible program:

  • Provides a strong incentive for states to invest more in pre-disaster preparedness measures that would help limit harm to people, property, and natural functions of ecosystems over the long-term and ensure that taxpayer dollars are spent wisely. We’ve recommended that FEMA expand the list of measures that qualify for credits to include state-wide protective freeboard regulations that go above the minimum federal requirements, investments in hazard mitigation (with extra credit for nature-based, green measures such as wetlands and living shorelines and less credit for hardening measures such as levees), enhanced design standards, building codes, protective zoning standards, statewide flood mapping (e.g. North Carolina), and strategic plans and incentives for managed retreat from high-risk areas.
  • Is administered in an equitable way to help protect the most vulnerable communities, especially low-income and minority communities. We have recommended that FEMA provide extra credits for investments made to promote resilience in these disadvantaged communities.
  • Takes into account the best available science on growing climate risks including: flooding worsened by sea level rise and more frequent and heavy downpours; droughts; wildfires; and other types of impacts related to climate change. We strongly recommend that the PA deductible is designed in a way that appropriately incorporates climate projections and future risks. The hazard model (Hazus) that FEMA is proposing to use to assess state risks is not currently configured to do this, and must be appropriately updated if it is used.
  • Fairly takes into account the specific circumstances of states, so that they are able to access the much-needed aid and recovery resources that are their due in the wake of major disasters, in line with the provisions of the Robert T. Stafford Disaster Relief and Emergency Assistance Act.

A science-based, equitable approach to building resilience to disasters will go a long way toward protecting people, property and the functions of our natural ecosystems, while ensuring the best use of taxpayer dollars. UCS has developed a set of principles that can help inform this type of approach, Toward Climate Resilience: A Framework and Principles for Science-Based Adaptation.

A step toward building climate resilience

The PA deductible concept is an important step toward making our nation more resilient to disasters. We’ll be looking for expeditious action from FEMA to take our and other public comments into account and move this forward toward a rulemaking process.

Congress and the Trump administration also need to act on multiple fronts to help communities facing the impacts of climate change and other disasters. One thing they definitely shouldn’t do is make harmful cuts to FEMA’s budget.

The Public Assistance Deductible infographic provides a snapshot overview of the basic concept and an example of how one state, in this case Indiana, could earn credits toward offsetting the deductible amount. Source: FEMA

Is No Place Safe? Climate Change Denialists Seek to Sway Science Teachers

UCS Blog - The Equation (text only) -

Co-Authors: Glenn Branch, Deputy director of NCSE, and Steven Newton, Programs and Policy Director at NCSE

A few weeks ago, science teachers across the country began to find strange packets in their school mailboxes, containing a booklet entitled “Why Scientists Disagree About Global Warming” (sic), a DVD, and a cover letter urging them to “read this remarkable book and view the video, and then use them in your classroom.”

“Not Science” stamp on top of the report cover mailed to teachers during spring 2017. The report misrepresents the fact that nearly all climate scientists agree about human-driven climate change.

The packets were sent by the Heartland Institute, which in the 1990s specialized in arguing that second-hand smoke does not cause cancer. Even though its indefensible defense of the tobacco industry failed, Heartland now uses the same pro-tobacco playbook—touting alleged “experts” to question established science—to argue that climate change is not real.

At the National Center for Science Education, we have almost three decades of experience helping teachers, parents, and students facing creationism in the classroom. A few years ago, we added climate change to our docket. So teachers know that when issues regarding evolution or climate change come up, NCSE is there to help.

This wasn’t Heartland’s first unsolicited mailing of climate change denial material to science teachers, and judging from the reactions we’ve seen, teachers haven’t been fooled by this outing. But here is how we’re advising science teachers to explain why using these materials in any science classroom would be a terrible idea.

1. Virtually every assertion is false, controversial, or at best unclear.

That’s a judgment that might seem to call for a point-by-point rebuttal. But I’m not going to offer such a rebuttal, both because every substantive point in the Heartland mailing is a long-ago-debunked canard (see Skeptical Science passim) and because there is already a place where responsible scientists discuss the evidence for climate change: the peer-reviewed scientific research literature.

If Heartland has such a good case to make, why is it spending thousands of dollars on direct-mailing a self-published report to teachers, instead of trying to convince the relevant scientific community?

2. Heartland represents what is, at best, a fringe position in science.

Of course, Heartland isn’t willing to admit its fringiness, devoting considerable effort to trying to dispute the widely reported fact that the degree of scientific consensus on anthropogenic climate change is about 97 percent. It’s a wasted effort.

Multiple independent studies, using different sources, methods, and questions, have arrived at the same conclusion. And the scientific consensus on climate change is not a mere reflection of popular sentiment or shared opinion among scientists. Rather, it is the product of evidence so abundant and diverse and robust as to compel agreement in the scientific community.

3. Heartland even disparages the well-respected, Nobel Prize-winning IPCC.

Not content to reject the extraordinary scientific consensus on climate change, the booklet downplays the process by which climate scientists regularly evaluate and report on the state of the evidence, the Intergovernmental Panel on Climate Change or IPCC.

Few areas of science undergo the kind of rigorous and comprehensive review that the climate science community carries out every five years. It is a reflection of the seriousness with which world leaders take the challenge of climate change that they support this process and accept the conclusions arrived at by hundreds of generous, dedicated, and meticulous scientists.

4. Heartland’s material contradicts standards, textbooks, and curricula.

K–12 teachers are expected to teach in accordance with state science standards, state- or district-approved textbooks, and district-approved curricula, all of which undergo review by competent scientists and teachers, and thus generally attempt to present climate change in accordance with the scientific consensus. Heartland’s materials have not undergone such a review. And teachers who misguidedly use them in the classroom will be, at best, presenting mixed messages, running the risk of confusing their students about the scientific standing of anthropogenic climate change.

5. Heartland’s citations are shoddy and its tactics dishonest.

Many of the references in “Why Scientists Disagree About Global Warming” (sic) are to Heartland’s own publications, posts on personal blogs, fake news sources, and low-quality journals—the sort of citations that a teacher wouldn’t accept on a science assignment.

The booklet itself is credited to the Nongovernmental International Panel on Climate Change, NIPCC—likely to be confused with the legitimate IPCC. And the envelope in which the mailing was sent reproduced a New York Times headline about “Climate Change Lies”—the same sort of lies, it turns out, that Heartland is concerned to promote.

In the end, the climate change deniers at the Heartland Institute have no scientifically credible evidence of their own, leaving them with no option but to lash out at the real scientific literature, contributing nothing except vitriol, achieving nothing except confusion. Science teachers know better—and science students deserve better.

Two for One: A Very Bad Deal for Our Nation

UCS Blog - The Equation (text only) -

Imagine you are in the market for a new car. You are excited to buy one with a new technology that will warn you of an imminent crash so you have enough time to hit the brakes to save your son’s or daughter’s life and your own. The car salesman tells you he’s got just the car for you, and it comes with his new two-for-one deal. To get that one new feature, you have to give up two others, brakes and seat belts.

You’d never take that deal, but it is exactly the kind of situation the President has created for the National Highway Traffic Safety Administration (NHTSA) and every other agency responsible for protecting Americans’ health and safety.

This “two-for-one” executive order, signed January 30, 2017, requires every agency to get rid of at least two regulations for every new one they seek to put in place to make Americans’ lives better. Making matters worse, the savings from the health, safety, and other regulations that are eliminated must at least offset the industry investment required to meet the new regulation–regardless of the benefits of the new or older regulations!

So, take my not-so-hypothetical example above. When I was NHTSA’s Acting Administrator, we put out an advanced notice of proposed rulemaking that would require new cars to come equipped with radios that would allow them to “talk” to one another, sharing basic safety information that would allow a car to warn the driver of another equipped vehicle on a collision course. This vehicle-to-vehicle (V2V) communication system is estimated to prevent 425,000–524,500 crashes per year when fully implemented. Saving lives and avoiding injuries would deliver savings of $53 to $71 billion, dwarfing the investments automakers would have to make to equip vehicles with the new technology and delivering positive net benefits within 3–5 years.

But under the “two-for-one” executive order, those benefits just don’t matter, the lives saved and injuries avoided just don’t matter. Instead, other regulations, like those requiring seat belts and brakes, would need to be repealed to offset the investment costs… again, ignoring the lives lost and harmed along the way. And if those two don’t cut the costs to industry enough, even more would need to be eliminated, putting even more lives at risk.

When you consider that in 2015 alone, 35,092 people lost their lives and 2.44 million people were injured in traffic crashes in the United States, it is clear that the “two-for-one” executive order is a very bad deal for our nation.

Making matters worse, this same raw deal applies to fuel economy standards that NHTSA is set to finalize for 2022-2025 to help nearly double fuel economy compared to where we were at the beginning of the decade. So, will NHTSA have to repeal safety standards to make more room to cut the high cost of our oil use? I expect they would never make that trade. I expect it would be the same for the Department of Energy (DOE), where I had the opportunity to help establish efficiency standards for household and commercial appliances. I don’t think the DOE would repeal appliance efficiency standards that are estimated to save consumers more than $2 trillion by 2030 if they had to both offset the industry investment costs of new ones and ignore the benefits of them all.

The “two-for-one” executive order is good for only one thing: grinding to a halt federal efforts to save lives, protect our health, and help us spend less money fueling our cars and heating and cooling our homes.

Appendix: Background on Regulation at NHTSA

Managing Nuclear Worker Fatigue

UCS Blog - All Things Nuclear (text only) -

The Nuclear Regulatory Commission (NRC) issued a policy statement on February 18, 1982, seeking to protect nuclear plant personnel against impairment by fatigue from working too many hours. The NRC backed up this policy statement by issuing Generic Letter 82-12, “Nuclear Power Plant Staff Working Hours,” on June 15, 1982. The Generic Letter outlined guidelines such as limiting individuals to 16-hour shifts and providing for a break of at least 8 hours between shifts. But policy statements and guidelines are not enforceable regulatory requirements.
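As a concrete illustration of the Generic Letter 82-12 guidelines (shifts of at most 16 hours, with at least an 8-hour break between shifts), here is a minimal sketch. The function and the schedule format are hypothetical, written only to show the arithmetic behind the guidelines, not anything the NRC prescribes.

```python
# Illustrative check of the Generic Letter 82-12 working-hour guidelines.
# The function and schedule format are hypothetical, for illustration only.
from datetime import datetime

MAX_SHIFT_HOURS = 16   # longest permitted single shift under the guidelines
MIN_BREAK_HOURS = 8    # shortest permitted rest between consecutive shifts

def violates_guidelines(shifts):
    """shifts: list of (start, end) datetime pairs, ordered by start time."""
    for i, (start, end) in enumerate(shifts):
        if (end - start).total_seconds() / 3600 > MAX_SHIFT_HOURS:
            return True  # shift too long
        if i > 0:
            prev_end = shifts[i - 1][1]
            if (start - prev_end).total_seconds() / 3600 < MIN_BREAK_HOURS:
                return True  # insufficient rest between shifts
    return False

# A 12-hour shift ending at 6 p.m., followed by a shift starting at midnight,
# leaves only 6 hours of rest and so violates the break guideline.
shifts = [(datetime(2015, 6, 1, 6), datetime(2015, 6, 1, 18)),
          (datetime(2015, 6, 2, 0), datetime(2015, 6, 2, 8))]
print(violates_guidelines(shifts))  # prints True
```

A waiver, in the terms of the revised regulations discussed below, corresponds to knowingly scheduling hours that a check like this would flag.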

Fig. 1 (Source: GDJ’s Clipart)

UCS issued a report titled “Overtime and Staffing Problems in the Commercial Nuclear Power Industry” in March 1999 describing how the NRC’s regulations failed to adequately protect against human impairment caused by fatigue. Our report revealed that workers at one nuclear plant in the Midwest logged more than 50,000 overtime hours in one year.

Barry Quigley, then a worker at a nuclear plant in the Midwest, submitted a petition for rulemaking to the NRC on September 28, 1999. The NRC had issued regulations in the 1980s intended to protect against human impairment caused by drugs and alcohol: nuclear plant workers were subject to initial, random, follow-up, and for-cause drug and alcohol testing. Quigley’s petition sought to extend these fitness-for-duty requirements to include limits on working hours. The NRC revised its regulations on March 31, 2008, to require that owners implement fatigue management measures. The revised regulations permit individuals to exceed the working hour limits, but only under certain conditions. Owners are required to submit annual reports to the NRC on the number of working hour limit waivers granted.

The NRC’s Office of Nuclear Regulatory Research recently analyzed the first five years of the working hour limits regulation. The analysis reported that in 2000, the year the NRC initiated the rulemaking process, more than 7,500 waivers of the working hour limits suggested by Generic Letter 82-12 were being granted at some individual plants, and about one-third of the plants granted over 1,000 waivers annually. In 2010, the first year the revised regulations were in effect, a total of 3,800 waivers were granted for the entire fleet of operating reactors. By 2015, the number of waivers for all nuclear plants had dropped to 338. The Grand Gulf nuclear plant near Port Gibson, Mississippi topped the 2015 list with 69 waivers, but 54 (78%) of those waivers were associated with a force-on-force security exercise.

The analysis indicates that owners have learned how to manage worker shifts within the NRC’s revised regulations. Zero waivers are unattainable due to unforeseen events like workers calling in sick and tasks unexpectedly taking longer to complete. The analysis suggests that the revised regulations enable owners to handle such unforeseen needs without the associated controls and reporting being an undue burden.

The regulatory requirements adopted by the NRC to protect against sleepy nuclear plant workers should let people living near nuclear plants sleep a little better.

Will Scott Gottlieb Comply with Industry Plea to Stall Added Sugar Label?

UCS Blog - The Equation (text only) -

President Trump’s nominee to head the U.S. Food & Drug Administration (FDA), Scott Gottlieb, faced the Senate in his nomination hearing on Wednesday, during which he implied that delayed implementation of the science-based nutrition facts label revision would be possible if he is confirmed.

Yes, you read that right. The future chief of an agency dedicated to protecting public health is already hinting at his willingness to do industry’s bidding and push back enforcement of a rule based in solid science that would help us make informed food purchasing decisions to improve our health. But his alignment with industry talking points is not completely shocking. Mr. Gottlieb has a long list of ties to industry, including an extensive financial and professional relationship with several pharmaceutical companies that manufacture and sell opioids.

During the hearing, Senator Pat Roberts told Gottlieb that the deadline of summer 2018 was not enough time for industry to make the required label changes, including the new added sugar line, especially considering that companies will have to include biotechnology disclosures on labels soon as well. To the question of whether he would “work to ensure proper guidance is available and consider postponing the deadline for the Nutrition Facts Panel to help reduce regulatory burdens?” Gottlieb didn’t explicitly say he would postpone the deadline but might as well have:

“This is something that I do care about and I look forward to working on if I am confirmed,” Gottlieb said. He continued to explain that he is, “philosophically in favor of trying to make sure we do these things efficiently, not only because it imposes undue costs on manufacturers to constantly be updating their labels, but we also have to keep in mind it creates confusion for consumers if the labels are constantly changing…you want to try to consolidate the label changes when you are making [them] as a matter of public health so that the information is conveyed accurately and efficiently to the consumers.”

Why delay?

The delay tactic is often used by industry as a fallback plan once it has failed to stop a science-based policy that might impact its bottom line. This is old hat for the food industry. Back in December, I wrote about how the Food & Beverage Issue Alliance (a group made up of the biggest food and beverage trade associations, like the American Beverage Association and the Grocery Manufacturers Association) had written a letter to the acting HHS and USDA secretaries asking to delay the implementation of the nutrition facts rule to coordinate with the U.S. Department of Agriculture’s biotechnology disclosure rule. Some of the same players doubled down on a similar letter in March, asking HHS Secretary Tom Price to delay the rule until May 2021 for the same reason. Scott Gottlieb’s remarks at his hearing closely resemble the sentiments contained in both of those letters.

Sound familiar? Time and time again we’ve seen science-based proposed rules never make it to the final stages, or get finalized only to have their implementation delayed soon after. Just last week, EPA administrator Scott Pruitt issued a proposed rule that would delay implementation of the Risk Management Plan (RMP) amendments by 20 months, until February 19, 2019. This move came after several petitions from the American Chemistry Council and a handful of other chemical manufacturing corporations, oil and gas companies, and trade organizations asked the agency to reconsider the rule.

And remember the silica rule? Although the science had been clear for over forty years, it took the Department of Labor longer than necessary to issue a final rule tightening the standard late last year, thanks to opposition from the American Chemistry Council and the U.S. Chamber of Commerce. Just yesterday, the department announced that enforcement of the rule would be delayed because the construction industry needs more time to educate its employees about the standard.

Industry’s reaction to rules that protect our public health makes it seem like government is blindsiding them. But it’s not like any of these rules were dropped without warning or without cause. These safeguards take years to gather information for and write, during which industry is given ample opportunity to be involved in the process. FDA first began its work to revise the nutrition facts label in 2004, and the proposed rule which included the added sugar line was issued in 2014. Not exactly rapid response. The fact is that science-based policies threaten business as usual, and therefore industry will use all resources at its disposal to stop or slow progress.

Industry’s excuses are wearing thin

Once again, with clear science on the public health consequences associated with excessive added sugar consumption, we have been waiting long enough for full added sugar disclosure on labels. While we wait, we’re missing out. The estimated benefit to consumers of the revisions to the nutrition facts label is $78 billion over 20 years, not to mention the less quantifiable benefits that come with the right to know how much added sugar is in the foods we buy and eat.

The majority of companies already have until 2019 to make the new changes to their labels and larger food companies like Mars, Inc. have said they could meet the July 2018 deadline just fine.

It’s clear that industry is turning this science-based decision into a political one, at the expense of Americans who will remain in the dark about how much added sugar is in their food for even longer. As National Public Health Week draws to a close, I can’t help but think about the urgent need for progress now, not in four years, if we’re to improve the health of this country, let alone become the healthiest nation by 2030. If Mr. Gottlieb is confirmed as FDA commissioner, he must remember that he is beholden to our public health, not the pharmaceutical or food industry’s bottom line.

Here’s What the EPA Budget Cuts in a Leaked Memo Mean for Health and Environmental Justice

UCS Blog - The Equation (text only) -

Recent news reports point to a leaked memo that provides more details about the Trump administration’s proposed deep cuts to the Environmental Protection Agency’s (EPA’s) budget. If the details are officially confirmed, it would clearly show that the administration is preparing to undermine health protections nationwide, and especially in low income and minority communities. The administration is also seeking to undercut the role of sound science at the agency.

Congress should refuse to allow these harmful cuts to go forward.

How the budget cuts hurt the EPA’s work

Here’s the big picture: If implemented, the deep budget and staffing cuts proposed by the Trump administration would undermine the core mission of the EPA to protect human health and the environment. There is simply no way for the agency to continue to do its job well while losing about a third of its overall budget, with even deeper cuts to many critical programs.

Here are just three of the many important aspects of the EPA’s work that are harmed by the proposed budget cuts outlined in the leaked memo:

1. Programs critical for public health, the environment and the economy of states.

The Trump administration is attempting to cut budgets and funding for programs that are critical for states. These include:

  • Cuts to grants for state, local and tribal management of air and water quality. These grants are critical for state and local authorities to monitor and enforce air and water pollution safeguards. UCS President Ken Kimmell, former Commissioner of the Massachusetts Department of Environmental Protection, recently explained how states are in no position to make up for shortfalls that arise from EPA budget and staffing cuts. This will inevitably threaten public health protections.
  • Cuts to Children’s Health Program resources. The leaked memo says “This decision reduces Children’s Health program resources by $2,391K and 14.9 FTE to prioritize core environmental work.” Wow, that’s stunning! So protecting children’s health is NOT core work for the EPA? That would be news to the American public.
  • Total elimination or cuts to many EPA regional programs, including ones focused on the Chesapeake Bay, the Gulf of Mexico, the Great Lakes, South Florida, San Francisco Bay and Puget Sound. All these programs not only help reduce pollution, they are also vital for the regional economies. The Chesapeake Bay program, for example, is a collaborative effort between Delaware, Maryland, New York, Pennsylvania, Virginia, West Virginia, the District of Columbia, the Chesapeake Bay Commission and the EPA. It focuses on reducing the pollution load in the historically beleaguered Bay, thereby supporting local economies, fishing, swimming, tourism, and drinking water sources (with benefits accruing in waterways well beyond the Bay itself).
  • Major cuts to the budget of the Office of Enforcement and Compliance Assurance, including cuts to Civil and Criminal Enforcement and Compliance Monitoring. It’s really hard to see these cuts as anything but a sellout to polluting industries. Robust enforcement is what gives teeth to our nation’s pollution laws, including the Clean Water Act and the Clean Air Act.
  • Cuts to Superfund enforcement. Superfund sites are among the most polluted sites in the country, and the EPA works to help clean up hazardous waste and monitor these sites. Take a look at this map and see whether one of the Superfund sites on the National Priorities List for cleanup is near where you live. To give a sense of the scope: Alaska has 10 Superfund sites, Tennessee has 28, Alabama has 18, California has 112, and Maine has 16. If you live in or near one of the sites that still need remediation, cuts to the EPA’s budget could directly affect you.
  • Cuts to programs that help reduce the risk of pesticides to human health and the environment. Administrator Pruitt has already set a bad precedent through his decision not to ban chlorpyrifos, a pesticide that poses a clear risk to children, farm workers, and rural drinking water users. Cuts to budgets for programs that limit pesticide risks would just continue down that misguided path.
2. Protections for environmental justice communities, especially low-income, minority and tribal communities

Because EPA’s core mission is the protection of public health, its activities are especially important for communities that bear a disproportionate burden of health impacts from pollution. Many of these environmental justice (EJ) communities are low-income, minority and tribal communities. Harms to these communities will be especially pronounced if the EPA’s overall budget is slashed.

As a quick reminder, here’s how the EPA defines environmental justice:

Environmental justice is the fair treatment and meaningful involvement of all people regardless of race, color, national origin, or income, with respect to the development, implementation, and enforcement of environmental laws, regulations, and policies. 

The agency says this goal will be achieved for all communities and people when everyone enjoys:

  • the same degree of protection from environmental and health hazards, and
  • equal access to the decision-making process to have a healthy environment in which to live, learn, and work.

It’s hard to see who would be opposed to these fundamentally fair and commonsense goals, but such opposition is entirely in keeping with an administration that has shown itself to be hostile to concerns about racial justice across the board.

In addition to overarching budget cuts that will disproportionately hurt EJ communities, the administration is also proposing to cut specific EPA programs targeted at disadvantaged communities. That’s gratuitously cruel, especially given the small budgets associated with these programs.

Here’s a list of some of the most egregious cuts to EJ priorities: elimination of the EPA’s Office of Enforcement and Compliance Assurance’s Environmental Justice program (and its small grants program); cuts to budgets for compliance with Title VI of the Civil Rights Act, elimination of the lead risk reduction program and state grants for lead monitoring and enforcement; and cuts to the Brownfields program that helps remediate contaminated sites and revitalize communities.

Consider the cuts in funding for lead risk reduction programs. States and local jurisdictions simply do not have the funding or the expertise to make up for cuts in federal funding for these vital programs. According to the CDC, which maintains the latest county-level data for lead levels:

Today at least 4 million households have children living in them that are being exposed to high levels of lead. There are approximately half a million U.S. children ages 1-5 with blood lead levels above 5 micrograms per deciliter (µg/dL), the reference level at which CDC recommends public health actions be initiated.

Lead exposure has serious consequences for the health of children, and can result in behavior and learning problems, lower IQ and hyperactivity, slowed growth, hearing problems, and anemia. What’s more, according to the CDC, African American children are three times more likely than white children to have elevated blood-lead levels, amounting to a public health crisis in some places.

Or consider the work the EPA is doing to help address air quality concerns in tribal communities in Alaska. Pollution from diesel emissions, indoor air quality concerns, and emissions from burning solid waste and from wood-burning stoves are among the serious challenges these communities face.

Just last year the EPA provided grants totaling over $500,000 through the Brownfields program to Chattanooga and Knoxville, TN. These grants will help disadvantaged communities clean up and revitalize contaminated sites, which in turn will boost the local economy and improve public health. There are many Brownfields success stories around the country.

The recent resignation of Mustafa Ali, a key leader of the EPA’s environmental justice program, is a sad commentary on where this work is likely to be headed under Administrator Scott Pruitt. In his resignation letter addressed to Administrator Pruitt, Ali said:

“When I hear we are considering making cuts to grant programs like the EJ small grants or Collaborative Problem Solving programs, which have assisted over 1,400 communities, I wonder if our new leadership has had the opportunity to converse with those who need our help the most.”

3. Scientific research and data, most prominently climate science

Many aspects of the EPA’s scientific work are under attack, including all of its work related to climate change. Perhaps this is only to be expected under an administration that is peddling a new form of climate denial, but that doesn’t diminish how outrageous these actions are.

(In case you missed it, watch EPA Administrator Scott Pruitt’s widely-panned appearance on Fox News where he continued his dissembling on the “CO2 issue.” The relevant excerpt starts at the 5:08 mark.)

The Trump administration is aiming to eliminate the Office of Air and Radiation’s Climate Protection Program. This program works with state, local and tribal entities to provide expertise on climate solutions including energy efficiency, renewable energy and adaptation to climate impacts. At a time when the serious consequences of climate change are so clear, this type of help is sorely needed.

But that’s not all: Trump’s budget proposes to cut funding for the EPA’s Science Advisory Board (SAB), a source of independent peer review for the agency’s scientific and technical information and scientific advice for the EPA Administrator. Congress directed the EPA to set up the SAB in 1978 and it has served a very important role through multiple administrations to help ensure science-based policymaking. The leaked memo literally says that cuts to the funding and staffing for the SAB “reflect an anticipated lower number of peer reviews.” I suppose that means this administration has arbitrarily decided to deprioritize independent science and scientific oversight, a losing proposition for the American public.

In addition, the EPA’s Environmental Education and Regional Science and Technology programs are targeted for elimination. The RS&T program works together with a network of regional laboratories around the country to bring good science to bear on environmental protection measures.

My colleague Dave Cooke highlights other important harms related to potential loss of funding for the EPA Vehicle Lab. And Karen Perry Stillerman has written about the impacts of loss of funding for EPA’s work on clean water.

Congress must resist harmful cuts to the EPA budget 

Some of the broader details of the leaked memo accord with the budget blueprint released by the administration last month, which would indicate that these are likely to be real threats. Senators and Representatives should consider the destructive impacts on their constituents in their home states and speak out against the decimation of the EPA’s budget and staffing.

It’s especially important to elevate the concerns of communities that have historically been sidelined and face a disproportionate burden of pollution. Let’s not have another Flint water crisis, or Elk River chemical spill, or Kingston coal ash spill.

Mustafa Ali’s resignation letter, addressed to Administrator Pruitt, also says:

“I strongly encourage you and your team to continue promoting agency efforts to validate these communities’ concerns, and value their lives.”

Ultimately, that’s what this is about: Not just budget and staffing numbers at the EPA, but the impact on the lives and well-being of people around the country. Congress, which has the final say on the federal budget, must strenuously resist these cuts to the EPA’s budget.

Photo: EPA

Americans Are Worried about Water Pollution (And They Should Be)

UCS Blog - The Equation (text only) -

Apparently the Trump administration hasn’t heard about the latest Gallup poll, which puts Americans’ concerns about water pollution and drinking water at their highest levels since 2001. Why do I say this? Because in addition to rolling back a key Obama-era clean water rule, a leaked EPA memo reveals that the administration intends to slash or eliminate funding for a slew of water programs and initiatives. And while recent and ongoing crises like the one in Flint have highlighted urban drinking water problems, it is also true that rural communities—whose voters helped put President Trump in office—have plenty to worry about.

Gallup’s annual Environment Poll found that 63 percent of Americans worried “a great deal” about pollution of drinking water, and 57 percent have a similar level of concern about pollution of rivers, lakes and reservoirs. Such levels of concern about drinking water were highest among non-white and low-income groups, but were reported by majorities of respondents across racial and income lines.

The Trump administration is trashing clean water protections

Against this backdrop of Americans’ rising water worries, President Trump is taking actions that will actually make the nation’s waters dirtier. First he staffed his administration with Big Ag and Big Oil boosters, including his EPA chief Scott Pruitt. Then he signed an executive order to begin undoing the EPA’s Clean Water rule, over which (not coincidentally) Pruitt sued the EPA while serving as Oklahoma attorney general. To emphasize his disdain, the President called the rule, “horrible, horrible.”

But what’s really horrible is what the Trump administration did next. As the Washington Post reported last Friday, a leaked EPA memo sheds new light on the budget cuts previewed a few weeks earlier. My colleagues have documented how cuts will impact clean vehicle programs and climate research, so here I’ll focus on implications for EPA’s clean water work. Bottom line: it’s worse than you thought. The memo names at least 17 water-focused programs and sub-programs slated for total elimination, and others that would face sharply reduced funding. By my tally, the cuts to EPA Office of Water programs total more than $1 billion.

That’s deeply troubling, because when the administration yanks precious dollars from clean water programs, people and communities suffer. Whether it’s cleaning up pollution in Lake Michigan, restoring wetlands around Puget Sound, preventing farm runoff into the Chesapeake Bay, or testing drinking water in rural Maine: when that work doesn’t happen because there’s no money and no staff, people will be hurt. People’s health, people’s recreational opportunities, people’s livelihoods. And costs that could have been averted balloon instead.

Water worries are rising in farm country

It’s not just urban or industrial communities that will suffer from the Trump administration’s budget cutting. The Washington Post reported last weekend on the irony that many cuts would disproportionately hurt the rural communities that supported President Trump, because they rely heavily on federally-funded social programs. The article didn’t mention water pollution, but it’s a fact that water supplies in (and downstream from) agricultural areas bear a heavy burden of contamination from farm runoff. High levels of fertilizer-derived nitrates in drinking water, which can cause severe health problems in infants, are a particular concern. The USDA has estimated the cost of removing agricultural nitrates from public water supplies at about $1.7 billion per year, and the total cost of environmental damage from agricultural nitrogen use has been estimated at $157 billion annually. Rural communities and cities like Des Moines, Iowa, are struggling to deal with the problem. And cuts to EPA monitoring and cleanup programs in rural areas could just make it worse.

A false choice

When the Trump administration talks about gutting environmental protections, their argument seems to boil down to, “because jobs.” But that’s a false choice. And the damage industrial agriculture wreaks on the nation’s water resources is a prime example. It affects millions of Americans—rural and urban water consumers, of course, but also taxpayers responsible for pollution cleanup, and boaters, fishers, and business operators that depend on clean water. And it affects farmers, because they too need clean water and healthy soil to be able to keep farming over the long term.

Last summer, UCS documented the potential benefits to farmers, taxpayers, and businesses from an innovative farming system integrating strips of perennial native prairie plants with annual row crops. Researchers who developed the system in Iowa found that by planting prairie strips on just 10 percent of farmland, farmers could reduce nitrogen loss in rivers and streams by 85 percent, phosphorus loss by 90 percent, and sedimentation by 95 percent. And this is all while maintaining farm productivity.

UCS further estimated that the prairie strips system, if adopted across the nation’s 12-state Corn Belt, would generate more than $850 million per year in net savings to farmers and society from reductions in fertilizer use and surface water runoff. In the coming weeks, we’ll follow up with analysis of another farming system based on extended crop rotations, which also promises to keep farmers profitable while reducing pollution.

Smart farm policy can deliver clean water and rural prosperity

This is timely, because Congress is already at work on the 2018 farm bill, that massive piece of legislation that comes around every five years and shapes the nation’s food and farming system. And while the Trump administration has shown utter disregard for the environment that all Americans depend on, for scientific evidence of what works, and even for the particular needs of the farmers and rural voters who put him in office, we’re betting that more reasonable voices will prevail. We’re mounting a campaign to protect the nation’s precious water resources while simultaneously improving farmers’ yields and creating economic opportunities in rural communities. We will mobilize UCS supporters, form common cause with farmer organizations, and join with other allies to call for policies that invest in such systems. Stay tuned.

The Trump Administration and Children’s Health: An Early Progress Report

UCS Blog - The Equation (text only) -

Parents are used to getting progress reports on how their children are doing—from teachers at school and from health care providers who assess developmental milestones. Early indicators are important; they can identify problems early, trigger needed interventions, or provide welcome assurance that things are looking good.

The news has been full of what President Trump has been doing in the first 90 days of his administration. Let’s do a quick progress report on what he’s done for children’s health.

Big picture

One could start by thinking about what a repeal of the Affordable Care Act might have meant (or still might mean) for children’s health care, especially for poor children. Or what Mr. Trump’s efforts to round up and deport undocumented people have meant for their health and well-being (shattering stories here and here). Or the many ways that Mr. Trump’s “2 for 1” executive order could impact children. Or how his agenda to roll back public protections (i.e., regulations) will affect the many determinants of health, like clean air, water, and food. (Children are especially vulnerable to environmental conditions, as they take in more air, water, and food per pound than adults.)

And then, of course, there’s the administration’s efforts to sow doubt about climate science and roll back safeguards that limit harmful climate pollution. Here is what our nation’s pediatricians have to say about climate change and child health.

Closer look

These past two weeks give us another window into the administration’s stance on specific environmental threats to children’s health.

Strike one. Rejecting the conclusions of its own scientists, last week the EPA Administrator announced in a “final agency action” that the agency would not ban a pesticide (chlorpyrifos) that poses a clear risk to child health. This after years of independent study and solid scientific evidence (here, here, here) that the pesticide poses a developmental risk to children. (In 2000, the EPA phased out its use around homes, schools, day care centers and other places where children might be exposed.)

And here’s the kicker: the next time the EPA is required to re-evaluate the safety of this pesticide is 2022! That means another five years of exposure to this widely-used pesticide that poses a clear risk to the developing brains of children.

Strike two: Last week, the EPA released its proposed budget for FY18, aligning its budget with President Trump’s war on the EPA. The proposal eliminates two agency programs that help protect children from exposure to lead, a potent neurotoxin. Not trim, not cut, but eliminate!

One program trains and certifies workers involved in lead abatement in buildings that may have lead-based paint. The other is a grant program to states and tribal jurisdictions that address lead-based paint. We can likely expect a groundswell of protest from the nation’s public health community, which well understands the grave risks of lead exposure to children and the singular value of primary prevention. They also recognize the important role that the EPA has played in protecting children’s health over the past two decades.

Strike three: Stand by. I fear it won’t be long.

The irony

I’m struck that I’m blogging about two well-recognized and highly researched environmental health threats to children during National Public Health Week, and at the same time that the Children’s Environmental Health Network (CEHN) is holding its 2017 research conference in Arlington, VA. Among other things addressed by these children’s health specialists: pesticides and lead!

What to do?

At the risk of repeating myself—(OK, I am)—we need to remember that our government, including Mr. Pruitt’s EPA, works for us. Our public health and the health of our children should override private interests. We have voice. We have science behind us. We need to speak up, show up, and let our leaders know that we are watching and that attempts to roll back public protections ARE NOT ALL RIGHT.

UCS has tools and resources to help do that. Join us—and others—in this fight.

Photo: Petra Bensted/CC BY (Flickr)

Déjà vu all over again: Heartland Institute Peddling Misinformation to Teachers about Climate Change

UCS Blog - The Equation (text only) -

When I was a university faculty member, I had the thrill of sharing the latest discoveries in the classroom with students who asked probing questions. That journey of discovery is one that parents and family members delight in hearing about when students come home and share what they have found particularly intriguing.

What if the information the student shared was not based on the best available evidence? Misinformation would begin to spread more widely. And if corrected, the student might distrust the teacher, who may not have known the source material was compromised.

This scenario is not fiction.  It has happened and may still be occurring in some U.S. schools.  Anyone concerned about this can learn more with an update forthcoming from those who keep track – the National Center for Science Education (NCSE).

According to the NCSE, in October 2013 educators received a packet chock full of misinformation about climate change. The report includes an abbreviation that looks similar to that of a highly respected source for international climate assessments, the Intergovernmental Panel on Climate Change (IPCC).

It has happened again (starting in March 2017).  Many teachers found a packet in their mailbox with a report from the same group that spread the misinformation back in October 2013.  This report has a “second edition” gold highlight with a cover image of water flowing over a dam and a misleading title.


“Not Science” stamp on top of the report cover mailed to teachers during spring 2017. The report misrepresents the fact that nearly all climate scientists agree about human-driven climate change.

The report runs counter to the agreement among scientists who publish on climate change in the peer-reviewed scientific literature: more than 97% of these scientists agree that climate change is caused by human activities.

The Heartland Institute is infamous for its rejection of climate science and unsavory tactics.  According to a reported statement by the CEO of Heartland Institute, they plan to keep sending out copies to educators over the weeks ahead.

If you see any student or teacher with this report or DVD please let NCSE know about it and share what you have learned to help stop the spread.

What is EPA’s Vehicle Lab, and Why Should I Care How It’s Funded?

UCS Blog - The Equation (text only) -

More details have been released about the Trump administration’s plans to cut funding to the Environmental Protection Agency (EPA).  In particular, it is nearly zeroing out the budget for the vehicles program, calling for the National Vehicle and Fuels Emission Laboratory (“Vehicle Lab”) to be funded almost entirely by fees on industry “as quickly as possible” (i.e. as soon as never).  This could significantly undermine the enforcement of safeguards which protect American pocketbooks and public health from industry malfeasance, and it could put in jeopardy technical research that moves technology forward.

The Vehicle Lab plays a critical role in watchdogging industry

Portable emissions measurement systems (PEMS) like the one used to uncover the Volkswagen scandal were developed by EPA researchers at the Vehicle Lab.

EPA’s Vehicle Lab, located in Ann Arbor, MI (Go Blue!), is responsible for certifying manufacturer compliance with its emissions standards—before any vehicle can be sold in the United States, it must be approved by the EPA.  EPA does not test every passenger vehicle model—the lab is under-resourced for such an endeavor.  Instead, it randomly selects vehicle models (about 15-20 percent annually) to assess the accuracy of manufacturers’ test results.  It also conducts its own investigations if any anomalous data is brought to its attention, e.g., by consumer groups or other advocacy organizations.

Just in the last couple of years alone, several manufacturers from across the industry have faced fines, or worse, thanks to this oversight:

Fiat-Chrysler—Its Jeep and Ram diesel vehicles are currently being investigated for violating the Clean Air Act.  While the case is ongoing, it represents an effort by EPA to step up its real-world emissions tests to ensure that vehicles are not polluting above what is legally allowed and public health is not being harmed.

Ford—For the 2013 and 2014 model years, 6 different vehicles were required to adjust the fuel economy label information provided to consumers—for one of those (the C-MAX), this was actually the second such adjustment.  This resulted in payouts to consumers of up to $1,050.

Hyundai and Kia—The Korean manufacturers were found to have systematically overstated fuel economy results for over 1 million vehicles, largely the result of violating EPA’s prescribed test guidelines for determining vehicle road load.  This led to a $100 million fine and hundreds of millions of dollars in compensation for their customers.

Volkswagen—The reintroduction of diesels to its American fleet was found to come only as the result of a defeat device used to cheat the emissions tests.  Encompassing nearly 600,000 vehicles, the scandal revealed that in the real world these vehicles emitted up to 40 times the legal limit of nitrogen oxides, a smog-forming pollutant.  Volkswagen is estimated to spend around $20 billion over the next few years in an effort to remove these polluting vehicles from the road, mitigate the excess pollution they caused, and compensate the American people for this egregious violation.

The above issues represent a real cost to consumers, the environment, and public health, and each required rigorous laboratory and on-road testing to investigate.  If anything, these recent enforcement actions by EPA show the need for, and value of, investing in even more complementary real-world testing, not less. It seems absurd to cut in half the staff at the lab responsible for these tests.

The Lab has also been a vital tool for transparent assessment of vehicle regulation

In addition to its important role as industry watchdog, the Lab has played a key role in assessing the technological capability of the automotive industry and providing transparency to the development of fuel economy and emissions standards.

Throughout the regulatory process, the EPA has used the capabilities of the Vehicle Lab to assess the technology landscape, publishing its results and making freely available pages upon pages of detailed technical information.  This data was used not just to test the technologies of today but to actually create, develop, and benchmark a publicly accessible full vehicle simulation model to simulate the technologies of tomorrow.  This is the type of tool previously only available to manufacturers and some well-funded institutions and, until now, well out of the budget of an organization like UCS.

This wealth of information can help inform researchers like myself and others looking to promote improvements and investments in technologies to reduce fuel use, and it provides an unparalleled level of detail and transparency for assessing the validity of regulations based on this information.

In a comprehensive report, the National Research Council of the National Academies of Sciences, Engineering, and Medicine noted that “the use of full vehicle simulation modeling in combination with lumped parameter modeling has improved the Agencies’ estimation of fuel economy impacts.  Increased vehicle testing has also provided input and calibration data for these models.  Similarly the use of teardown studies has improved [NHTSA and EPA’s] estimates of costs.”

Every single item lauded by the National Academies was conducted in collaboration with the researchers at the Vehicle Lab that the Trump administration is now proposing to gut.

Cutting funding cuts corners and jobs, and puts us at risk of a rubber-stamp EPA

The current administration plan would immediately cut the number of people working at the Lab in half—that means that rather than increasing the ability for the agency to protect against the types of industry malfeasance documented above, the Lab would be stripped of its capabilities in the near-term.  This reduction in workforce would make it impossible to even maintain the bare minimum of checks and balances on the certification program, even if (big IF!) it were eventually fully funded by fees from manufacturers.

This vehicle test cell is used to measure a vehicle’s emissions in order to assess its operation under cold weather conditions. This is a necessary component to ensure that pollution levels under all driving conditions are below legal limits, and fuel usage under these conditions is part of the test procedure which determines a vehicle’s fuel economy label for consumers.

Furthermore, the fee proposal in the budget is completely inadequate to the task.  While the EPA already collects fees to reimburse the Agency, in part, for its certification activities, it is Congress which determines how the fees are appropriated—to date, Congress has not been appropriating this money to EPA, instead using these funds to offset the federal budget deficit.  There is no reason to suppose that this would change in the future, which means this proposal would effectively gut the certification process by cutting the staff responsible for the program in half.

With such a drastic staff reduction, effective immediately in 2018, the certification process will be gummed up to such a degree that it will either delay vehicle sales tremendously or become a meaningless rubber stamp, which would undoubtedly lead to even more automaker malfeasance, further eroding the American people’s trust in their auto industry.

Ensuring a technically sound watchdog is of course in the interest of the auto industry as well.  It ensures everyone is playing by the same rules and that they suffer the consequences if they don’t. While engineers at other auto companies were working hard to develop emission controls for diesel cars, VW was making millions, selling so-called “clean diesels” by the hundreds of thousands.

So I hope the Alliance of Automobile Manufacturers and the Association of Global Automakers call out this farcical budget memo for what it is—a slap in the face of good governance that can only result in adverse health and environmental impacts for the American people and end up a costly mistake for the auto industry as well.


Five Black Public Health Champions You Should Know

UCS Blog - The Equation (text only) -

In honor of National Public Health Week, we’re paying tribute to some outstanding individuals in the public health field. But first—bear with me—a little historical context.

It’s no secret that here at UCS, we love science. It can help us define complex problems, identify the best methods to solve them, and (if we’ve done a good job) provide us with metrics for measuring the progress we’ve made.

Doctor injects subject with placebo as part of the Tuskegee Syphilis Study. Photo: National Archives and Records Administration.

But it would be both irresponsible and incredibly destructive to pretend that science operates in isolation from systems of deeply rooted racism and oppression that plague scientific, political, and cultural institutions in the United States—particularly when it comes to health. Such systems have been used to justify unfathomably cruel and inhumane medical experimentation performed on black bodies in slavery, which was only replaced in the Jim Crow era by pervasive medical mistreatment that resulted in untold fatalities. Racist medical practices were tolerated, if not explicitly condoned, by professional organizations such as the American Medical Association through the late 1960s. The government-funded Tuskegee Syphilis Study, which effectively denied syphilis treatment to nearly 400 black men over the course of 40 years, ended in 1972, but a formal apology was not issued for this deliberate violation of human rights until 1997. And still, in doctors’ offices and hospital rooms across the United States today, race remains a significant predictor of the quality of healthcare a person will receive.

This is, of course, deeply troubling. (And worthy of far deeper discussions than a blog post can provide—see a short list of book recommendations below.)

But perhaps just as troubling as the underpinnings of racism in science and medicine is its relative obscurity in the historical narratives propagated by dominant (read: white) culture. That modern medicine was built on the backs of marginalized populations is well understood and indeed has been lived by many, but it is far from being accepted as universal truth. Meanwhile, the contributions of black scientists, doctors, and health advocates have routinely been eclipsed by those of their white colleagues or are absent entirely from historical records. (At least until Hollywood spots a blockbuster.)

Public health advocates and practitioners have a responsibility both to understand this complex history of medical racism, if they have not already experienced it firsthand, and to thoroughly integrate its implications into their daily work. This includes acknowledging the tensions that may stem from deep distrust of the medical community by communities of color; considering the multiple ways in which implicit bias and institutional racism may impact social determinants of health, risk of chronic disease, access to care, and quality of treatment; applying a racial equity lens to policy and program decision-making; and, last but not least, giving credit where it’s due.

Today, my focus is on that last point. Though public health is not necessarily a discipline that generates fame or notoriety (it has been said, in fact, that public health is only discussed when it is in jeopardy), you should know the names of these five black public health champions. Some past, some present, some well-known and some less so, they are all powerful forces who have made significant contributions to this field.

Have other names we should know? Leave them in the comments.

1.  Dr. Regina Benjamin, former U.S. Surgeon General

Photo: United States Mission Geneva/CC BY SA (Flickr)

During the four-year term she served as the 18th U.S. Surgeon General (2009-2013), Regina Benjamin shifted the national focus on health from a treatment-based to a prevention-based perspective, highlighting the importance of lifestyle factors such as nutrition, physical activity, and stress management in the prevention of chronic disease. Other campaigns during Dr. Benjamin’s term targeted breastfeeding and baby-friendly hospitals, tobacco use prevention among youth and young adults, healthy housing conditions, and suicide prevention. Prior to serving as the Surgeon General, Dr. Benjamin established the Bayou La Batre Rural Health Clinic on the Gulf Coast of Alabama, providing care for patients on a sliding payment scale and even covering some medical expenses out of her own pocket. Dr. Benjamin has been widely recognized for her determination and humanitarian spirit.

2.  Byllye Avery, founder of the Black Women’s Health Imperative and Avery Institute for Social Change

Despite the Supreme Court’s 1973 decision in Roe v. Wade, access to abortion remained limited in the years thereafter, particularly for many black women. Byllye Avery began helping women travel to New York to obtain abortions in the early 1970s, and in 1974 co-founded the Gainesville Women’s Health Center to expand critical access to abortions and other health care services. In 1983, Avery founded the National Black Women’s Health Project (now called the Black Women’s Health Imperative), a national organization committed to “defining, promoting, and maintaining the physical, mental, and emotional wellbeing of black women and their families.” Avery has received numerous awards for her work, including the Dorothy I. Height Lifetime Achievement Award (1995), the Ruth Bader Ginsburg Impact Award from the Chicago Foundation for Women (2008), and the Audre Lorde Spirit of Fire Award from the Fenway Health Center in Boston (2010).

3.  Bobby Seale, co-founder of the Black Panther Party

Photo: Peizes/CC BY SA (Flickr)

Here’s a name you might know—and a story that might surprise you. While the Black Panther Party, co-founded by Bobby Seale and Huey Newton in 1966, is often remembered for its radical political activism, the black nationalist organization was also deeply engaged in public health work. True to their rallying call to “serve the people body and soul,” the Black Panthers established over a dozen free community health clinics nationwide and implemented a free breakfast program for children. This program, which served its first meal out of a church in Oakland, California in 1968, was one of the first organized school breakfast programs in the country and quickly became a cornerstone of the party. By 1969, the Black Panthers were serving breakfast to 20,000 children in 19 cities around the country. Though the government eventually dismantled the program along with the party itself, many believe it was a driving factor in the establishment of the School Breakfast Program in 1975.

4.  Dr. Camara Jones, former president of the American Public Health Association

As the immediate past president of the American Public Health Association, Dr. Camara Jones brought the impact of racism on health and well-being to the forefront of the public health agenda. She initiated a National Campaign Against Racism, with three strategic goals: naming racism as a driver of social determinants of health; identifying the ways in which racism drives current and past policies and practices; and facilitating conversation, research, and interventions to address racism and improve population health. Dr. Jones has also published various frameworks and allegories, perhaps the most famous of which is “Levels of Racism: A Theoretic Framework and a Gardener’s Tale,” to help facilitate an understanding of the nuance and layers of racism across the general population.

5.  Malik Yakini, founder of the Detroit Black Community Food Security Network

Photo: W.K. Kellogg Foundation/CC BY SA (Flickr)

Malik Yakini may not see himself as a public health advocate, but that hasn’t stopped him from receiving speaking requests from prominent public health institutions across the country. A native Detroiter, Yakini views the food system as a place where inequities play out at the hand of racism, capitalism, and class divisions. “There can be no food justice without social justice,” he said to an audience at the Bloomberg School of Public Health at Johns Hopkins. “In cities like Detroit where the population is predominantly African American, we are seen as markets for inferior goods.” Yakini founded the Detroit Black Community Food Security Network in 2006 to ensure that Detroit communities could exercise sovereignty and self-determination in producing and consuming affordable, nutritious, and culturally appropriate food. The organization operates the seven-acre D-Town Farm on Detroit’s east side and is now in the process of establishing the Detroit People’s Food Co-op.

Recommended Reads

Black Man in a White Coat by Dr. Damon Tweedy

The Immortal Life of Henrietta Lacks by Rebecca Skloot

Body and Soul: The Black Panther Party and the Fight against Medical Discrimination by Alondra Nelson

 

Made in America: Trump Embracing Offshore Wind?

UCS Blog - The Equation (text only) -

While publicly pushing fossil fuels, the Trump administration seems to be quietly embracing offshore wind power and its economic potential. 

In March, the Interior Department auctioned off 122,405 acres of water off Kitty Hawk, North Carolina, to the Spanish-based Avangrid for $9 million. Avangrid, a division of Iberdrola, beat out three competitors, including Norway’s Statoil and German wind farm developer wpd.

Interior Secretary Ryan Zinke hailed the auction, affirming that offshore wind is “one tool in the all-of-the-above energy toolbox that will help power America with domestic energy, securing energy independence, and bolstering the economy. This is a big win.”

That followed the equally stunning announcement a week prior by Interior’s Bureau of Ocean Energy Management that it plans to stage another competitive lease auction in 400,000 acres of New England waters, triggered by unsolicited applications for the same area by Statoil and the U.S. wing of Germany’s PNE Wind.

The parcels are adjacent or near areas off Massachusetts and Rhode Island already leased by Denmark’s DONG Energy, Germany’s OffshoreMW and Rhode Island’s Deepwater Wind.

Those two developments signal that the Trump administration takes the economic potential of offshore wind energy far more seriously than might be assumed from the president’s past disparagement of wind turbines. Trump told the New York Times shortly after his election, “We don’t make the windmills in the United States. They’re made in Germany and Japan.”

Already big business in US

But it may have dawned on the Trump administration that offshore wind is actually much more an American industry than most people realize.

In 2015, Boston-based General Electric made the biggest purchase in its history, acquiring the French energy infrastructure giant Alstom for $10.6 billion. The deal included Alstom’s offshore wind turbine manufacturing operations, including a plant in Saint Nazaire, France, that made the five turbines spinning in the U.S.’s first offshore wind farm, the 30-megawatt Deepwater Wind Block Island project.

GE proceeded last year to purchase the world’s largest turbine blade manufacturer, Danish-based LM Wind, for $1.65 billion. Last month, the LM Wind division announced it is building a blade manufacturing plant in the Normandy region of France, providing at least 550 direct and 2,000 indirect jobs as that nation ramps up its offshore industry. The factory will be capable of making the longest turbine blade in the world, nearly 300 feet long, for new-generation 8 MW turbines.

Besides GE, New York-based Blackstone, one of the world’s top investment firms, was behind the 2011 funding of Germany’s 80-turbine, 288-MW Meerwind offshore wind farm. Blackstone, with the help of Bank of America Merrill Lynch, last year sold its 80 percent stake to Chinese investors.

New York-based Goldman Sachs also has a 7 percent stake in DONG, the first company to cross the 1,000-turbine mark. Europe has a total of 3,600 turbines spinning, providing 12.6 gigawatts (GW) of power, enough for 13 million homes, according to industry advocate Wind Europe.

Photo: Derrick Jackson

Critical mass close

It is clear that the offshore wind industry now wants to cross the water, with rocket-sized components that are too long and too massive to economically import long term from Europe. If it does, it could easily blow to our shores the skilled local construction and technical jobs and large-scale manufacturing President Trump has promised.

Deepwater Wind was recently cleared to begin a 15-turbine project off Montauk, Long Island, in waters where Deepwater could eventually construct up to 200 turbines. In December, Statoil won the federal lease for a 79,000-acre area of ocean off Long Island’s Jones Beach for a record $42.5 million.

Besides the competition in North Carolina, Maryland is in the approval stage of offshore wind proposals. And with Massachusetts now mandating 1,600 MW of offshore wind in its energy portfolio by 2027 and with New York Governor Andrew Cuomo pushing for 2,400 MW of offshore wind by 2030, the U.S. is about to become part of “the brightest spot in the global clean energy investment picture,” as Bloomberg New Energy Finance put it.

Job engine, port revivals

The inspiration points in Europe are endless. Last year saw a record $26 billion of investments, as the industry is on track to double the 12.6 GW by 2020.

The United Kingdom has approved construction of the largest wind farm yet, the 174-turbine, 1.2 GW Hornsea One Array. DONG says it expects Hornsea to generate 2,000 jobs during the construction phase and 300 operational jobs thereafter.

DONG and the British government have begun planning a second Hornsea wind farm that would be even bigger, 300 turbines and 1.8 GW, adding another 2,000 construction and nearly 600 maintenance jobs.

In Germany, the offshore wind industry is responsible for nearly halving unemployment in Bremerhaven and Cuxhaven, towns northwest of Hamburg that were hit hard in the late 20th century by the decline of fishing and shipbuilding and the closing of US military facilities. Local officials likened Bremerhaven to Detroit for its 25-percent unemployment rate.

Today, with a downtown core gleaming with new museums and hotels, those same officials call offshore wind their regional “moon shot.” Up in Cuxhaven, Siemens is putting the finishing touches on a giant turbine plant that should go into operation in the middle of this year, bringing another 1,000 jobs to the region and adding to the 20,000 jobs claimed by the German offshore wind industry.

Denmark, despite having a population roughly the size of Massachusetts’s, remains a per-capita titan in offshore employment with 10,000 jobs. The UK, which has 41 percent of Europe’s installed capacity, has at least 30,000 direct and indirect jobs, according to RenewableUK, and is adding thousands more with oncoming projects such as Hornsea.

In December, Siemens completed a $381 million turbine-blade plant in Hull that will employ 1,000 people when fully operational.

The Guardian’s story on the plant’s opening noted how it differs from much of modern manufacturing: “Surprisingly, the manufacturing process is almost entirely done by hand, rather than robots. The workforce includes former supermarket workers, aerospace industry experts on second careers and builders who learned fiberglass skills locally from fitting bathrooms and making caravan parts.”

And Hull and other British port towns, according to newspaper features, are experiencing rebounds akin to Bremerhaven and Cuxhaven. A January Sunday Express story recalled how Hull declined from the overfishing of cod into a “rundown backwater” that topped the list of worst places in the UK to live in 2003. With redevelopment strategies that included the investment of offshore companies like Siemens, the city has rebounded to be a popular tourist destination.

Grimsby, a 33-mile drive from Hull, already has 1,500 offshore wind jobs and, with the planned Hornsea projects, has plans to grow and become the biggest offshore wind industry cluster in the world. DONG said in 2015 it plans to invest $7.4 billion in the Grimsby/Hull region by 2019.

Elsewhere in the UK, another massive offshore wind project, the 102-turbine, 714 MW East Anglia One, promises 3,000 jobs.

Photo: Derrick Jackson

The American potential

The building of house-sized nacelles and football-field-length blades, the manufacture and laying of miles of underwater cable, the building of jack-up installation barges and maintenance vessels, the welding of foundations and towers, port rehabilitation, and all the nuts and bolts in between should drive European offshore wind employment from its current 75,000 jobs to between 170,000 and 204,000 jobs, according to Wind Europe.

A recent New York Times feature on the industry said, “Offshore wind, once a fringe investment, with limited scope and reliant on government subsidies, is moving into the mainstream.”

According to a joint report released last September by the Department of the Interior and the Department of Energy, a robust U.S. offshore wind industry could employ up to 34,000 workers by 2020, up to 80,000 by 2030 and up to 181,000 by 2050.

The industry would be paying $440 million in annual lease payments and $680 million in annual property taxes into local economies. Better still, a University of Delaware study last year calculated that just 2 GW of projects in the pipeline in Massachusetts waters would ignite such an efficient local industry supply chain that the price of offshore wind energy should be competitive with other energy options by 2030.

“At that point, the technology presumably could continue to compete on its own without any continuing legislation,” the study said.

Onshore bipartisan success

The onshore wind industry is now at such cost parity that it is booming across America, from liberal California to the conservative Great Plains and Texas. In fact, 80 percent of U.S. wind farms are in Republican congressional districts, according to the American Wind Energy Association.

Wind energy surpassed hydroelectric power in generating capacity for the first time last year.

According to AWEA, the U.S. counterpart to Wind Europe, there are now more than 500 blade and turbine factories and supply-chain manufacturing facilities making the 8,000 different parts that go into one machine.

Domestic wind industry jobs have crossed the 100,000 mark and the Bureau of Labor Statistics lists wind service technician as the fastest-growing job through 2024, with a current median pay of $51,050.

Wind service technicians are a huge reminder that this is an industry where many jobs are skilled working-class crafts that can be learned in technical colleges, providing a fresh employment pathway for individuals, families and low-income communities where 4-year college is often seen as unaffordable.

Even as President Trump plans sweeping rollbacks of environmental regulations he decries as “job killing,” offshore wind offers exactly the kinds of jobs he has said he would bring back to areas of America where other forms of manufacturing have disappeared.

The Perry Factor?

Another reason for optimism for offshore wind during the Trump administration is that Secretary of Energy Rick Perry oversaw Texas becoming the nation’s leader in onshore wind when he was governor.

Today, 12,000 turbines provide 13 percent of the state’s electricity, powering 4 million homes, and providing more than 24,000 jobs, according to AWEA.

The state’s transmission grid completed a $7 billion upgrade to accommodate wind. As governor, Perry boasted that if Texas were a nation, it would rank sixth in the world in installed onshore wind capacity.

Late in his administration, he began to invest in offshore. In 2014, his Texas Emerging Technology Fund awarded $2.2 million to Texas A&M University to explore offshore wind. That grant was matched with $64 million of federal and industry research investments.

When Mr. Perry was confirmed as Energy Secretary, AWEA CEO Tom Kiernan said, “The Texas success story with wind power has now become a model for America … we look forward to working with him at the Department of Energy to keep this success story going.”

The first signs are that the success story will include offshore wind, spinning with jobs, and revitalizing towns dimmed with decline.

Without officially saying so, the Trump administration is deciding that the windmills can be made here after all.

This post first appeared on The Daily Climate.

Photo: Derrick Jackson

North Korea’s 5 April 2017 Missile Launch

UCS Blog - All Things Nuclear (text only) -

North Korea launched a missile from its east coast into the Sea of Japan at 6:42 am local time on April 5 (5:42 pm on April 4 US eastern time).

US Pacific Command initially identified it as a KN-15 missile, called Pukguksong-2 in North Korea, which is a two stage solid-fueled missile with an estimated range of 1,200 km based on its previous test in February.

Subsequently, however, Pacific Command said it believed the missile was instead an older Scud, and that it may have tumbled, or “pinwheeled,” during flight.

South Korean sources reported the missile flew only about 60 km before splashing down, and reached a maximum altitude of 189 km. And based on Pacific Command’s statement, the flight time was eight to nine minutes.

I used those numbers to investigate the trajectory with a computer model I have of several missiles.

Short-range Scud missile

I found that a Scud missile, with a nominal range of 300 km, could roughly match these numbers if the warhead was lightened somewhat (from 1,000 kg to about 700 kg) and if it was launched on a very lofted trajectory, with a burnout angle only about 5 degrees from vertical. On a 300-km range trajectory, this angle would be roughly 45 degrees (see Fig. 1).

If the missile did not tumble during reentry, I calculate the flight time would be about 7.5 minutes. However, taking account of the additional atmospheric drag on the tumbling body can increase the flight time to about 9 minutes.
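A back-of-the-envelope version of this lofted-trajectory calculation can be sketched in a few lines of Python. The burnout conditions used here (30 km altitude, 1.75 km/s speed, 5 degrees from vertical) are illustrative assumptions consistent with the numbers above, not the inputs to the author’s actual model, and the sketch ignores drag, Earth curvature, and the variation of gravity with altitude.

```python
# Minimal drag-free point-mass trajectory from burnout to impact.
# Burnout state is a hypothetical assumption, not the author's model input.
import math

def ballistic_arc(alt0_m=30_000.0, speed_mps=1750.0, tilt_deg=5.0,
                  g=9.81, dt=0.1):
    """Integrate a vacuum ballistic arc; return (apogee, range, time)."""
    vx = speed_mps * math.sin(math.radians(tilt_deg))  # downrange velocity
    vy = speed_mps * math.cos(math.radians(tilt_deg))  # vertical velocity
    x, y, t, apogee = 0.0, alt0_m, 0.0, alt0_m
    while y > 0.0:
        vy -= g * dt          # gravity is the only force after burnout
        x += vx * dt
        y += vy * dt
        t += dt
        apogee = max(apogee, y)
    return apogee, x, t

apogee, ground_range, flight_time = ballistic_arc()
print(f"apogee ~{apogee/1000:.0f} km, post-burnout range "
      f"~{ground_range/1000:.0f} km, time ~{flight_time/60:.1f} min")
```

With these assumptions the sketch gives an apogee near 185 km and a post-burnout range near 57 km, close to the reported 189 km and 60 km; adding roughly a minute of boost phase, and the extra drag of a tumbling body, pushes the total flight time toward the reported eight to nine minutes.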

Fig. 1

Other possibilities

In the calculation above, the Scud burns to completion and then begins to pinwheel (the short-range Scud does not separate the warhead from the missile body at burnout).

Longer range missiles could also follow this trajectory if the engines failed partway through powered flight, as long as the missile was on a highly lofted trajectory (5 degrees from vertical) and stopped accelerating after reaching a speed of 1.7-1.8 km/s. It may have been an engine failure that caused the missile to tumble. If the engines did not burn to completion, the warhead may have remained attached to the missile body even for a longer range missile that would separate the warhead under normal operation.

The fact that the missile flew on a nearly vertical trajectory suggests there may have been a problem with the guidance system. If the missile was liquid fueled, North Korea may have shut down the engine when it realized there was a problem. A solid fueled engine could not be shut down in the same way.

If the missile had been a KN-15, the engine would have had to fail about halfway through the burn of the second stage engine. It seems surprising that initial reports identified the missile as a KN-15, since I would have expected sensors could tell whether or not the missile had undergone staging. In addition, the plumes from liquid and solid missiles are different in appearance, so depending on what sensors viewed the launch they should have been able to differentiate a liquid from solid missile. Analyzing these issues may have been what led Pacific Command to change its mind about what type of missile was launched.

Why fire a Scud?

If Pyongyang decided to launch a missile to attract attention in advance of the Trump-Xi summit that starts tomorrow, it may have decided to launch some type of Scud because these are well tested and it could be relatively assured the launch would be successful. The missile may have been a Scud-ER like the four it launched simultaneously in early March.

The fact that it appears to have failed illustrates how uncertain the missile business can be.

Despite Trump’s Climate Rollbacks, Renewables Charging Full Steam Ahead

UCS Blog - The Equation (text only) -

President Trump’s recent Executive Order on Energy Independence is a cynical and dangerous assault on common sense policies to address climate change. His efforts will put Americans in harm’s way, and we must resist the president’s anti-science agenda at every turn. One of those turns is in our nation’s power sector, where the transition away from coal and toward cleaner, lower-carbon energy resources is well underway. Solar and wind power, especially, have experienced record growth in recent years, and there are multiple avenues—through utilities, states, corporations, and individuals—to keep the momentum going, with or without President Trump’s support.

It’s the market, stupid

Non-hydro renewable energy sources accounted for nearly 9 percent of our nation’s power supply in 2016, more than double 2010 levels. Since 2010, more than 86,500 megawatts (MW) of new wind and solar power capacity has come online, far more than their fossil fuel competitors. In fact, 2016 marked the first year that more solar power capacity was installed—14,762 MW—than any other power source.

Much of this rapid development has been aided by state policies and federal incentives, but simple market economics is playing an increasingly important role. Costs for wind and solar have dropped so dramatically in recent years that a recent comparison of power sources shows new wind and solar to be cheaper than new fossil fuel generation. As a result, more and more utility planners are opting to add renewables—and close aging coal generators—based largely on economics.

Consider Xcel Energy’s recent announcement to build 11 wind projects in seven states, totaling 3,380 MW of new capacity. In a statement Xcel executive David Hudson said, “The decision to add additional wind generation is purely in the economic interest of our customers.”

New Mexico’s largest utility, PNM, also recently released an analysis showing that closing their San Juan coal plant would result in “long-term benefits for consumers” and provide “an opportunity to increase renewable energy production.”

And in Ohio, Dayton Power & Light announced in March it will close two coal plants because they “will not be economically viable beyond mid-2018.” The utility also plans to invest in at least 300 MW of new wind and solar projects over the next five years.

‘Yuge’ competition among states

In addition to today’s market forces, policy drivers have been—and will continue to be—critical to ensure the swift transition to a renewable energy economy. And with the Trump Administration laying waste to federal solutions, the onus on states to step up and deliver has never been greater. Fortunately, many states are rising to the challenge through increasingly stronger renewable electricity standards (RES).

Indeed, there is stiff competition brewing among states to be a national leader in terms of commitment to renewable energy development. Just a few years ago, having a target of 25 to 30 percent of its electricity coming from renewable sources would put a state among the pack of leaders. Today, six of the 29 states with existing RES policies have requirements of at least 50 percent, including Hawaii, which has set its sights on achieving 100 percent renewables by 2045.

During this legislative season, at least eight states have actively pursued significantly stronger targets. Among them are three states—California, New York, and Massachusetts—that are seeking to match Hawaii’s 100 percent target. Even in a more conservative state like Nevada, legislators are considering an increase in their RES from 20 percent to 50 percent by 2030.

If successful, these collective state actions will help ensure there is a robust market for renewables over the long term.

This Bud’s for you!

It’s not just states and prudent utilities that are driving the renewable energy revolution. Corporate demand for renewables is also a rapidly expanding market opportunity in the clean energy industry. In 2015, corporate power purchase agreements for wind outpaced new wind investments by utilities for the first time in the United States, according to the Rocky Mountain Institute (RMI). RMI also estimates that at least 60,000 MW of new wind and solar will be needed by 2025 to serve the US corporate market.

Competitive pricing and increasingly stringent sustainability goals are leading many of the largest U.S. (and global) corporations to invest directly in renewable energy. A recent Advanced Energy Economy survey found that nearly half of all Fortune 500 companies (and 70 percent of Fortune 100 companies) have set renewable energy or sustainability targets. Of this list, at least 23 corporations have set renewable energy goals of 100 percent, including giants like Amazon and Walmart.

Anheuser-Busch InBev, makers of Budweiser beer, has joined the growing list of companies committing to sourcing 100 percent of their power needs from renewable energy. Photo: Jack Snell CC BY-NC-SA 2.0

The latest multi-national company to make a 100 percent renewable energy commitment is Anheuser-Busch InBev, makers of Budweiser and Corona beers, among others. In rolling out its announcement, the company said, “We do not expect our cost base to increase. Renewable electricity is competitive with or cheaper than traditional forms of electricity in many markets.” We can all raise our glasses to that!

(Renewable) Power to the People

Citizens all across the country also have the power to stand up against the President’s climate rollbacks and demonstrate their support for renewable energy. Thanks to a combination of falling costs and state and federal incentives, solar PV installations in the residential sector have experienced steady growth over the last six years. At the end of 2016, there were 1.3 million solar households in the United States, more than twice the number from 2014! California leads all states with a 35 percent share of the solar PV market, but all states have solar homes and tremendous potential to grow.

What’s more, you don’t need to be a homeowner to get in on the renewable energy revolution. Community solar is an exciting and burgeoning option for consumers for whom investing in a rooftop system may not be viable. In addition, anyone can sign up for certified green power, either through their utility’s green power pricing program (if they have one) or through a national green power marketer.

Despite President Trump’s misguided actions to undermine climate progress, we must keep pressing forward toward a clean and low-carbon energy future. Thanks to the emergence of wind and solar as affordable and reliable sources of power, we can.

The Importance of Public Funding for Earthquake Hazard Research in Cascadia

UCS Blog - The Equation (text only) -

In 2015, the New Yorker published “The Really Big One”, a story that brought public awareness to the dangers posed by the Cascadia subduction zone. The Cascadia subduction zone is a large fault that lies underwater, just off the coasts of Washington, Oregon, and Northern California. As a scientist and professor who researches this fault and its dangers, I really appreciated the impact this article had in raising awareness of the importance of preparing for the next large earthquake here, especially among the many residents of this region. Both the New Yorker article and plenty of ongoing scientific research suggest that we need to prepare for the possibility of a major earthquake in this region—but we also need more research to help with that preparation.

Weighing the probabilities of earthquakes—room for uncertainty

Loma Prieta Earthquake damage on the Bay Bridge in California, 1989. Credit: Joe Lewis https://www.flickr.com/photos/sanbeiji/220645446

The Cascadia subduction zone has the capacity for a magnitude 9.0 earthquake, the same size as the devastating Japanese earthquake that occurred in 2011. The 2011 Japan earthquake caused a large tsunami, widespread destruction, and an ongoing nuclear disaster. We expect the next great Cascadia earthquake will have similar effects, hopefully minus the nuclear disaster. This fault directly threatens the urban areas of Seattle, Washington and Portland, Oregon, in addition to the many more residents in rural and suburban areas of California, Oregon, and Washington. In a 2013 report, The Cascadia Region Working Group estimates that if a magnitude 9.0 earthquake were to happen in the near future in this region, “the number of deaths could exceed 10,000”, and “more than 30,000 people could be injured”, with economic losses “upwards of $70 billion”.

It is very difficult to predict when this next great Cascadia earthquake will occur. A recent report published by the U.S. Geological Survey estimates the probability of a magnitude 9.0 earthquake at roughly 10% in the next 50 years. The probability of a somewhat smaller, but still very destructive, earthquake in the southern section of Cascadia (located just offshore, stretching from Cape Mendocino, CA to Florence, OR) is roughly 40% over the same timeframe. These probabilities are high enough to be scary—and to indicate the urgency of preparing for a major earthquake disaster in this region.
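As a back-of-the-envelope illustration (my own sketch, not a calculation from the USGS report), a cumulative probability over 50 years can be converted into an implied annual rate if one assumes, purely for illustration, that great earthquakes arrive as a Poisson process:

```python
import math

def annual_rate(p_total: float, years: float) -> float:
    """Annual event rate implied by a cumulative probability,
    assuming events arrive as a Poisson process (an assumption
    made here only for illustration)."""
    return -math.log(1.0 - p_total) / years

rate_m9 = annual_rate(0.10, 50)     # magnitude 9.0: about 0.2% per year
rate_south = annual_rate(0.40, 50)  # southern Cascadia: about 1% per year
```

Even the higher of the two works out to roughly a 1-in-100 chance in any given year, which helps explain why sustained vigilance is so hard to maintain.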

These probability numbers represent decades of scientific progress and breakthroughs in studies of fault behavior, but they are not as useful as they could be. What the public and emergency managers want to know is “Will a destructive earthquake occur in the next 50 years, or not?”. The best answer we currently have is these probabilities. What that really means is, “we don’t know, so prepare just in case”.

The New Yorker article raised awareness, but awareness fades over time as people go about their usual lives. It is really difficult to stay personally prepared for a major earthquake at all times for the next 50 years, especially when there’s a good chance nothing will happen. Therefore, it would be really valuable to add more certainty to those probabilities. If we can revise these probabilities closer to 0% (no chance of an earthquake) or 100% (definitely going to be an earthquake), we can reduce uncertainty when planning for the future.

The public depends on earthquake research

EarthScope infrastructure across the United States. Credit: Jeffrey Freymueller

Increased certainty can only come from increased scientific understanding of this fault, and the mechanics of faults in general, which is at best only partially understood. We are also monitoring this fault for long-term changes that might indicate a large earthquake is imminent.

Making progress improving earthquake forecasts for Cascadia is a multi-disciplinary research problem. Scientists like myself use techniques such as numerical models of friction on faults to study the rupture process, laboratory experiments to study fault behavior, field geology studies to look at the signatures of past earthquakes, and data-driven studies using multiple instruments planted all along the subduction zone.

The vast majority of these studies are publicly funded through the U.S. Geological Survey and the National Science Foundation. The instruments we use were placed as part of a major scientific initiative called EarthScope, which was featured by Popular Science as the #1 “Most Ambitious Experiment in the Universe Today”. EarthScope is funded entirely by the National Science Foundation, and that funding is scheduled to end soon. The future of the critical scientific instrumentation in Cascadia is currently uncertain. These instruments have been, and continue to be, vital in improving our understanding of the mechanics of the Cascadia subduction zone and the size and timing of the next large earthquake there.

Budget cuts and uncertainty have a large effect on this field. The U.S. Geological Survey, under the recently released Trump budget blueprint, is going to take a 15% cut. The National Science Foundation is not specifically mentioned in the blueprint, but the working assumption among scientists is a 10% cut. While the cuts certainly hinder our efforts to study the Cascadia subduction zone, even the uncertainty is a hindrance to this science, as funding proposals take 6 months or more to receive an answer because of budget uncertainty. For scientists to do our jobs and give emergency managers and the public the best available information, it is critical that we continue to receive federal research funding.

 

Noel M. Bartlow is an Assistant Professor in the Department of Geological Sciences at the University of Missouri. She is a geophysicist who studies slow earthquakes and frictional locking in subduction zones. She earned her Ph.D. in Geophysics from Stanford University in 2013, and completed a postdoctoral fellowship at the University of California–San Diego’s Scripps Institution of Oceanography before joining the University of Missouri faculty in 2016.  She is currently the principal investigator for the National Science Foundation funded project, “Collaborative Research: Improving models of interseismic locking and slow slip events in Cascadia and New Zealand.”

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

 

Why Senator Lankford’s “BEST Act” Is Really the Worst for Federal Science

UCS Blog - The Equation (text only) -

A few weeks ago, Sen. James Lankford (OK) introduced legislation called the “Better Evaluation of Science and Technology Act,” or “BEST Act” for short. The proposal takes the scientific standards language from the recently updated Toxic Substances Control Act (TSCA) and applies it to the Administrative Procedure Act (which governs all federal rulemaking). Sen. Lankford claims the BEST Act would guarantee that federal agencies use the best available science to protect public health, safety, the environment, and more.

Nice sound bite, right?

In practice, though, this bill would cripple the ability of agencies like the Environmental Protection Agency (EPA) and the Consumer Product Safety Commission (CPSC) to rely on scientific evidence to issue public health and safety safeguards. It’s just as radical as the numerous other bills that would enable politics to trump science, making all of us more vulnerable to public health and environmental threats.

How this works in the real world

How would it do that? It’s simple really. If you look at the bill language carefully, it consists of significant legal jargon and imprecise language that any lawyer worth his or her salt could use to shut down science-based decision making at federal agencies by tying up the rule-making process in endless challenges.

Let’s take a look at lead. The science is clear here. There is no safe blood level of lead. According to the EPA, lead poisoning can cause slowed growth, lower IQ, behavior and learning problems, and more. In the 1970s, it became increasingly clear that lead exposure resulted in negative health effects.

Science is a critical component of policymaking, but Senator Lankford’s BEST Act is a solution in search of a problem.

Rather than accept the growing weight of this scientific evidence, the lead industry began manufacturing counterfeit science to cast doubt on the impacts of lead exposure and on the acceptable amount of lead in blood.

Now if the “BEST Act” had been the law of the land when the federal government began to regulate lead, the lead industry could have used this counterfeit science to challenge EPA regulations on the grounds of “degree of clarity” and “variability and uncertainty” (among other things), forcing the agency into endless litigation over settled science. This could have ultimately prevented the agency from limiting lead exposure, especially among vulnerable populations like children.

Likewise, the tobacco industry would have been able to cast doubt on the link between cigarettes and lung cancer.

The list goes on. Today, you can imagine the fossil fuel industry using the vague language to attack climate science as a justification for slowing down solutions that prevent global warming.

Heavy on problems, light on solutions

The ambiguity of the text should be enough to realize that this legislation is bad news for evidence-based decisionmaking. But there are several other issues with the legislation as well.

One major concern is subsection (h), which would result in an enormous resource drain for agencies at a time when budgets are decreasing. Agencies would be required to divert additional resources to make public a number of documents and information, which, as we know from our fight against the HONEST Act, costs time and money.

Another major issue is the fact that this legislation would freeze science standards the way they are right now, killing the innovation and flexibility that agencies have now to consider new forms of research in their decisionmaking. As agencies begin to regulate new technologies like autonomous vehicles, they need to have the ability to consider the most cutting-edge research out there, which might include new scientific methods and models.

More importantly for human health, as the EPA looks to implement the updated chemical safety law, it needs to have the ability to utilize the best and most up-to-date scientific and technical information without having to worry about being sued, which was the problem with the original TSCA bill. Under the original chemical safety law passed in the 1970s, the EPA could not even regulate asbestos, a known carcinogen, because industry kept suing the agency. If the BEST Act were to become law, we could expect more of the same.

A wasted opportunity

Agencies are already basing their policy decisions on the best available science. They have to. If an agency did not issue a public health protection or a worker safety standard based on strong evidence, then the agency would be challenged in court, and probably forced to vacate the regulation.

Instead of promoting legislation like the BEST Act, what Sen. Lankford could do to improve the use of science in policymaking is ensure that agencies like the EPA, CPSC, Department of Energy, and others, are well funded and have the resources necessary to fulfill their respective science-based missions.

There is no disagreement among anyone (well, almost anyone) that science has an important role to play in federal policymaking and that the decisions made by agencies to implement the Clean Air Act, the Endangered Species Act, the Consumer Product Safety Act, and others, all need to be rooted in the best scientific and technical information that is available. We all want science to help ensure that our health and safety are protected, that the drugs and medical devices we use are safe and effective, that the food we eat is free of disease, that our drinking water is clean, and more.

If anything, the BEST Act would take science out of the hands of scientists and put it into the hands of politicians, lawyers, and judges. Sen. Lankford’s legislation is misguided and simply a solution in search of a problem. While there is always more to learn about a scientific issue, that is no excuse to delay acting to protect the public from threats to health, safety, and the environment.

Leak at the Creek: Davis-Besse-like Cooling Leak Shuts Down Wolf Creek

UCS Blog - All Things Nuclear (text only) -

The Wolf Creek Generating Station near Burlington, Kansas has one Westinghouse four-loop pressurized water reactor that began operating in 1985. In the early morning hours of Friday, September 2, 2016, the reactor was operating at full power. A test completed at 4:08 am indicated that leakage into the containment from unidentified sources was 1.358 gallons per minute (gpm). The regulatory limit for such leakage was 1.0 gpm. If the test results were valid, the reactor had to be shut down within hours. Workers began running the test again to either confirm the excessive leak or determine whether the first result was a bad test. The computer collects data over a two-hour period and averages it to avoid false indications caused by momentary instrumentation spikes and other glitches. (It is standard industry practice to question test results suggesting problems but accept without question “good” test results.)
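The reason for averaging over the collection window can be pictured with a minimal sketch (hypothetical readings, not actual plant data): a single instrumentation spike barely moves the reported average.

```python
def mean_leak_rate(samples):
    """Average leak-rate samples over the collection window so a
    momentary instrumentation spike cannot, by itself, drive the
    reported value past a limit."""
    return sum(samples) / len(samples)

# Hypothetical two-hour window of per-minute readings (gpm):
# a steady 0.5 gpm with one spurious 5.0 gpm spike.
readings = [0.5] * 119 + [5.0]
print(round(mean_leak_rate(readings), 4))  # 0.5375, still under 1.0 gpm
```

The trade-off, of course, is that averaging also smooths over the beginning of a real leak, which is why a confirmed above-limit average demands prompt action.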

The retest results came in at 6:52 am and showed the unidentified leakage rate to be 0.521 gpm, within the legal limit. Nevertheless, management took the conservative step of entering the response procedure for excessive leakage. At 10 am, the operators began shutting down the reactor. They completed the shutdown by tripping the reactor from 30 percent power at 11:58 am.

Wolf Creek has three limits on reactor cooling water leakage. There’s a limit of 10 gpm from known sources, such as a tank that collects water seeping through valve gaskets. The source of such leakage is known and being monitored for protection against further degradation. There’s a stricter limit of 1 gpm from unknown sources. While such leakage is usually found to be from fairly benign sources, not knowing it to be so imposes a tighter limitation. Finally, there’s the strictest limit of zero leakage, not even an occasional drop or two, from the reactor coolant pressure boundary (i.e., leaks through a cracked pipe or reactor vessel weld). Reactor coolant pressure boundary leaks can propagate very quickly into very undesirable dimensions; hence, there’s no tolerance for them. Figure 1 shows that the unknown leakage rate at Wolf Creek held steady around one-tenth (0.10) gallon per minute during July and August 2016 but increased significantly in early September.
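Those three limits can be captured in a simple check (a sketch of the logic described above, not actual plant software; the identified-leakage value below is hypothetical):

```python
def leakage_within_limits(identified_gpm, unidentified_gpm, boundary_gpm):
    """Apply the three leakage limits described above: 10 gpm from
    identified sources, 1 gpm from unidentified sources, and zero
    from the reactor coolant pressure boundary."""
    return (identified_gpm <= 10.0
            and unidentified_gpm <= 1.0
            and boundary_gpm == 0.0)

print(leakage_within_limits(2.0, 0.10, 0.0))   # True: steady summer readings
print(leakage_within_limits(2.0, 1.358, 0.0))  # False: the 4:08 am test result
```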

Fig. 1 (Source: Freedom of Information Act response to Greenpeace)

The reactor core at Wolf Creek sits inside the reactor vessel made of metal six or more inches thick (see Figure 2). The reactor vessel sits inside the steel-reinforced concrete containment structure several feet thick. The dome-shaped top, or head, of the reactor vessel is bolted to its lower portion. Dozens of penetrations through the head permit connections between the control rods within the reactor core and their motors housed within a platform mounted on the head. Other penetrations allow temperature instruments inside the reactor vessel to send readings to gauges and computers outside it.

Fig. 2 (Source: Nuclear Regulatory Commission)

Wolf Creek has 78 penetrations through its reactor vessel head, including a small handful of spares. Workers entered containment after the reactor shut down looking for the source(s) of the leakage. They found cooling water spraying from penetration 77 atop the reactor vessel head. The leak sprayed water towards several other penetrations as shown in Figure 3. Penetration 77 allowed a thermocouple within the vessel to send its measurements to instrumentation.

Fig. 3 (Source: Wolf Creek Nuclear Operating Corporation)

The spray slowed and then stopped as the operators cooled the reactor water temperature below the boiling point. Workers performed a closer examination of the leakage source (see Figure 4) and its consequences. The reactor cooling water at Wolf Creek is borated. Boric acid is dissolved in the water to help control the nuclear chain reaction in the core as uranium fuel is consumed. Once water leaked from the vessel evaporated, boric acid crystals remained behind, looking somewhat like frost accumulation.

Fig. 4 (Source: Freedom of Information Act response to Greenpeace)

The spray from leaking Penetration 77 blanketed many neighbors with boric acid as shown in Figure 5. The vertical tubes are made from metal that resists corrosion by boric acid. The reactor vessel (the grayish dome-shaped object on the left side of the picture) is made from metal that is considerably less resistant to boric acid corrosion. The inner surface of the reactor vessel is coated with a thin layer of stainless steel for protection against boric acid. The outer surface is only protected when borated water doesn’t leak onto it.

Fig. 5 (Source: Freedom of Information Act response to Greenpeace)

The white-as-frost blankets coating the penetrations indicated little to no corrosion damage. But the rust-colored residue in the Figure 6 pictures is a clear sign of corrosion degradation to the reactor vessel head by the boric acid. It may not be déjà vu all over again, but it’s too much Davis-Besse all over again. Boric acid corroded the Davis-Besse reactor head all the way down to the thin stainless steel liner. The NRC determined Davis-Besse to have come closer to an accident than any other US reactor since the March 1979 meltdown at Three Mile Island.

Fig. 6 (Source: Freedom of Information Act response to Greenpeace)

Fortunately, the degradation appears much worse in the pictures than it actually was. Actually, fortune had an ally at Wolf Creek that was missing at Davis-Besse. Both reactors exhibited signs that reactor cooling water was leaking into containment. The indicated leak rates at both reactors were below regulatory limits, except for one anomalous indication at Wolf Creek. Managers at Davis-Besse opted to dismiss the warning signs and keep the reactor operating. Managers at Wolf Creek heeded the danger signs and shut down the reactor. It’s not that they erred on the side of caution—putting nuclear safety first must never be considered an error. It’s that they avoided making the Davis-Besse mistake of putting production ahead of safety.

Wolf Creek restarted on November 21, 2016, after repairing Penetration 77, removing the boric acid, and verifying no significant damage to other penetrations and the reactor vessel head. But they also conducted refueling activities—already planned to require 55 days—during that 80-day period. The NRC closely monitored the response to the leakage and its repair and found no violations.

Davis-Besse chose production over safety but got neither. The reactor was shut down for over two years, generating no revenue but lots of costly repair bills. The reactor vessel head and other components inside the containment extensively damaged by boric acid corrosion were replaced. Many senior managers at the plant and in the corporate offices were also replaced. And the NRC fined the owner a record $5,450,000 for numerous safety violations.

Nuclear Safety Snapshot

Figure 7 shows the reactor vessel head at Wolf Creek without any boric acid blankets and corrosion. But the image I’ll remember about this event is neither this picture, nor the picture of the hole in Penetration 77, nor the picture of the boric acid blankets on adjacent penetrations, nor the picture of rust-colored residue. It’s the mental picture of operators and managers at Wolf Creek who, when faced with Davis-Besse-like cooling water leak indications, responded unlike their counterparts by shutting the reactor down and fixing the problem rather than rationalizing it away. It’s an easy decision when viewed in hindsight but a tough one at the time it was made.

Davis-Besse made headlines, lots and lots of headlines, for exercising very poor judgment. Wolf Creek may not warrant headlines for using good judgment, but they at least deserve to be on the front page somewhere below the banner headline and feature article about today’s bad guys.

Fig. 7 (Source: Freedom of Information Act response to Greenpeace)

Nuclear Safety Video

Unfortunately, the picture of Wolf Creek responding well to a safety challenge is a snapshot in time that does not assure success in facing tomorrow’s challenges.

Fortunately, the picture of Davis-Besse responding poorly to a safety challenge is also a snapshot in time that does not assure failure in facing future challenges.

Nuclear safety is dynamic, more like a video than a snapshot. That video is more likely to have a happy ending when the lessons of what worked well along with lessons from what didn’t work factor into decision-making. Being pulled away from bad choices is helpful. Being pushed towards good choices is helpful, too. Nuclear safety works best when both forces are applied.

The NRC and the nuclear industry made quite the hullabaloo about Davis-Besse. Why have they been so silent about Wolf Creek? It’s a swell snapshot that could help the video turn out swell, too.

Another Delay of Chemical Safety Rule Is Dangerous and Unwarranted

UCS Blog - The Equation (text only) -

Last week was just chock full of setbacks and assaults on our public protections coming out of Washington. You’ve probably heard about President Trump’s all-out attack on climate policy; EPA Administrator Pruitt got right on it.  No surprise there.  Then there was EPA’s decision not to ban a pesticide clearly linked to serious and long-term developmental effects on children’s brains and cognitive function.  But you may not have noticed another harmful decision coming out of the EPA – this one about its Risk Management Program (RMP) rule.

Maybe you have been fortunate enough NOT to have to worry about an explosion, fire, or leak from one of the more than 12,000 facilities that use or store toxic chemicals in the U.S.  But many of our families and communities—especially communities of color or low income communities—are not so lucky.

In the last decade nearly 60 people died, approximately 17,000 people were injured or sought medical treatment, and almost 500,000 people were evacuated or sheltered-in-place as a result of accidental releases at chemical plants. Over the past 10 years, more than 1,500 incidents were reported causing over $2 billion in property damage.  According to whom? The EPA.  And these data don’t begin to capture the daily worry and anxiety of those living or working close to one of those facilities.

One would think that enhancing safeguards to prevent, prepare for, respond to, and manage risks of chemical accidents and releases from our nation’s most hazardous facilities would be a no brainer.  It’s not like we haven’t seen or read about catastrophic chemical incidents.  Like the Chevron Richmond Refinery fire in 2012 that sent 15,000 people to the hospital for emergency treatment.  Or the deadly explosion at the West Fertilizer Company in West, Texas in 2013 that killed 15 people and injured 200 more. Or the 2014 chemical spill in West Virginia that left thousands of residents and businesses without clean water.

I suspect the American public assumes our government views reducing the risk of chemical disasters as a critical priority. And it was making some progress.

The good

For years, community groups, environmental organizations, and labor groups had pressed and petitioned the federal government to adopt stronger measures to prevent chemical disasters.  Finally, and in the wake of several high profile incidents, President Obama issued an Executive Order (EO 13650) in 2013 directing the federal agencies to reduce risks associated with such incidents and to enhance the safety of chemical facilities. Updating EPA’s Risk Management Program rule (under the Clean Air Act’s chemical disaster provision) emerged as one of the top priorities for improving the safety of these facilities.  The EPA then embarked on a multi-year and rigorous process of public outreach, stakeholder engagement, formal requests for information, and notice and comment periods.  The outcome: an updated Risk Management Program rule that includes some common-sense provisions for covered facilities.  For example,

  • Investigating incidents that resulted in or could have resulted in a catastrophic release
    (a so-called “near miss”), including a root cause analysis;
  • Coordinating local emergency response plans, roles, and responsibilities, and conducting emergency response exercises;
  • Improving public access to chemical hazard information;
  • Engaging an independent third-party after a reportable accident to audit compliance; and
  • For three industries with the most serious accident records (oil refineries, chemical manufacturers, and pulp and paper mills), conducting a safer technology and alternative analysis to identify and evaluate measures that could prevent disasters.

The enhanced rule was scheduled to go into effect on March 14, 2017, with longer compliance periods for some provisions (as far out as 2022).

The bad

In March, the EPA issued a 90-day administrative stay, delaying implementation of the rule to June 19, 2017. This followed receipt of petitions by industry groups and several states requesting reconsideration of the rule.  Who were these petitioners?  Some pretty powerful stakeholders.  The RMP Coalition, whose members are … wait for it… the American Chemistry Council, the American Forest & Paper Association, the American Fuel & Petrochemical Manufacturers, the American Petroleum Institute, the U.S. Chamber of Commerce, the National Association of Manufacturers, and the Utility Air Regulatory Group.  Another petition came in from the Chemical Safety Advocacy Group (CSAG), made up of companies in the refining, oil and gas, chemicals, and general manufacturing sectors.  Then came a third petition from 11 states, including Texas and West Virginia.

The ugly

Stay it again.  Attentive to these industrial interests, Mr. Pruitt’s EPA last week proposed a further delay of the effective date of the RMP amendments to February 19, 2019.  So, having waited years for enhanced chemical safety and security safeguards, and after an already lengthy and extensive public process required for rule-making, communities and families at risk of chemical disasters will now have to wait almost another two years while the agency reviews and reconsiders the Risk Management Program amendments.  This delay essentially buys the agency more time to figure out how to redo the rule or repeal it completely. Call me crazy, but I just don’t see the delay resulting in a rule that gets stronger and further strengthens public safety.  The regulated community doesn’t want that to happen, and it has a bigger war chest and easier access to regulators and decision makers than the public interest community does.

Unleashing the power of the (little) people

But here’s what we, the people, do have.  We have voice. We have votes.  We have on-the-ground stories to tell.  We also have local leaders, emergency responders, workers, and school teachers who can attest to the dangers and the need.

The EPA is holding a public hearing as part of its reconsideration on April 19, 2017 in Washington, DC.  And it is taking written comments until May 19, 2017.   No comment or story is too short or too unimportant to tell.  And EPA has to consider all comments as it fashions its response. Tell them a further delay is dangerous, unnecessary, and unconscionable.

You can submit written comments electronically to Docket ID No. EPA-HQ-OEM-2015-0725 at http://www.regulations.gov.  These written comments can be accompanied by multi-media submissions (i.e., video, audio, photos — like maybe of your kids?).  While you’re at it, send a copy of your comments to your federal representatives to let them know that you expect their support for strong chemical safety rules and resistance to any effort to roll back these and other public protections.

One of the facilities in question may be in your neighborhood – or near those you love.  You might not even know. But even if you’re fortunate enough to be some distance away and relatively safe from a chemical explosion, fire, or spill disaster, we all have a stake in public safety and health.  And know that UCS will be there with you.

 

 

Trump Administration Claims ‘No Evidence’ Afterschool Programs and Meals Work. Actually, There’s Plenty.

UCS Blog - The Equation (text only) -

When I sat down with Dr. Jacqueline Blakely to talk about her afterschool program at Sampson Webber Academy in Detroit, our conversation was interrupted. A lot. Parents dropped by to talk about their kids, kids dropped in to talk about their days, and the phone rang like clockwork. It didn’t take long for me to understand that there was something really good going on in this classroom.

“The kids get a hot supper, followed by homework help and an academic hour focused on math and science, and then enrichment—that’s when they do projects,” Dr. Blakely explained. “They’re on ‘fun with engineering’ now, but we’ve done a cooking class, learned how to put a car together, and soon we’ll get to do the NASA challenge. That’s when the kids build an underwater robot and send it through an obstacle course.”

With that in mind, maybe you’ll understand why I winced when I heard White House Office of Management and Budget director Mick Mulvaney’s comments to the press about afterschool programs and the meals they provide. “They’re supposed to be educational programs, right? That’s what they’re supposed to do. They’re supposed to help kids who don’t get fed at home get fed so they do better in school,” he said. “Guess what? There’s no demonstrable evidence they’re actually doing that.”

Omia and Orari participate in the afterschool program at Sampson Academy in Detroit. Photo: Jacqueline Blakely

Sampson is a 21st Century Community Learning Center (CCLC), a grant-funded program providing 1.8 million children in high-poverty areas with academic, STEM, and cultural enrichment activities during out-of-school hours, as well as snacks and hot meals. According to the budget blueprint released by the Trump administration last month, funding for these programs is set to be eliminated.

But make no mistake—it’s not because they don’t work for kids.

On the contrary, the most recent national performance data for the 21st CCLC program revealed substantial improvements in both student achievement and behavior. Combined state data indicated that over a third of regular attendees (36.5 percent) achieved higher grades in mathematics through program participation, and a similar number (36.8 percent) achieved higher grades in English. Teachers reported that 21st CCLC students increased homework completion and class participation by nearly 50 percent, and over a third (37.2 percent) demonstrated improvements in behavior. Research from the Global Family Research Project supports the conclusion that sustained participation in afterschool programs can lead to greater academic achievement, improved social skills and self-esteem, decreased behavioral problems, and the development of positive health behaviors.

“Kids are getting experiences that schools like ours don’t have the money to provide,” says Dr. Blakely. “I have kids that walk two and three miles home afterwards because the bus doesn’t stay that late. They do that all winter long—that says a lot about this program.”

And about the meals—I don’t mean to insult anyone’s intelligence, but how much data do you need to prove that proper nutrition is important for learning and development?

From a 2014 report by the Centers for Disease Control and Prevention, titled Health and Academic Achievement: “Hunger due to insufficient food intake is associated with lower grades, higher rates of absenteeism, repeating a grade, and an inability to focus among students.” In addition to academic outcomes, food insecurity negatively correlates with measures of health status, emotional wellbeing, productivity, and behavior among school-aged children. There are scores of studies linking nutritional status with academic performance among youth.

Contrary to common assumptions about who is served by federal assistance programs, these issues don’t just affect students in urban areas like Detroit. Food insecurity affects 16 million children across the United States, and of U.S. counties with high child food insecurity rates, a majority (62 percent) are rural. Stripping funding from 21st CCLC programs will be felt deeply in many underserved communities, among them considerable segments of Trump’s own voter base.

I asked Dr. Blakely about her response to the proposed funding cuts. “It upsets me. It further marginalizes kids that are already marginalized, and it makes a bigger gap between the poor and the wealthy.” She paused. “It makes me angry, too. You already acknowledged that they don’t get food at home—so you know they need it. Why would you stop a program that feeds children?”

What this comes down to, regrettably, is yet another display of the administration complacently setting aside the needs of low- and middle-income families, urban and rural alike, to pursue its own agenda. Afterschool programs may not work for the president’s budget, but there’s no question that they work for kids.
