Combined UCS Blogs

For This DC School, Every Month is Farm to School Month

UCS Blog - The Equation (text only)

Photo: Sarah Reinhardt

It’s the end of October, which means National Farm to School Month is drawing to a close. But that doesn’t matter to the students at School Within School in northeast DC—for them, it’s always farm to school month.

Thanks to a farm to school tour hosted by DC Greens and the National Farm to School Network, I was lucky enough to visit a handful of the cutest (and smartest) gardeners in the district as they cooked up some ratatouille with their fall harvest. At School Within School, kids from three years old through fifth grade get to participate in FoodPrints, a gardening, cooking, and nutrition education program that integrates science, math, and social studies into hands-on lessons about local food.

You may have heard that farm-to-school programs support the economy (they generate an additional $0.60 to $2.16 in economic activity for every dollar schools spend on local foods), benefit public health (they help kids choose healthier options and eat more fruits and vegetables in school and at home), and foster community engagement (they fuel interest in local foods and offer opportunities to combat racial and economic inequities). But if you’re like me, you may have learned most of this from behind a computer screen. It’s another thing entirely to see farm-to-school programs in action, and to hear firsthand what they could accomplish for our kids and communities with the right funding and support.

Walking in the footsteps of FoodPrints

Around the corner from School Within School, Ludlow-Taylor Elementary also boasts a beautiful school garden. Photo: Sarah Reinhardt

Our tour, led by FRESHFARM director of education Jenn Mampara, kicked off with a quick stop at the chicken coops and then took us to the school garden, where the kids go for lessons once a week. The gardener plants summer crops in August, and when school starts, kids get to weed, water, harvest, replant, and repeat through late spring. During the summer, the garden soil is kept healthy with a rotation of cover crops and beans, which are then dried and used in the fall.

From the garden, we headed up to the teaching kitchen, where students were busy mixing together beans and onions (“It’s watering my eyes!”) from the garden. Produce from the garden is supplemented by local farmers market produce to provide all the ingredients for the monthly cooking lesson that each FoodPrints student attends. The lesson on ratatouille moved fluidly from math (“What will happen if I add one cup of water?”) to science (“Why do we need to soak the beans?”) and back again, and students were engaged in active learning every step of the way.

Watching the educator walk the kids through their recipe, it struck me as wholly unsurprising that studies have shown that kids participating in farm to school programs display greater overall academic achievement, as well as social and emotional growth. Needless to say, kids who participate in farm to school programs also tend to show increased knowledge about gardening, agriculture, and healthy eating.

“It’s a meaningful experience for these kids to have in elementary school,” Mampara said. “This will have a lasting impact on their understanding of good food and where it comes from.”

And speaking of good food—the cooking doesn’t stop in the teaching kitchen. Once a week, the school cafeteria borrows a recipe from FoodPrints, so that kids continue to connect their experience in the garden to the food on their plate. Kristen Rowe, the Nutrition and Compliance Specialist at DC Public Schools (DCPS), said that when students are involved in the entire process, they’re more willing to try foods like fruits and vegetables. “This initiative has created an appreciation and a connection between our students and nutritious food, and it’s evident in our cafeterias on FoodPrint days!”

Farm to school funding is in high demand

Students use foods from the school garden and local farmers markets in the teaching kitchen at School Within School. Photo: Sarah Reinhardt

But the success of farm to school programs like FoodPrints, which currently operates in 10 DC schools, can come at a price. Rob Jaber, Director of Food and Nutrition Services at DCPS, said he would like to expand the program to serve all DC students, and to do that, he needs resources. Since the “heat and serve” model became a staple of school food service, many schools now lack the equipment and kitchen skills needed to start making food from scratch again.

Jaber hopes that DCPS will soon be the recipient of a USDA Farm to School grant, one of the most sought-after funding sources for districts looking to adopt or expand food-based curriculum. The USDA Farm to School Grant Program, established in 2010, provides $5 million annually to fund training, planning, equipment, gardens, education, and other operational costs for farm to school programs nationwide. While that may seem like a lot, it meets only a fraction of the need demonstrated by schools. To date, 365 grants totaling $25 million have been awarded out of more than 1,600 applications requesting more than $120 million. This means that, on average, only about one in five applications receives funding.

DC Central Kitchen, a community kitchen and job training program providing meals to 12 schools in the district, was awarded a grant back in 2012. Theresa Myers, DC Central Kitchen’s Foundation and Government Relations Manager, explained how their food service capacity flourished with the grant. The organization received $100,000 to purchase new equipment and hire additional staff, and increased their processing and storage capacity by nearly a third as a result. DC Central Kitchen now purchases over $350,000 in local foods from 30 regional farmers each year, which means that about half of every tray of food served in these schools is local.

What’s happening in DC Public Schools is a microcosm, explained Maximilian Merrill, National Farm to School Network Policy Director. “This is a great model of what’s going on across the country.”

Photo: Sarah Reinhardt

A farm bill for farm to school

Not far from the garden, Senators Patrick Leahy (D-VT) and Thad Cochran (R-MS) and Representatives Marcia Fudge (D-OH) and Jeff Fortenberry (R-NE) are also thinking about how to support successful farm to school programs around the country. On September 6th, they introduced the Farm to School Act of 2017, which would increase annual funding for the USDA Farm to School Grant Program from $5 million to $15 million; make the grants more accessible to a broader range of childcare settings and populations, including early child care, summer food service, after school programs, and tribal schools; and help beginning, veteran, and socially disadvantaged farmers and ranchers sell more of their produce through farm to school programs.

While National Farm to School Month is almost over (until next year), the farm bill is just getting started. To show your support for farm to school programs, you can sign on to this letter of support written by the National Farm to School Network endorsing the Farm to School Act of 2017. (You can also sign on behalf of an organization.)

Increasing the funding available for programs like FoodPrints by threefold means triple the opportunities for education and engagement, triple the economic benefit, and triple the happy and healthy kids. If that doesn’t water your eyes, I don’t know what will.


Spectrum of Harm: Ripple Effects of Trump’s Macabre Environmental Policies

UCS Blog - The Equation (text only)

Photo: Yvette Arellano/TEJAS

The front pages of last Sunday’s Washington Post and New York Times starkly exposed how concerned citizens, fearful immigrants, and career scientists alike are smothered by the Trump administration’s macabre environmental policies.

Worming through the EPA

The Times zeroed in on Nancy Beck’s voracious worming through the Environmental Protection Agency’s regulations of dangerous chemicals and poisons. She is figuratively trying to eat enough holes in the rules to make chemicals harder to track and control, thereby shielding polluters from prosecution.

Beck is the former regulatory science policy director of the American Chemistry Council and was appointed by President Trump in May as deputy assistant administrator in the EPA’s Office of Chemical Safety and Pollution Prevention. Before the American Chemistry Council, she served in the George W. Bush White House, where she badgered the EPA in such a picayune manner for proof of chemical harms that she was criticized by the nonpartisan National Academy of Sciences.

During President Obama’s tenure, Beck performed the same function for the nation’s leading chemical lobbyist, questioning regulation on arsenic and other chemicals used in perfumes and dry cleaning. But Trump’s election turned the world upside down. Beck was brought back by his White House under special provisions that exempted her from ethics rules that would have prevented her from being involved in decisions involving former employers.

She has since wasted no time showing herself to be a puppet of industry instead of a searchlight for safety.

Weakened rules on a kidney cancer-causing chemical and other harmful toxins

The Times highlighted Beck’s heavy hand in weakening rules on perfluorooctanoic acid (PFOA), a kidney cancer-causing agent that many large companies, including BASF, 3M, and DuPont, volunteered to phase out during the Obama administration. PFOA is present in such common items as nonstick kitchenware and stain-resistant carpeting.

But even with a total phase out, that chemical remains in millions of cabinets and on millions of floors around the nation. Beck’s rewriting of the rules means that the EPA would no longer make risk assessments on “legacy” use of products containing PFOA or their storage or disposal. That so alarmed staffers at the EPA’s Office of Water that they wrote a memo, obtained by the Times, warning that PFOA’s potential to continue to pollute drinking water and ground water remains so strong that it “is an excellent example of why it is important to evaluate all conditions of use of the chemical.”

PFOA is only one of several chemicals that the Trump administration and EPA Administrator Scott Pruitt, with Beck as a relatively little-known henchwoman, are shielding from scrutiny to protect industry. Before Beck arrived, the Trump EPA had already refused—over the objection of staff scientists—to ban chlorpyrifos, a pesticide believed to stunt child development. The agency, as the Times reports, is also reconsidering proposed bans on methylene chloride and trichloroethylene, which are used in paint strippers and dry cleaning, respectively, and are linked to illness and death.

A relentless assault

This relentless assault is remaking the EPA into the Everyday Pollution Agency and has reached such a pitch under President Trump that Beck’s immediate boss, Wendy Cleland-Hamnett, left last month after 38 years with the agency. Cleland-Hamnett supported the ban on chlorpyrifos and had a long track record of elevating public health impacts in her consideration of chemical harms, particularly on lead paint in homes. As she told the Times, science can rarely be 100 percent sure about anything, but if a chemical is “likely to be a severe effect and result in a significant number of people exposed . . . I am going to err on the side of safety.”

With a White House that now errs on the side of industry, Cleland-Hamnett told the Times that she resigned because “I had become irrelevant.”

“You can’t let your windows up and enjoy a fresh breeze”

A Trump administration dedicated to making science and scientists irrelevant surely has worse in store for everyone else. Last year, the Union of Concerned Scientists and the Texas Environmental Justice Advocacy Services released a report revealing how low-income residents and people of color regardless of income are more likely to live near toxic chemical facilities in the Houston area.

The Post’s story on Corpus Christi shows that the worst is happening already. For decades, the predominately African American and Latino community of Hillcrest has abutted a massive oil refinery complex that includes a gasoline production facility for Citgo and a Koch brothers plant making jet fuel for the Dallas/Fort Worth International Airport.

One resident, 56-year-old Rosie Ann Porter, who retired from a job supplying helicopter parts, told the Post that her daughter grew up with serious asthma. Neighbors complained of other chronic lung diseases. Porter said, “You can’t let your windows up and enjoy a fresh breeze coming through the house. When they’re up and the refinery’s spilling out those fumes, it’s nothing nice.”

In a solid example of environmental justice reporting that displayed the agency of residents, the Post made it clear that the residents refused to succumb to the fumes without many fights. But victories were fleeting. A federal jury found Citgo guilty in 2007 of spewing benzene, a known carcinogen, into the community. The company was fined $2 million, but the verdict and fine were completely overturned on appeal, based on improper instructions to the jury. A report last year by the Department of Health and Human Services found rates of asthma and cancer among males that were higher than the Texas average.

Living in the shadows of soot

More recently, Hillcrest rose up against a massive proposed $500 million bridge spanning high above the Corpus Christi shipping channel. The bridge would allow supertankers to ply beneath it, but construction of the span and a highway addition would completely box in and isolate Hillcrest from any other neighborhood.

Citing civil rights laws and banking on support from an Obama administration sympathetic to the history of highway projects ripping apart communities of color, Porter and other Hillcrest residents filed a complaint with the Federal Highway Administration. The complaint resulted in an unusual compromise in 2015. Texas officials were so eager to increase commerce that they agreed to buy out residents at two or three times their average home value of around $50,000.

That victory came with some major asterisks. One is that the buyouts still may never fully compensate residents for home values that were depressed for decades because of the encroaching refineries. Another is that undocumented residents, who the Obama administration assumed were eligible for buyouts under nondiscrimination laws, were cut out of the deal by Texas once the Trump administration took over with its anti-Latino immigrant animus. No one knows how many undocumented families are affected because a public complaint might result in deportation.

Thus one set of people, after decades of industrial abuse, is about to set off for other parts of Texas with a payout that may or may not help them buy new homes. Another set of people will continue to live fearfully in the shadows of soot. And in Washington, an administration continues to draw a shroud over environmental protections, working toward the day that science and safety are, to borrow from Wendy Cleland-Hamnett, irrelevant.

This Is Your Planet on Sea Level Rise. Any Questions?

UCS Blog - The Equation (text only)

This is the extent of flooding from Hurricane Sandy in Cape May, NJ (left) vs. the area that would flood twice monthly by 2100 due to sea level rise (right)

One of the most powerful televised public service announcements of my youth inspired this post.

There are moments when your own data stops you dead in your tracks. I had one of those moments a few months ago as we were preparing to release our When Rising Seas Hit Home report.

The results were so stark, the case for sound climate policy so clear that I can think of no better way, on the eve of the fifth anniversary of Hurricane Sandy’s devastating landfall, to convey where sea level rise could take us than by spoofing what was arguably the most powerful televised public service announcement of my youth.

Is there anyone out there who still isn’t clear about what sea level rise does? OK. Last time.*

This was the extent of flooding from Hurricane Sandy on Long Island

This is the area that would flood twice monthly by 2100 due to sea level rise

Any questions?

This is the extent of flooding from Hurricane Sandy in northern New Jersey (left) vs. the area that would flood twice monthly by 2100 due to sea level rise (right)

Any questions?

This is the extent of flooding from Hurricane Sandy in Cape May (left) vs. the area that would flood twice monthly by 2100 due to sea level rise (right)

Any questions?

*Yes, this was the actual language used in the PSA.

Data sources: UCS When Rising Seas Hit Home; FEMA Hurricane Sandy Impact Analysis; OpenStreetMap; Partnership for a Drug-Free America

Kristy Dahl

#Sandy5: Will the Nation Act on Climate Change Reality?

UCS Blog - The Equation (text only)

Aerial views of the damage caused by Hurricane Sandy to the New Jersey coast taken during a search and rescue mission by 1-150 Assault Helicopter Battalion, New Jersey Army National Guard, Oct. 30, 2012.  (U.S. Air Force photo/Master Sgt. Mark C. Olsen)

The 29th of October marks the 5-year anniversary of when Hurricane Sandy first made landfall on the mid-Atlantic coast of the U.S. It comes at a time when Americans are reeling from the unprecedented hurricane season that devastated communities in Florida, Texas, Puerto Rico, and the US Virgin Islands. Lives were lost, homes were destroyed, schools, hospitals, and other essential services were interrupted, and energy, transportation, and water systems and other infrastructure were fractured. Associated health challenges can be life threatening and have lingering mental health impacts. The ability of homeowners to handle the economic damages is in question given the level of destruction to homes and the low take-up rate of flood insurance. Many homeowners impacted by these hurricanes are falling behind on their mortgages.

Hurricane Harvey may have exposed more than 160 of EPA’s Toxic Release Inventory sites, 7 Superfund sites, and 30 facilities registered with EPA’s Risk Management Program to flooding.

The 2017 hurricane season’s recovery challenges will sound familiar to the communities remembering the devastation Hurricane Sandy wrought. At that time, multiple weather systems collided and hit one of the most populated places in the mid-Atlantic region. While the Obama administration declared federal disasters in 12 states and the District of Columbia, New Jersey and New York felt the brunt of the destruction. The storm claimed 159 lives, damaged or destroyed 650,000 homes, and forced thousands of businesses to close.

The climate change fingerprint on Hurricane Sandy

Before Hurricane Sandy made landfall near Brigantine, NJ, it formed over the Caribbean, developing into a Category 1 hurricane on October 23, 2012, making landfall in the subsequent days near Kingston, Jamaica, and then striking southeastern Cuba and Haiti as a Category 2 hurricane. Experts spoke to the climate connection, here, here, and here, and since then we have gained even more ground on the climate change fingerprint on Hurricane Sandy.

Scientists estimate that without climate change driven sea level rise, the flood footprint of Sandy in New York City would have been at least 10% smaller than observed, equivalent to $2 billion less in damages, 11.4% fewer people affected, and 11.6% fewer housing units flooded.

Just this week, Proceedings of the National Academy of Sciences published new science (reported on here) that finds flooding in New York City due to tropical cyclones will be more frequent and the flood heights will be higher because of rising sea levels. Using a worst-case scenario for sea level rise, which combines the Intergovernmental Panel on Climate Change high-emissions scenario with newer research on accelerated melting of Antarctic ice sheets, the authors find that sea level rise in New York City could reach 3 to 8 feet by 2100 and “500-year” flood events could happen every five years.

UCS’s own analysis, When Rising Seas Hit Home: Hard Choices Ahead for Hundreds of US Coastal Communities, looks at the entire coastline of the lower 48 states and identifies communities that will experience flooding so extensive and disruptive that it will require either expensive investments to fortify against rising seas or preparations by residents and businesses to abandon areas they call home. We found that, by 2100 under a high emissions scenario, flooding as extensive as the storm surge from Hurricane Sandy would happen every other week. We also found that hundreds of communities could avoid this type of chronic inundation if we keep future warming below 2°C. In New York, curtailing future warming and sea level rise could spare two communities from chronic inundation by 2060 and three to 13 communities by the end of the century—including four boroughs of New York City.

Community recovery

Five years later, what does recovery look like? The New Jersey Resource Project released The Long Road Home, a report that compiles stories by Sandy survivors, facts on the recovery process based on a survey of 500 survivors, and local, state, and federal recommendations. It’s a sober reminder of the long road to recovery that the communities impacted by Hurricanes Irma, Harvey, and Maria now face.

In a step in the right direction, this week NJ Governor Christie announced an additional $75 million for the state’s Blue Acres buyout program for flooded homes to help move people to less risky areas. Buyouts are widely recognized as an essential flood risk resilience tool, but post-Sandy research has found that improvements can and need to be made to ensure people truly are better off after the buyout. Additional research indicates we still have a lot to learn about how to do buyouts well, and hopefully the proposal the Federal Emergency Management Agency (FEMA) is reportedly considering will take these recommendations into account.

Community mobilization

Communities hit by Sandy know these realities all too well and they’re mobilizing and calling for action. A few examples include:

  • In New York City, communities are mobilizing around #Sandy5 and will march on October 28 to demand action by their elected officials.
  • In New Jersey:
    • NJ Future convened the Shore of the Future symposium to underscore the need for a regional approach to coastal resilience, as well as state actions the new governor can take in the face of climate change.
    • Floodplain managers are convening their 13th annual conference to address changes to federal agencies and policies under the Trump administration, as well as the changes that will come with the upcoming election of a new governor.

These efforts are a call to action at all levels of government, but particularly the federal government. So what’s this administration doing?

As warnings on climate change risks soar, the Trump administration’s actions place the nation in peril

The Sandy anniversary comes at a time when we could not have imagined the 2017 hurricane season, nor the vast degree of President Trump’s rollbacks of climate change policy, including pulling the U.S. out of the Paris Accord and repealing the Clean Power Plan.

In the first six months of this administration, UCS tracked the vast efforts to sideline science, including appointing climate deniers to head federal agencies and conflicted individuals to scientific leadership positions. One can argue that the Department of Defense is the last department where federal employees can speak to climate change.

Last month the watchdog for the federal government, the Government Accountability Office (GAO), released its “high risk” report calling for government-wide action to reduce fiscal exposure to climate change by better managing climate change risks. This week, GAO released a report on the economic costs of climate change, finding that without climate change in check, costs could be as high as $35 billion per year by mid-century; Zillow’s recent analysis, meanwhile, finds that nearly 2 million U.S. homes could be underwater in 80 years if oceans rise 6 feet or higher by 2100. The GAO’s 2017 “high risk” report includes broad strategic recommendations as well as specific actions to:

  • Protect federal property and resources by having federal agencies consistently implement the Federal Flood Risk Management Standard (as well as other measures);
  • Build climate resilience into the requirements of the federal flood and crop insurance programs;
    • President Trump: released a piecemeal reform proposal for reauthorizing the National Flood Insurance Program.
  • Provide technical assistance to federal, state, local, and private-sector decision makers;
    • President Trump: signed an executive order revoking executive orders that helped prepare the nation for climate change, encouraged private investment in reducing pollution, and ensured our national security plans consider climate change impacts.
  • Under disaster aid, adequately budget for and forecast the costs of disasters.

Here’s what an effective national disaster response looks like

In recognition of the size and magnitude of Hurricane Sandy, in just over a month President Obama signed an Executive Order establishing the Hurricane Sandy Rebuilding Task Force. Led by HUD Secretary Shaun Donovan, the task force brought together 23 federal agencies and local and state leaders from NY, NJ, CT, MD, RI, and the Shinnecock Indian Nation. At the foundation of the task force report was the use of better science and technology to inform decisions in rebuilding and recovery efforts. The task force recommendations included:

  • Promote resilient rebuilding based on current and future risk and through innovative ideas
  • Ensure a regionally coordinated, resilient approach to infrastructure investment
  • Provide affordable housing and protect homeowners
  • Support small businesses and revitalize local economies
  • Address insurance challenges, understanding, and accessibility
  • Build local governments’ capacity to plan for long-term rebuilding and prepare for future disasters

Why is this approach a model response for federally declared disasters? It’s a model that recognized a national response must bring the “whole of government” and communities together, including those who have been historically marginalized. At the core was the understanding that we must plan for the future based on the latest climate science and future conditions. The task force also fostered creative, comprehensive, innovative strategies to increase community resilience, bringing in all sectors and types of infrastructure. And it understood that keeping small businesses’ lights on and building the capacity of local governments to plan for long-term rebuilding were both vital to sustaining local economies.

Congressional actions needed

This week the Senate approved $36.5 billion in disaster assistance, which comes after a $15 billion disaster assistance package passed after Harvey. The new package includes $18.7 billion to build up the Federal Emergency Management Agency’s main Disaster Relief Fund, $16 billion for the National Flood Insurance Program, and $1.2 billion for nutrition assistance and wildfire funding.

In contrast, after Sandy, in December of 2012, the administration submitted a $60.4 billion request for supplemental funding for disaster assistance and recovery. In 2013 Congress approved the Disaster Relief Appropriations Act of 2013, or the “Sandy relief bill,” at $50.5 billion (as well as an additional $9.7 billion in new borrowing authority for the NFIP).

As total cost estimates of damages are still coming in, we know the congressional disaster relief packages to date are just a drop in the bucket. Towns, cities, and states cannot be left on their own to face the costs of increasing their communities’ resilience in the face of climate change. After the battering one-two-three punch of Hurricanes Harvey, Irma, and Maria, and on the 5-year anniversary of Hurricane Sandy, Congress must come together on bipartisan actions in the near-term to:

  • Defend science by protecting federal agency budgets;
  • Pass legislation to reinstate the flood risk management standard to ensure federally funded infrastructure is “flood-ready”;
  • Reauthorize the National Flood Insurance Program to ensure risk-based flood insurance rates as well as affordability of flood insurance, science-informed flood maps that reflect risk and future conditions including climate change driven sea level rise, consumer protections, and robust pre-disaster mitigation measures that invest in innovative strategies to reduce risk through buy-out programs and nature-based solutions; and
  • Authorize the Department of Defense to invest in understanding and mitigating the risks climate change poses to our national security and our military installations and surrounding communities.

Will Congress take bipartisan action to pass these policies?

The jury is out, but the sad reality is that even these steps won’t be sufficient to address the water that will come. In fact, as we make the case here, we must have a comprehensive strategy to:

  • phase out policies that encourage risky coastal activities;
  • bolster existing policies and funding;
  • and put forth bolder, comprehensive solutions to help communities retreat from risky areas.


Science and Democracy Engages the Science of Democracy: The Kendall Voting Rights Fellowship

UCS Blog - The Equation (text only)

Photo: Peter Dutton/CC BY-NC-SA 2.0 (Flickr)

This fall, I am excited to help launch a new chapter in the Union of Concerned Scientists’ commitment to putting science to work toward building a healthier planet and a safer world. My research training is in the field of electoral systems and their impact on representation and public policy. I am most recently a co-author on the book Gerrymandering in America: The Supreme Court, the House of Representatives and The Future of Popular Sovereignty. As the new Kendall Voting Rights Fellow, I will be studying the impact of elections on many of the broader policy goals that UCS is pursuing.

In this context, a healthier and safer world means an open, resilient electoral system that can fairly and accurately convert aggregated preferences into policy choices. There are a number of opportunities where UCS can make an important contribution to improving voting rights and electoral institutions.

Fighting voter fraud and voter suppression

Perhaps no area in the field of voting rights has received more attention than President Trump’s claim that millions of people voted illegally to deny him a popular vote majority. The entire event serves as a painful reminder of how scientists can lose control over how their research is used. While we can say with absolute certainty that the president’s claim is false, there is no scientific consensus on how to best estimate levels of voter fraud.

We know that voter fraud occurs, but techniques for accurately estimating the number of fraudulent votes cast (a figure that lies somewhere between the number of allegations and the number of convictions) are difficult to get right (but see discussions here and here). UCS will be working to provide the public and policy makers with the most accurate science available, and we will work with partner organizations to build safeguards that assure the integrity of voting for all who are eligible, especially those who are at risk of being disenfranchised through overly restrictive eligibility and voting requirements.

Similarly, there is a great deal of anecdotal information about the negative impact of voter suppression tactics. Previous scientific studies have shown that laws like voter identification requirements have a negative impact on minority groups, but more recent work has been critiqued on methodological grounds. UCS will work to identify the participatory consequences of administrative laws ranging from early registration deadlines to online and automatic registration, early voting, and ballot access laws.

Improving policy outcomes through electoral integrity

When voting rights are compromised, it is typically because policy makers are attempting to insulate themselves from electoral pressure on critical policy issues. As we have seen in areas as diverse as climate policy, transportation, food systems and even global security, when policy makers are shielded from electoral accountability, they are more susceptible to the influence of powerful, narrow interests.

Scientists are in a unique position to bring policy expertise to questions regarding the consequences of restrictive election laws. In addition to assessing the extent to which our current electoral methods, including administrative law, electoral districting, and the Electoral College, create opportunities and threats to electoral integrity and security, we will work to identify how these methods impact legislative policymaking and social outcomes.

UCS already plays a crucial role in educating the public and combating environmental racism and public health risks. However, there is a gap in research exploring the link between electoral gamesmanship and the environmental and health injustices that afflict the most vulnerable communities. UCS can draw on expertise across numerous fields to help fill this gap.

In addition, analyses and research products on electoral integrity that UCS can provide will allow us to strengthen existing partnerships with communities dedicated to the advancement of environmental justice, political equality, and human rights. From local communities to the U.S. Capitol, we will work with organizations to improve electoral integrity through the adoption of open and secure election laws.

Developing a reform agenda

An area that should be of particular interest to the UCS community and members of the Science Network is election information security and technology, a field where computer scientists have as much to say as political scientists. There is widespread agreement not only that many of the nation’s voting machines are outdated and vulnerable to hacking, but that cyber-attacks on election software and records will play an ever-increasing role as a threat to electoral integrity.

Moreover, there is increasing evidence that the very structure of systems such as the Electoral College creates security threats by focusing the attention of hackers and disinformation campaigns on a small number of states that can swing a presidential election. UCS will partner with advocacy organizations to analyze threats and innovations in election security, and advocate for evidence-based policies to address these threats.

And as this month’s landmark gerrymandering case before the Supreme Court also made clear, scientists are playing a major role in providing recommendations about how to determine racial and partisan discrimination in districting plans, and studying the consequences of proposed legislative remedies. In addition to identifying causes and remedies for electoral discrimination, UCS experts can help fill the gap of understanding how such discrimination affects environmental, health and related policies, which tend to negatively impact specific populations.

Rigorous analysis of the impact of electoral integrity on policy, and the ways that electoral discrimination impacts our quality of life, will provide critical support needed for reform. The challenges are clear, but so is the mission: to understand and engineer the democratic institutions that we need to build a healthier planet and a safer world.

I Am a 30-Year Veteran Scientist from US EPA; I Can’t Afford to Be Discouraged

UCS Blog - The Equation (text only)

. . . And neither can you.

Since January, we have seen a continual assault on our environmental protections. EPA has put a political operative with no scientific experience in charge of vetting EPA grants, and the agency is reconsidering an Obama-era regulation on coal ash. The well-established legal processes for promulgating environmental regulations, and—very pointedly—the science underlying environmental regulation are being jettisoned by the Trump administration. As scientists, we must stand up for science and ensure that it is not tossed aside in public policy and decision-making.

Rigorous science is the foundation of EPA

Attending a march with some friends.

While at US EPA, I served as a senior scientist in human health risk assessment.  I was among the cadre of dedicated professionals who worked long, hard, and intelligently to provide the science supporting management of risks from exposure to environmental contaminants. Often, we engaged in the demanding practice of issuing regulation.

Regulations to limit human and environmental exposure are not developed overnight. The laws that enable US EPA to issue regulations specify requirements and procedures for issuing rules; these can include notice of proposed rulemaking, multiple proposed rules, public comments on proposals, responses to comments, more proposals, more comments, review by other Federal bodies, review by States, review by Tribal governments—review, review, review. Often, the environmental laws also note requirements for the science leading to risk management choices. For example, the Safe Drinking Water Act (SDWA), as amended in 1996, requires several judgments to be met affirmatively before any contaminant can be limited through regulation.

The US EPA Administrator must base his or her judgment, among other factors, on what SDWA calls the best available, peer-reviewed science.  This refers not only to experimental or epidemiologic studies, but also to the US EPA documents analyzing the risks and the best ways to mitigate them.

Requirements to regulate environmental contaminants in other media are no less rigorous.  To regulate emissions from coal- and oil-fired boilers used in electrical power generation, US EPA engaged in major scientific programs to understand the nature of these air pollutants (including toxic mercury), the risks they pose, and how best to deal with them. This began in 1993 and culminated in the Mercury and Air Toxics Standards (MATS) finalized in 2012. Building the scientific basis for the rule spanned several administrations and a few careers.  It was frustrating at times, and exhausting, but we kept our focus on the goal of doing the right thing to improve public health.

Regulation protects the public—and we’re watching it be undermined

The message here is that environmental regulation based on sound science is not a trivial exercise, nor should it be. Regulation can be costly, and sometimes may have societal impacts. But ask anyone who has lived in a society without sound environmental regulation, and she will tell you that legally enforceable limits on environmental contaminants are necessary. We estimated that each year the implemented MATS rule prevents 11,000 premature deaths and more than 100,000 heart and asthma attacks. And it greatly reduces release of mercury, which accumulates in fish and poses risk of neurotoxic effects to both developing children and adults.

The process that EPA follows to publish a regulation must also be used to reverse a regulatory action. Creating regulations is not a simple process—but undermining, overturning, and not enforcing regulations is easy and has major consequences for health and the environment. I fear that both the process and the science are being given short shrift as this administration acts to reverse sound regulatory decisions made by US EPA. This dismantling of environmental protection has begun in earnest, and I expect it will have severe, long-lasting effects.

Scientists must defend evidence-based regulation

There are ways to impede the regulatory roll-back. Writing, calling, emailing elected officials is one avenue. Another avenue is joining groups such as Save EPA, an organization of retired and former US EPA employees with expertise in environmental science, law, and policy. We are using our collective skills to educate the public about environmental science, environmental protections, and the current Administration’s assault on US EPA and our public health. You can help by reading our guide to resisting de-regulation; submitting public comments on rules being considered for rollback; and supporting our efforts to defend environmental regulations. As scientists, we must continue to insist on the validity and thoroughness of our discipline, and we must repeatedly communicate about this to decision-makers. In one of many hearings and reviews of mercury hazard, my late scientist friend and US EPA veteran Kathryn Mahaffey quoted John Adams: “Facts are stubborn things.” She was right.

Rita Schoeny retired from US EPA in 2015 after 30 years, having served in roles such as Senior Science Advisor for the Office of Science Policy, Office of Research and Development, and Senior Science Advisor, Office of Science and Technology, Office of Water. She has been responsible for major assessments and programs in support of several of EPA’s legislative mandates, including the Safe Drinking Water Act, Clean Water Act, Clean Air Act, and Food Quality Protection Act. Dr. Schoeny has published extensively in the area of human health risk assessment.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Update: Turkey Point Fire and Explosion

UCS Blog - All Things Nuclear (text only)

An earlier commentary described how workers installing a fire retardant wrap around electrical cables inside Switchgear Room 3A at the Turkey Point nuclear plant in Florida inadvertently triggered an explosion and fire that blew open the fire door between the room and adjacent Switchgear Room 3B.

I submitted a request under the Freedom of Information Act (FOIA) for all pictures and videos obtained by the special inspection team dispatched by the NRC to Turkey Point to investigate this event. The NRC provided me 70 color pictures in response to my request. This post updates the earlier commentary with some of those pictures.

The workers installing the fire retardant wrap cut the material in the hallway outside the switchgear rooms, but trimmed the material to fit as they put it in place. The trimming process created small pieces of carbon fiber. Ventilation fans blowing air within the switchgear room carried the carbon fiber debris around. The picture taken inside Switchgear Room 3A after the event shows some of the carbon fiber debris on the floor, along with debris caused by the fire and explosion (Fig. 1).

Fig. 1 (Source: Nuclear Regulatory Commission)

Some of the carbon fiber debris found its way inside metal panels containing energized electrical equipment. The debris created a pathway for electrical current to arc to nearby metal bolts. The bolts had been installed backwards, resulting in their ends being a little closer to energized electrical lines than intended. The equipment operated at 4,160 volts, so the arc to an undesired location was quite powerful (Fig. 2).

Fig. 2 (Source: Nuclear Regulatory Commission)

Law enforcement officers sometimes use Tasers to subdue a suspect. Taser guns fire two dart-like electrodes into the body to deliver an electric shock that momentarily incapacitates a person. The nuclear Taser at Turkey Point triggered an explosion and fire. The picture shows damage to a metal panel from the High Energy Arc Fault (HEAF) (Fig. 3).

Fig. 3 (Source: Nuclear Regulatory Commission)

Fortunately, there was not much combustible material within the switchgear room to sustain a fire for long. Fig. 4 shows some of the fire and smoke damage inside the switchgear room.

Fig 4 (Source: Nuclear Regulatory Commission)

The primary consequence of the explosion and fire in Switchgear Room 3A was damage to Fire Door 070-3 leading to adjacent Switchgear Room 3B. The Unit 3 reactor at Turkey Point has two switchgear rooms containing power supplies and controls for plant equipment. The fire door’s function is to prevent a fire in either room from affecting equipment in the adjacent room, minimizing the loss of equipment (Fig. 5).

Fig. 5 (Source: Nuclear Regulatory Commission)

The metal fire door had a three-hour rating, meaning it was designed to remain intact even when exposed to the heat from a fire lasting up to three hours. The plant’s design assumed that a fire would be extinguished within that time. The plant’s design had also considered the forces caused by a HEAF event, but only looked at components within three feet of the arc (Fig. 6).

Fig. 6 (Source: Nuclear Regulatory Commission)

The force of the explosion pressed so hard against the fire door that it broke the latch and popped the door wide open. The fire door was more than 14 feet from the arc (even farther away after the explosion), but apparently was not aware of the 3-foot assumption (Fig. 7).

Fig 7 (Source: Nuclear Regulatory Commission)

I don’t have a picture of the fire door and its latch pre-explosion, but this closeup of the door’s latching mechanism suggests the magnitude of the force that popped it open. This picture also suggests the need to go back and revisit the 3-foot rule (Fig. 8).

Fig. 8 (Source: Nuclear Regulatory Commission)

The explosion and fire triggered the automatic shutdown of the Unit 3 reactor. The Shift Manager declared an Alert, the least serious of the NRC’s four emergency classifications, due to the explosion and fire affecting equipment within Switchgear Room 3A. Workers called the local fire department for assistance with the fire and with a worker injured by the explosion. This picture of the operations log notes some of the major events during the first 90 minutes of the event (Fig. 9).

Fig. 9 (Source: Nuclear Regulatory Commission)

UCS Perspective

The earlier commentary explained that two minor events occurred the month before the explosion and fire. In each of those events, carbon fiber debris from workers trimming material inside the switchgear room landed on electrical breakers and caused them to open unexpectedly. But those warnings were ignored and the practice continued until a more serious event occurred.

This HEAF event is also a warning. It defeated a barrier installed to prevent an event in one switchgear room from affecting equipment in the adjacent room. It had been assumed that a HEAF event could only affect components within 3 feet, yet the damaged door was more than 14 feet away. If an assumption now shown to be patently false does not lead to re-evaluations and necessary upgrades, shame on the nuclear industry and the NRC for not heeding this very clear, unambiguous warning.

How the NFL Sidelined Science—and Why It Matters

UCS Blog - The Equation (text only)

Photo: Nathan Rupert/CC BY-NC-ND 2.0 (Flickr)

Football was not just the most important weekend social activity in the New Jersey community where I grew up; it was woven into my family and community. My dad played football at his small-town Vermont high school along with his older brother, who went on to play college football at the University of Vermont. Hence, weekends at the Reed household were for screaming at TV sets or from real-life bleachers, and for theatrical displays of cheering that played out in falling off couches and crashing onto floors.

Football players have always carried a sort of badge of honor for playing America’s favorite sport, but only recently has that badge begun to carry even more weight, due to emerging knowledge about what even just a few years of executing football plays could mean for their quality of life down the line.

Even if you don’t closely follow football, you are likely aware that the NFL has been at the center of the news cycle in recent months, with players kneeling during the playing of the national anthem to protest racial injustice and police brutality. The players’ protest has drawn fire from a number of directions, including the White House. (Here at UCS, our staff joined with the campaign #scientiststakeaknee, supporting these players’ right to protest and the importance of their cause.)

But these protests aren’t the only way that the NFL has come into the spotlight. It’s increasingly clear that the repeated head injuries many football players experience can cause long-term damage—but the NFL has worked hard to bury these facts.

The NFL’s foray into CTE science

A powerful slide from Dr. Ann McKee’s presentation summarizing her findings on CTE at the Powering Precision Health Summit. The BU Brain Bank most recently analyzed 111 brains of former NFL players, finding that all but one had signs of CTE.

The NFL spent years, beginning in the 1990s, working to control the science behind the health consequences of repeated head injuries incurred while playing football. By doing so, the league infringed on its players’ right to know and ability to make informed decisions about their health and career paths. And as the NFL failed to do its due diligence to conduct honest science on the game, players were consistently told to return to play after collisions, only to be left with debilitating health issues and devastated family members.

The NFL’s actions closely track those of the tobacco and fossil fuel industries, and include examples of just about every tactic in our “Disinformation Playbook,” as documented in Steve Fainaru and Mark Fainaru-Wada’s 2013 book, League of Denial. Just a few uses of the plays include:

The Fake: The NFL commissioned a Mild Traumatic Brain Injury (MTBI) committee that published a series of studies in the journal Neurosurgery in the early 2000s, which downplayed the risks of repeated head injuries by cherry-picking data and using incomplete data on the number of concussions that were reported during games.

The Blitz: Bennet Omalu, the pathologist who first discovered CTE in an NFL player, faced opposition from the NFL, which called for the retraction of his article on the subject in 2005 and then called his second study “not appropriate science” and “purely speculative.” The second chair of the NFL’s brain injury committee, Ira Casson, later attacked and mocked Boston University neuropathologist Dr. Ann McKee for her work on CTE.

The Diversion: Ira Casson was given the nickname “Dr. No” by the authors of League of Denial because he willfully refused to accept that repeated head injury could lead to long-term brain damage in football players, even though he had spent years studying boxers and had concluded that boxing was associated with brain damage. In a 2010 Congressional hearing on football brain injuries, he held tight to his denial of the link, telling members of Congress that, “My position is that there is not enough valid, reliable or objective scientific evidence at present to determine whether or not repeat head impacts in professional football result in long term brain damage.”

The Fix: The NFL was able to manipulate processes in order to control the science on head injuries sustained while playing football. The editor-in-chief of Neurosurgery, the journal in which all of the MTBI committee’s studies were published, was Dr. Michael Apuzzo, a consultant for an NFL football team. The peer review process for this journal, unlike others, allowed papers to be published even if reviewers were harshly critical and rejected the science, as long as the objections were published in the commentary section of the paper. Despite harsh criticism from reviewers who were prominent experts in the field, such as Dr. Julian Bailes and Dr. Kevin Guskiewicz, the MTBI committee got away with publishing a series of papers downplaying the health risks of playing football.

In 2016, the NFL finally admitted that there was a link between playing football and the development of degenerative brain disorders like CTE after denying the risks for over a decade. The NFL has since changed some of its rules and has dedicated funding to help make the game safer for players, protections that President Trump argues are “ruining the game.” Trump’s blatant disregard of the evidence on the health impacts of playing football is beyond disappointing but not at all surprising, considering the way that this administration has treated science since day one.

From NFL player to science champion

Chris and I behind the scenes after a full day filming the PSA this August.

I have been fortunate to meet and spend time with former NFL player and science champion Chris Borland, who has turned his frustration with the league into support for independent science on the impacts of playing football on its child and adult players. Yesterday, he spoke at a scientific conference on the role of the media and others in communicating CTE science to the general public, so that we all have a better understanding of the risks of playing football, especially during youth. He also spoke about the emerging science on biomarkers that will help diagnose CTE in living players in the near future.

Here’s Chris’s take on why we should be standing up for science and exposing and fighting back against the disinformation playbook:

Chris and I are also featured on this week’s Got Science? podcast. Listen below:

In carrying out plays from the Playbook to sideline science, organizations like the NFL break a simple social responsibility to “do no harm.” Take a look at our brand new website detailing the case study of the NFL along with 19 other examples of ways in which companies or trade organizations have manipulated or suppressed science at our expense, and find out how you can help us stop the playbook.


Electric Vehicles, Batteries, Cobalt, and Rare Earth Metals 

UCS Blog - The Equation (text only)

Battery Pack for BMW-i3 Electric Vehicle (at Munich Trade-Show Electronica). Photo: RudolfSimon CC-BY-2.0 (Wikimedia)

The case for switching to electric vehicles (EVs) is nearly settled. They are cheaper to use, cut emissions, and offer a whisper-quiet ride. One of the last arguments available to the EV-hater club, which is largely composed of thinly veiled oil-industry front groups funded by the Koch brothers, surrounds the impacts of the materials used to make an EV’s battery pack.

Specifically, the use of lithium, cobalt, nickel, and other metals in an EV lithium-ion battery pack has raised red flags about the poor human rights and worker protection records in the countries where these materials are mined.

A lot of these warnings have been incorrectly categorized under “EVs and rare earth metals.” Though neither lithium nor cobalt is a rare earth metal, and rare earth metals aren’t nearly as rare as precious metals like gold, platinum, and palladium, there are important issues surrounding the production of lithium-ion batteries that must be acknowledged and addressed.

It is also important to note that these impacts are not happening just because of EVs. They are also being driven by the global demand for cell phones, laptop computers, and the multitude of other electronic devices that use lithium-ion batteries.

As EVs gain market share, they will be more responsible for the impacts from battery production. But today, EVs comprise a small fraction of global vehicle sales. So, concerns about lithium-ion batteries should be directed not just to the suppliers of EV battery packs, but also to Apple, Samsung, and the other companies that source lithium-ion batteries for their electronic goods.

Let’s also not forget that the supply chain for gasoline-powered vehicles has its fair share of issues, ranging from human rights violations like the use of child labor, to disastrous oil spills like Deepwater Horizon. But unlike gasoline-powered vehicles, EVs will be able to take advantage of emerging battery chemistries that don’t rely on cobalt or other materials that are linked to exploitative practices.

Cobalt and electric vehicle batteries

Cobalt, a bluish-gray metal found in the Earth’s crust, is one of today’s preferred components of the lithium-ion batteries that power laptops, cell phones, and EVs. Cobalt is mined all over the world, but 50 to 60 percent of the global supply comes from the Democratic Republic of Congo (DRC), which has a poor human rights track record. According to UNICEF and Amnesty International, around 40,000 children are involved in cobalt mining in the DRC, where they make only $1 to $2 USD per day. The DRC’s cobalt trade has been the target of criticism for nearly a decade, and the U.S. Labor Department lists Congolese cobalt as a product it has reason to think is produced by child labor. More troubling, cobalt demand has tripled in the past five years and is projected to at least double again by 2020.

What can be done about EV battery sourcing issues

First, companies should be held accountable for enacting and enforcing policies to only use ethically-sourced materials. Some companies are off to a good start. Apple has pledged to end its reliance on mining altogether, and one day make its products from only renewable resources or recycled materials. Other tech giants like HP, Samsung, and Sony joined an effort called the “Responsible Cobalt Initiative.” Members of the initiative pledged to follow global guidelines for mining supply chains, which call for companies to trace how cobalt is being extracted, transported, manufactured and sold.

On the EV side of things, Tesla has committed to sourcing materials only from North America for its new battery production facility, the Gigafactory.  In 2015, Tesla secured two contracts with mining companies to explore lithium deposits in northern Nevada and Mexico, though Tesla still relies on cobalt that may have been sourced from the DRC.

Both Ford and GM get their EV batteries from LG Chem, which has said it has stopped using DRC-sourced cobalt and that neither Ford's nor GM's batteries rely on it. Some of LG Chem's practices and statements, however, have been called into question by the Washington Post.

Second, recycling can reduce the need to mine new sources of battery materials or to rely on materials from countries with poor worker protections. Cobalt, for example (unlike gasoline), is fully recyclable, and roughly 15 percent of U.S. cobalt consumption today comes from recycled scrap.

Companies like Umicore are in the cobalt recycling business and have demonstrated that there is a business model for recycling cobalt that can help reduce demand for DRC-mined cobalt.

Third, battery technology is continuing to improve. The multitude of battery applications has generated a strong financial incentive for researchers to find the next greatest battery chemistry, and some of the most promising next-gen battery types don’t rely on cobalt at all.

Lithium-titanate and lithium-iron-phosphate batteries, for example, are gaining importance in EV powertrain applications and don't need cobalt. Other chemistries that rely on magnesium, sodium, or lithium-sulfur are also gaining traction, as they have the potential to beat lithium-ion batteries on energy density and cost. Battery research has seen a big shift in recent years: fuel cells and lithium-ion battery materials once accounted for nearly half of the presentations at the Battery Symposium in Japan, but since 2012 those topics have been supplanted by presentations on solid-state, lithium-air, and non-lithium batteries.

Overall, the human rights issues in the lithium-ion battery supply chain cannot be ignored. At the same time, they shouldn't be used by the oil industry and its allies as a rallying cry to dismantle EV policy support, or as a reason to stop the growth of the EV industry. Again, it's not just EVs that are at issue here. All manufacturers of electronic devices need to find better sources for their batteries, and it is their responsibility to source materials from places with real worker protections. It's also the responsibility of our government to ensure that Americans can buy products that are ethically and sustainably sourced.

Post-Harvey, the Arkema Disaster Reveals Chemical Safety Risks Were Preventable

UCS Blog - The Equation (text only) -

Halloween is right around the corner, but the Environmental Protection Agency (EPA) has been a perpetual nightmare for public safety since Administrator Scott Pruitt arrived, sending long-awaited chemical safety amendments to an early grave this year. The Risk Management Plan (RMP) is a vital EPA chemical safety rule that "requires certain facilities to develop plans that identify potential effects of a chemical accident, and take certain actions to prevent harm to the public or the environment"—but delays to the effective date of the long-awaited updates are putting communities, workers, and first responders directly in harm's way, as we witnessed in the events following Hurricane Harvey.

Last Friday, the Union of Concerned Scientists released a report documenting the harm caused to communities by RMP-related incidents in the wake of Hurricane Harvey. The report serves as a supporting document in a UCS lawsuit challenging the EPA's decision to delay the long-overdue RMP update. Our objective was to show how, had the improvements to the RMP gone into effect as planned, damage from chemical plants during Hurricane Harvey could have been diminished. We provided an in-depth analysis of the steps the Arkema facility could have taken with the proposed changes in effect, and estimated the toxic burden to which the surrounding community was exposed.

Additionally, we examined other incidents (spills, releases, explosions) that occurred at chemical facilities during Hurricane Harvey, and emphasized the disproportionate impacts of chemical incidents on communities of color and low-income communities. I have taken "toxic tours" of Houston's east side with our partners at Texas Environmental Justice Advocacy Services (t.e.j.a.s.), and am all too aware that disparities in the distribution of RMP facilities and the concentration of toxic pollutants exist and go mostly unnoticed by the unaffected public.

Could the Arkema disaster have been avoided?

In late August of this year, Hurricane Harvey unleashed massive quantities of rain on Houston and surrounding towns, flooding streets, homes—and chemical facilities. As a result, the Arkema plant in Crosby, Texas, a town 25 miles northeast of Houston, was inundated with floodwater and left without power or working generators. The refrigerators needed to keep volatile organic peroxides cool were therefore not operational, which ultimately led to the explosion of 500,000 pounds of the unstable chemicals. Though we cannot say Arkema would have avoided an incident entirely, we do know the damage inflicted on first responders and nearby residents could have been mitigated by implementing the revisions to the chemical safety rule, which would have required:

  • Coordination with local emergency response agencies—RMP has standards that would require industries to coordinate and provide information to emergency responders. This would have likely prevented the Crosby first responders from being exposed to noxious fumes—at the perimeter of the ineffective 1.5-mile evacuation zone, I might add—after the Arkema explosion. A group of injured first responders are now suing the plant for failing to properly warn the responders of the dangers they faced.
  • Analysis of safer technologies and chemicals—The facility would have had to begin research into safer technologies and chemicals for use in their facility, including less volatile chemicals, or safer storing and cooling techniques to preempt an explosion.
  • Root cause analyses—A thorough investigation of past incidents to prevent similar future incidents would have been required of RMP facilities.

Communities are at risk

Real life isn’t an action film, where explosions abound and the dogged hero emerges from a fiery building unscathed, nary a casualty to be found.  In real life, the consequences of a chemical explosion, leak, or spill are often dangerous and deadly. We have ways to hold chemical facilities accountable for taking necessary preventative measures, but we must urge the EPA to implement the changes to the proposed RMP rule in order to do so.

Voting Technology Needs an Upgrade: Here’s What Congress Can Do

UCS Blog - The Equation (text only) -

Voting systems throughout the United States are vulnerable to corruption in a variety of ways, and the federal government has an obligation to protect the integrity of the electoral process. At a recent meeting of the National Academies of Sciences, Engineering and Medicine’s Committee on the Future of Voting, the Department of Homeland Security’s Robert Kolasky put it bluntly: “It’s not a fair fight to pit Orange County (California) against the Russians.”

While the intelligence community has not confirmed that the hackers working on behalf of the Russian government to undermine the 2016 election were successful at tampering with actual vote tallies, they certainly succeeded at shaking our confidence in the electoral process, which over time could undermine faith in democracy.

The management of statewide eligible voter lists is a particularly challenging but crucial responsibility. On the one hand, data entry errors, duplicate records and “live” records for deceased voters invite voter fraud and inaccuracies in voting data. On the other hand, overly broad purging of voter lists can result in the exclusion of eligible voters from the rolls.

Two problems with voter list maintenance

Validation of voter eligibility is typically done through “matching” of individuals on voter registration lists with other databases using unique combinations of traits of eligible individuals (birthdays and names, etc.). This process is error-prone in two ways. First, data may not be entered identically for individuals across databases (misspelled names, missing data, etc.), so that individuals fail to get matched and are excluded (false negatives). Second, the computer algorithms used to identify and match records may be imprecise, such that they match the wrong people (false positives) or exclude people from voter lists based on faulty matching techniques (false negatives).

Both assumptions, that the matched databases contain accurate data and that the algorithm for identifying individual matches actually does so, have proven problematic. For example, research has shown that the surprisingly high probability that two people in a group of a given size share the same birthday can largely account for inflated estimates of double registrations and double voting. That is, an algorithm that matches on birthday, and possibly last name, is a poor method for identifying voters, because lots of people share those traits.
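
To see the scale of the problem, consider the classic birthday calculation: the probability that at least two people in a group share a birthday climbs toward certainty surprisingly fast. Here is a minimal sketch in Python (illustrative only; it ignores leap years and seasonal birth patterns):

```python
from math import prod

def p_shared_birthday(n):
    """Probability that at least two people in a group of n share a birthday."""
    return 1 - prod((365 - i) / 365 for i in range(n))

for n in (23, 50, 180):
    print(n, round(p_shared_birthday(n), 3))
# 23 -> 0.507, 50 -> 0.97, 180 -> 1.0 (to three decimal places)
```

Scale that up to statewide lists with millions of records, and birthday-plus-surname collisions between distinct people become routine rather than exceptional.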

But even that poor algorithm assumes the underlying data is accurate, when it often is not. Even databases containing precisely individualized identifiers, like Social Security numbers, include enough error to be inappropriate for matching. Indeed, the Social Security Administration accidentally declares over 10,000 people dead every year, and attempts to match voter lists using the last four SSN digits have produced error rates above 20 percent, such that the SSA Inspector General has warned against their use for this purpose.

Sloppy matching algorithms that do not attempt to correct for such data inaccuracies are prone to excluding high numbers of eligible voters. For example, the Crosscheck system, developed by the president's Electoral "Integrity" Commission Vice Chair Kris Kobach, has produced error rates as high as 17 percent in Chesterfield County, VA, prompting the county to abandon the software.

Two solutions that improve voter list management

The solution to these problems is thus twofold: improving the quality of matching algorithms in order to create precise identifiers and overcome data inaccuracies, and reducing the probability of ineligible voters or inaccurate data getting on the voter list to begin with.

Recent advances in algorithmic design have shown that using multiple matching criteria with recoded data to account for common data entry inaccuracies can yield matches that are 99% accurate. For example, Stephen Ansolabehere and Eitan D. Hersh have demonstrated that using three-match combinations of Address (A), Date of Birth (D), Gender (G) and Name (N), or ADGN, is extremely effective in successful matching (and helps explain how Facebook knows everything about you).
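
As an illustration of the general idea (a hypothetical sketch, not Ansolabehere and Hersh's actual implementation), matching of this kind recodes fields to absorb common entry errors and then requires agreement on at least three of the four ADGN traits:

```python
# Hypothetical ADGN-style matcher: recode fields to absorb common data entry
# errors, then require agreement on at least 3 of the 4 traits.
FIELDS = ("address", "dob", "gender", "name")

def recode(record):
    # Crude normalization: lowercase and strip punctuation and whitespace.
    return {k: "".join(ch for ch in str(v).lower() if ch.isalnum())
            for k, v in record.items()}

def adgn_match(rec_a, rec_b, required=3):
    a, b = recode(rec_a), recode(rec_b)
    return sum(a[f] == b[f] for f in FIELDS) >= required

voter = {"address": "12 Elm St.", "dob": "1980-03-14", "gender": "F", "name": "Ana Diaz"}
dmv   = {"address": "12 ELM ST",  "dob": "1980-03-14", "gender": "F", "name": "Ana Díaz"}
print(adgn_match(voter, dmv))  # True: address, dob, and gender still agree
```

Requiring several independent traits to agree is what drives the error rate down: a shared birthday alone is common, but a shared birthday, address, and gender pointing to two different people is rare.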

For securing and maintaining precise voter list data from the start, the implementation of automatic voter registration, or AVR, is proving increasingly effective and popular. By automatically registering all eligible adults (unless they decline) when people update their own data through government agencies, and by transferring that data electronically on a regular basis, the process “boosts registration rates, cleans up the rolls, makes voting more convenient, and reduces the potential for voter fraud, all while lowering costs” according to the Brennan Center for Justice, which advocates AVR.

Ten states and the District of Columbia have already approved AVR, and 32 states have introduced AVR proposals this year. It is a bipartisan solution, with states as different as California and Alaska having already adopted the practice.

Rhode Island's Democratic Secretary of State, Nellie Gorbea, has stated that "Having clean voter lists is critical to preserving the integrity of our elections, which is why I made enacting Automatic Voter Registration a priority."

Republican Governor of Illinois Bruce Rauner, on signing his state’s AVR law, explained that “This is good bipartisan legislation and it addresses the fundamental fact that the right to vote is foundational for the rights of Americans in our democracy.”

Given the seriousness of the threat, and the fact that such effective solutions for voter list management have already been developed, Congress should ensure that states have the capacity to implement these policies, which are among the most important infrastructure investments that we can make.

Memo to EPA Chief Pruitt: End Subsidies for Fossil Fuels, Not Renewables

UCS Blog - The Equation (text only) -

Economically and environmentally, it would be far better for the future of the planet to phase out fossil fuel subsidies and provide more incentives for clean energy. Photo: Union of Concerned Scientists

Environmental Protection Agency Administrator Scott Pruitt recently proposed eliminating federal tax credits for wind and solar power, arguing that they should “stand on their own and compete against coal and natural gas and other sources” as opposed to “being propped up by tax incentives and other types of credits….”

Stand on their own? Pruitt surely must be aware that fossil fuels have been feasting at the government trough for at least 100 years. Renewables, by comparison, have received support only since the mid-1990s and, until recently, have had to subsist on scraps.

Perhaps a review of the facts can set administrator Pruitt straight. There’s a strong case to be made that Congress should terminate subsidies for fossil fuels and extend them for renewables, not the other way around.

A century (or two) of subsidies

To promote domestic energy production, the federal government has been serving the oil and gas industry a smorgasbord of subsidies since the early days of the 20th century. Companies can deduct the cost of drilling wells, for example, as well as the cost of exploring for and developing oil shale deposits. They even get a domestic manufacturing deduction, which is intended to keep US industries from moving abroad, even though—by the very nature of their business—they can’t move overseas.

All told, from 1918 through 2009, the industry’s tax breaks and other subsidies amounted to an average of $4.86 billion annually (in 2010 dollars), according to a 2011 study by DBL Investors, a venture capital firm. Accounting for inflation, that would be $5.53 billion a year today.

The DBL study didn’t include coal due to the lack of data for subsidies going back to the early 1800s, but the federal government has lavished considerably more on the coal industry than on renewables. In 2008 alone, coal received between $3.2 billion and $5.4 billion in subsidies, according to a 2011 Harvard Medical School study in the Annals of the New York Academy of Sciences.

Meanwhile, wind and other renewable energy technologies, DBL found, averaged only $370 million a year in subsidies between 1994 and 2009, the equivalent of $421 million a year today. The 2009 economic stimulus package did provide $21 billion for renewables, but that support barely began to level the playing field that has tilted in favor of oil and gas for 100 years and coal for more than 200.

A 2009 study by the Environmental Law Institute looked at US energy subsidies since the turn of this century. It found that between 2002 and 2008, the federal government gave fossil fuels six times more than what it gave solar, wind, and other renewables. Coal, natural gas, and oil benefited from $72.5 billion in subsidies (in 2007 dollars) over that seven-year period, while “traditional” renewable energy sources—mainly wind and solar—received only $12.2 billion. A pie chart from the report shows that 71 percent of federal subsidies went to coal, natural gas and oil, 17 percent—$16.8 billion—went to corn ethanol, and the remaining 12 percent went to traditional renewables.

A new study by Oil Change International brings us up to date. Published earlier this month, it found that federal subsidies in 2015 and 2016 averaged $10.9 billion a year for the oil and gas industry and $3.8 billion for the coal industry. By contrast, the wind industry's production tax credit, renewed by Congress in December 2015, amounted to $3.3 billion last year, according to an estimate by the congressional Joint Committee on Taxation (JCT).

Unlike the fossil fuel industry’s permanent subsidies, Congress has allowed the wind tax credit to expire six times in the last 20 years, and it is now set to decline incrementally until ending in 2020. Similarly, Congress fixed the solar industry’s investment tax credit at 30 percent of a project’s cost through 2019, but reduced it to 10 percent for commercial projects and zeroed it out for residences by the end of 2021. The JCT estimates that the solar credit amounted to a $2.4-billion tax break last year. Totaling it up, fossil fuels—at $14.7 billion—still received two-and-a-half times more in federal support than solar and wind in 2016.
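
A quick tally of those 2016 figures makes the comparison explicit (a back-of-the-envelope sketch using only the estimates cited above):

```python
# All values in billions of dollars, from the estimates cited in the text.
fossil = 10.9 + 3.8      # oil and gas plus coal (Oil Change International)
wind_solar = 3.3 + 2.4   # wind production tax credit plus solar investment
                         # tax credit (JCT estimates)
print(f"fossil fuels: ${fossil:.1f}B, wind and solar: ${wind_solar:.1f}B")
print(f"ratio: {fossil / wind_solar:.1f}x")  # ~2.6, i.e., two-and-a-half times
```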

The costs of pollution

Subsidy numbers tell only part of the story. Besides a century or two of support, the federal government has allowed fossil fuel companies and electric utilities to “externalize” their costs of production and foist them on the public.

Although coal now generates only 30 percent of US electricity, down from 50 percent in 2008, it is still responsible for two-thirds of the electric utility sector's carbon emissions and is a leading source of toxic pollutants linked to cancer; cardiovascular, respiratory, and neurological diseases; and premature death. The 2011 Harvard Medical School study cited above estimated coal's "life cycle" cost to the country—including its impact on miners, public health, the environment, and the climate—at $345 billion a year.

In July 2016, the federal government finally began regulating the more than 1,400 coal ash ponds across the country, which contain billions of gallons of heavy metals and other byproducts of burning coal. Coal ash, which has been leaching and spilling into local groundwater, wetlands, creeks, and rivers, can cause cancer, heart and lung disease, birth defects, and neurological damage in humans, and can devastate bird, fish, and frog populations.

But that was last year. Since taking office, the Trump administration has been working overtime to bolster coal, which can no longer compete economically with natural gas or renewables. Earlier this year, it rescinded a rule that would have protected waterways from mining waste, and a few months ago it moved to repeal another Obama-era measure that would have increased mineral royalties on federal lands. More recently, Energy Secretary Rick Perry asked the Federal Energy Regulatory Commission to ensure that coal plants can recover all of their costs, whether those plants are needed or not.

Natural gas burns more cleanly than coal, but its drilling sites, processing plants, and pipelines leak methane, and its production technique—hydraulic fracturing—can contaminate water supplies and trigger earthquakes. Currently the fuel is responsible for nearly a third of the electric utility sector’s carbon emissions. Meanwhile, the US transportation sector—whose oil-powered engine exhaust exacerbates asthma and likely causes other respiratory problems and heart disease—is now the nation’s largest carbon polluter, edging out the electric utility sector last year for the first time since the late 1970s.

Like the coal industry, the oil and gas industry has friends in high places. Thanks to friendly lawmakers and administrations, natural gas developers are exempt from key provisions of seven major environmental laws that protect air and water from toxic chemicals. Permitting them to flout these critical safeguards forces taxpayers to shoulder the cost of monitoring, remediation, and cleanup—if they happen at all.

The benefits of clean energy

Unlike fossil fuels, wind and solar energy do not emit toxic pollutants or greenhouse gases. They also are not subject to price volatility: wind gusts and solar rays are free, so more renewables would help stabilize energy prices. And they are becoming less expensive, more productive, and more reliable every year. According to a recent Department of Energy (DOE) report, power from new wind farms last year cost a third of wind’s price in 2010 and was cheaper than electricity from natural gas plants.

Perhaps the biggest bonus of transitioning to a clean energy system, however, is the fact that the benefits of improved air quality and climate change mitigation far outweigh the cost of implementation, according to a January 2016 DOE study. Conducted by researchers at the DOE’s Lawrence Berkeley National Laboratory and National Renewable Energy Laboratory, the study assessed the impact of standards in 29 states and the District of Columbia that require utilities to increase their use of renewables by a certain percentage by a specific year. Called renewable electricity (or portfolio) standards, they range from California and New York’s ambitious goals of 50 percent by 2030 to Wisconsin’s modest target of 10 percent by 2015.

It turns out that it cost utilities nationwide approximately $1 billion a year between 2010 and 2013—generally the equivalent of less than 2 percent of average statewide retail electricity rates—to comply with the state standards. On the benefit side of the equation, however, standards-spawned renewable technologies in 2013 alone generated $7.4 billion in public health and other societal benefits by reducing carbon dioxide, sulfur dioxide, nitrogen oxide, and particulate matter emissions. They also saved consumers as much as $1.2 billion by lowering wholesale electricity prices and as much as $3.7 billion by reducing natural gas prices, because more renewable energy on the grid cuts demand—and lowers the price—of natural gas and other power sources that have higher operating costs.

Take fossil fuels off the dole

If the initial rationale for subsidizing fossil fuels was to encourage their growth, that time has long since passed. The Center for American Progress (CAP), a liberal think tank, published a fact sheet in May 2016 identifying nine unnecessary oil and gas tax breaks that should be terminated. Repealing the subsidies, according to CAP, would save the US Treasury a minimum of $37.7 billion over the next 10 years.

An August 2016 report for the Council on Foreign Relations by Gilbert Metcalf, an economics professor at Tufts University, concluded that eliminating the three major federal tax incentives for oil and gas production would have a relatively small impact on production and consumption. The three provisions—deductions for “intangible” drilling costs, deductions for oil and gas deposit depletion, and deductions for domestic manufacturing—account for 90 percent of the cost of the subsidies. Ending these tax breaks, Metcalf says, would save the Treasury roughly $4 billion a year and would not appreciably raise oil and gas prices.

At the same time, the relatively new, burgeoning clean energy sector deserves federal support as it gains a foothold in the marketplace. Steve Clemmer, energy research director at the Union of Concerned Scientists, made the case in testimony before a House subcommittee last March that Congress should preserve wind and solar tax incentives beyond 2020.

“Until we can transition to national policies that provide more stable, long-term support for clean, low-carbon energy,” he said, “Congress should extend federal tax credits by at least five more years to maintain the sustained orderly growth of the industry and provide more parity and predictability for renewables in the tax code.” Clemmer also recommended new tax credits for investments in low- and zero-carbon technologies and energy storage technologies.

Despite the steady barrage of through-the-looking-glass statements by Trump administration officials, scientific and economic facts still matter. Administrator Pruitt would do well to examine them. Congress should, too, when it considers its tax overhaul bill, which is now being drafted behind closed doors. If they did, perhaps they would recognize that—economically and environmentally—it would be far better for the future of the planet to phase out fossil fuel subsidies and provide more incentives for clean energy.

Trashing Science in Government Grants Isn’t Normal: The Case of the EPA, John Konkus, and Climate Change

UCS Blog - The Equation (text only) -

There is now a political appointee of the Trump administration at the Environmental Protection Agency (EPA), John Konkus, reviewing grant solicitations and proposals in the public affairs office. Konkus is reportedly on the lookout for any reference to "climate change" in grant solicitations, in an attempt to eliminate this work from the agency's competitive programmatic grants. So, is this normal?

Grants and government

The US Federal Government gave out nearly $663 billion in grant funding in fiscal year 2017. Such funding pays for a wide range of state and local government services, such as health care, transportation, income security, education, job training, social services, community development, and environmental protection. Additionally, approximately $40 billion in grant funding from federal agencies funds scientific research annually, although the amount of funding for research and development from the federal government has declined in recent years.

Given the large amount of grant funding the federal government gives out annually, it is critical that the government have: 1) guidelines on what type of work a grant will fund, and 2) a process for determining who receives funding. While each grant is unique in its considerations of what makes a good candidate for funding, there is a relatively standard process through which government grants are advertised and funded. Most of this information can be found at www.grants.gov.

The grant solicitation

The first step in the process for funding of scientific grants is for a government agency to solicit proposals from interested parties (i.e., scientists working outside the government). The US Federal Government refers to these solicitations as “Funding Opportunity Announcements” or FOAs. These FOAs include information about what type of work the agency is expecting and whether or not the applicant would be eligible for funding. Thus, an FOA is extremely important for both the government and the applicant because it highlights the agency’s priorities for the funding, which also serves as a guideline for an applicant’s proposal.

The agency must first consider what type of work is currently needed in the US. In the case of science, the agency assesses what is currently unknown in our scientific knowledge on a given subject. Additionally, agencies will determine what special considerations are needed to make the grant work more impactful—these may include work that focuses on environmental justice or coal communities, for example. These considerations are typically discussed in the FOAs, and grants that include these special considerations in their proposals are typically ranked as more competitive relative to others that do not.

The FOA is reviewed by a panel of experts, which at most agencies consists of career officials from across the federal government. It isn't uncommon for political appointees to review an FOA. Political appointees generally broaden the FOA to make it more inclusive, asking questions such as, "Do you think we might want to consider adding a special consideration for communities recently affected by natural disasters?" What does seem to be uncommon is eliminating scientifically defensible language like the "double C word."

Reviewing grant proposals and awards

At many federal agencies, grants are reviewed by career agency staffers who have expertise for the grant program. However, in the cases of the National Science Foundation and the National Institutes of Health, a panel of non-federal scientists who have scientific expertise in the relevant research areas and scientific disciplines review submitted proposals. All proposed federal grants typically go through a first round review where they are screened for eligibility. If the proposal does not meet eligibility criteria, it is not reviewed further.

Those proposals that are eligible for funding are then reviewed by a panel of career agency staffers who are experts for the grant program’s work. The proposals are evaluated based on criteria specific to the grant – for some programmatic grants these criteria are dictated by statutory authority (e.g., grants in the brownfields program at the EPA). Based on these criteria, the panel scores each proposal. The proposals that receive higher scores are deemed more competitive relative to those with lower scores.

Depending on the amount of funding available for a grant program, the panel will recommend a percentage of the top scoring grants to be funded. The panel also takes into consideration other factors that may have been emphasized in the FOA (e.g., a community that was just ravaged by a natural disaster that is in greater need of funding relative to other communities).

The recommended set of proposals is then sent to the head of the program, who may be a political appointee of a presidential administration. The amount of information the appointee receives about the recommendations varies: sometimes abstracts of the proposals, sometimes just a list of the institutions or researchers recommended for funding. The appointee typically agrees with the recommendation of the expert panel. It would be uncommon for a political appointee to refuse to fund a proposal recommended for funding, as Konkus has done.

Ignoring science in grants will harm others

Is it uncommon for political appointees to have a say in the grant funding process? No. What is uncommon is for political appointees to politicize science in grant solicitation language, or to rescind proposals recommended by a panel of experts. As former EPA administrator Christine Todd Whitman put it, "We didn't do a political screening on every grant, because many of them were based on science, and political appointees don't have that kind of background."

As is common with this administration, we are seeing proposals that mention the “double C word” as a target. Konkus rescinded funding from EPA to two organizations that would have supported the deployment of clean cookstoves in the developing world—a simple solution to curb the impacts of climate change, but also to limit pollution that disproportionately affects women and children in these areas. Who knows what Konkus will rescind next, but it’s likely to have harmful effects on people. Maybe Konkus should leave decisions for funding up to the expert panels. They are categorized as experts for a reason.

The 5 Worst Plays From Industry’s Disinformation Playbook

UCS Blog - The Equation (text only) -

When I was 13, this is what I identified as the hardest thing about life then. My trust issues were just beginning to manifest themselves.

I have always had a healthy dose of curiosity and skepticism, and a desire to hold people accountable for their statements, built into my DNA. Usually, these played out in letter-writing campaigns. As a child, I sent a series of letters to the Daily News because I believed its campaign of "No More Schmutz!" was falling short: after rifling through the pages, I still had gray smudges on my fingers. Inky fingers are a far cry from misinformation about the dangers of fossil fuel pollution, but my general pursuit of the truth hasn't changed.

My newest project at the Center for Science and Democracy, released today, is a website that exposes the ways companies seek to hide the truth about the impacts of their products or actions on health and the environment. By calling out the plays in the "Disinformation Playbook," we hope to ensure that powerful companies and institutions are not engaging in behavior that obstructs the government's ability to keep us safe, and that, at the very least, they aren't doing us any harm. Unfortunately, as our case studies show, there are far too many examples of companies and trade organizations making intentional decisions to delay or obstruct science-based safeguards, putting our lives at risk.

In the Disinformation Playbook, we reveal the five tactics that some companies use to manipulate, attack, or create doubt about science and the scientists conducting important research that appears unfavorable to a company’s products or actions. We also feature twenty case studies that show how companies in a range of different industries have used these tactics in an effort to sideline science.

While not all companies engage in this irresponsible behavior, the companies and trade associations we highlight in the playbook have acted in legally questionable and ethically unsound ways. Here are five of the most egregious examples from the Playbook:

The Fake: Conducting counterfeit science

In an attempt to reduce litigation costs, Georgia-Pacific secretly ran a research program intended to raise doubts about the dangers of asbestos and stave off regulatory efforts to ban the chemical. The company knowingly used flawed methodologies and published its research in scientific journals without adequately disclosing the authors' conflicts of interest. By seeding the literature with counterfeit science, Georgia-Pacific created a life-threatening hazard, deceiving those who rely on science to understand the health risks of asbestos exposure. While asbestos use has been phased out in the US, it is not banned, and mesothelioma still claims the lives of thousands of people every year.

The Blitz: Harassing scientists

Rather than honestly dealing with its burgeoning concussion problem, the National Football League (NFL) went after the reputation of the first doctor to link the sport to a degenerative brain disease. What Dr. Bennet Omalu found in former player Mike Webster's brain—chronic traumatic encephalopathy (CTE), a progressive degenerative disease previously associated mainly with "punch drunk" boxers and victims of brain trauma—broke the NFL's concussion problem wide open. But instead of working with scientists and doctors to better understand the damaging effects of repeated concussions and how the league could improve the game to reduce head injuries, the NFL went after Omalu and the other scientists who subsequently worked on CTE. Just this year, Boston University scientists released a study of the brains of 111 deceased former NFL players, revealing that all but one showed signs of CTE.

The Diversion: Sowing uncertainty

The top lobbyist for the fossil fuel industry in the western United States secretly ran more than a dozen front groups in an attempt to undermine forward-looking policy on climate change and clean technologies. As a leaked 2014 presentation by Western States Petroleum Association (WSPA) President Catherine Reheis-Boyd revealed, WSPA's strategy was to use these fabricated organizations to falsely represent grassroots opposition to such policies. WSPA and its member companies oppose science-based climate policies that are critically needed to mitigate the damaging impacts of global warming.

The Screen: Borrowing credibility

Coca-Cola quietly funded a research institute out of the University of Colorado designed to persuade people to focus on exercise, not calorie intake, for weight loss. Emails between the company and the institute’s president suggested Coca-Cola helped pick the group’s leaders, create its mission statement, and design its website. A growing body of scientific evidence links sugar to a variety of negative health outcomes, including diabetes, cardiovascular disease, and high cholesterol. Coca-Cola’s actions overrode sensible transparency safeguards meant to ensure the independence of research—and allow consumers to understand the risks of sugar consumption for themselves.

The Fix: Manipulating government officials

After meeting with chlorpyrifos producer Dow Chemical Company and listening to its talking points, the EPA announced it would reverse its decision to ban the chemical, which is linked to neurological development issues in children. In 2016, the EPA itself had concluded that chlorpyrifos can have harmful effects on child brain development. The regulation of chlorpyrifos is also an environmental justice issue: Latino children in California are 91 percent more likely than white children to attend schools near areas of heavy pesticide use.

The secret to a good defense is a good offense

By arming ourselves with independent science, we can fight back against these familiar tactics. Granted, it's not an easy task, especially in the face of an administration that doesn't appear to value evidence, believing asbestos is 100% safe and claiming that climate change is a hoax. But I hear powerful stories every day of communities working together to crush corporate disinformation campaigns with the hard truth.

Just a couple of weeks ago, community members from Hoosick Falls, New York attended the hearing for "toxicologist-for-hire" Dr. Michael Dourson, the nominee to head the EPA's Office of Chemical Safety and Pollution Prevention. Senator Kirsten Gillibrand paid tribute to their bravery: "The water that they drink, the water they give their children, the water they cook in, the water they bathe in, is contaminated by PFOA. These families are so frightened." These individuals had a powerful story to tell about how DuPont downplayed the dangers of C8, or PFOA, the chemical byproduct of Teflon, and how Dourson's consulting firm, hired by DuPont, recommended a far weaker standard for the chemical than most scientists believe would have protected exposed residents from harm.

We hope that reading through the Playbook will encourage you to stand up for science and join us as we continue to challenge companies that attempt to sideline science, seeking business as usual at our expense. Become a science champion today and take a stand against corporate disinformation by asking your members of Congress not to do automakers' bidding by rolling back valuable progress on vehicle efficiency standards.

Stay curious, stay skeptical, and together we can work toward making corporate culture more honest and transparent by raising the political cost of using the disinformation playbook.

Up Close with America’s New Renewable Energy: Experiencing the Now-ness of Offshore Wind

UCS Blog - The Equation (text only) -

Block Island Wind Farm. Photo: E. Spanger-Siegfried

On a recent clear day, colleagues and I hopped on a boat for a look at our nation’s energy future. From right up close, offshore wind turbines make quite an impression. The biggest impression, though? That the future of energy… is actually right now.

Seeing is believing

The boat tour gave us a chance to be out on the water in the vicinity of the turbines of Rhode Island’s Block Island Wind Farm, the first offshore wind facility in the Americas. And what first stood out in that trip was… well, the wind turbines.

Block Island Wind Farm: Seeing is believing. Photo: J. Rogers.

Sight. Yes, these things are no shrinking violets. The mechanical engineer in me is drawn inexorably to the stats that define that heft, facts about the size of each of the five 6-megawatt turbines that make up the wind farm. About the lengths and heights: the towers (360 feet up from the ocean's surface), the foundations (90 feet down to the seabed, then 200 feet beyond), the blades (240 feet from hub to tip). About the weight: 1,500 tons for the foundation, plus 800 more for the tower, the nacelle (the box up top), and the blades.

The poet in me, if there were one, would wax lyrical (and poetical) about the visuals of the trip. I can at least say this: I know that beauty is in the eye of the beholder, but this beholder was quite taken with the towering figures just feet away as we motored by, and, as far as I could tell, my fellow travelers/beholders shared that sentiment.

The turbines don’t just look solid and mechanical and useful. They look like art—graceful, kinetic sculptures rising from the briny depths.

Beyond seeing, and seeing beyond

This tour wasn’t just about seeing, though. With a trip this exciting, you want to bring multiple senses to bear, and so we did.

Offshore wind power – Big, bold, beautiful, and ready for its close-up. Photo: E. Spanger-Siegfried.

Sound. Surprisingly, given the size of each installation, sound was not really a piece of the turbine-gazing experience. That is, I could maybe hear the blades turning, but only barely, over the noise of the ship’s engine and, particularly, over the sound from the very wind that was exciting those blade rotations.

Scent. The scent on the water was of the sea air, which I don’t normally get and which I’d welcome any day. When you get close enough to see the bolts and welds on the foundations and towers, though, these wind turbines smell like jobs.

The workmanship that went into these marvels is clear. Looking at each, you can easily imagine the workers, local, abroad, and in-between, that made this possible.

While many of the major components for this first-in-the-nation wind farm came from factories in established offshore wind farm markets, it was welders in Louisiana who gave birth to the foundation, using manufacturing skills wisely transferred from the offshore oil/gas industry. And the pieces all came together courtesy of ironworkers, electricians, and more in Rhode Island—some 300 local workers, says project developer Deepwater Wind.

Offshore wind admirers. Photo: J. Rogers.

Touch. Much as I would have enjoyed getting right on the turbines (and maybe even on top?), our passage by understandably left us a few tens of feet short of that. (Next time.)

But my fellow travelers and I were clearly touched by the experience of seeing such power right up close, could easily feel the transformative energy of each turbine.

Taste. That leaves one more sense. This trip wasn’t just about the taste of the salty air. It communicated the sense that what we got on the water on that recent fall day was just a taste of what’s to come. Maybe, then, we can couple that with a sixth sense: a sense of optimism.

Because it’s hard to stand there on the rising-falling deck, with the sun, the wind, and the sea spray, with those powerful sculptures so close by, and not get a sense that you’re witnessing something special. Something that goes beyond five turbines, big as they are, and beyond 30 megawatts and the 17,000 homes they can power. A sense that there’s much more beyond.

One of the local leaders from the electricians union (IBEW) captured this beyond idea well in talking about the project from the point of view of jobs, and the economic development potential of this technology:

Offshore wind: The future is present. Photo: J. Rogers.

“The real prize was not the five turbines… The real prize is what’s going to come.”

When it comes to offshore wind turbines, the what’s-to-come seems as big and powerful as each turbine multiplied many-fold. We seem poised for so much more, not just abroad, but right here at home.

A video of the Block Island project from proud project financier Citi can get you close to this particular project, and this cool 360 version of the turbines courtesy of the New York Times can get you even closer (just hold on tight!).

But for readers in this country, the fact that we’re poised for much more means that a chance to visit a wind farm in waters near you could be coming soon.

And if you do get there, use as many senses as you can. Offshore wind power is an experience worth getting close to, and opening up to.

The print version of Citi’s Block Island promotion includes the tagline “On a clear day you can see the future”. But getting up close to offshore wind turbines makes it clear that this particular energy technology is here and now. That it’s so ready for the big time. That yesterday’s energy future is today’s energy present.

So go ahead, on clear days, or cloudy, rain or shine: See, hear, smell, touch, and taste that energy-future-in-the-present. And celebrate.

Why NRC Nuclear Safety Inspections are Necessary: Indian Point

UCS Blog - All Things Nuclear (text only) -

This is the second in a series of commentaries about the vital role nuclear safety inspections conducted by the Nuclear Regulatory Commission (NRC) play in protecting the public. The initial commentary described how NRC inspectors discovered that limits on the maximum allowable control room air temperature at the Columbia Generating Station in Washington had been improperly relaxed by the plant’s owner. This commentary describes a more recent finding by NRC inspectors about an improper safety assessment of a leaking cooling water system pipe on Entergy’s Unit 3 reactor at Indian Point outside New York City.

Indian Point Unit 3: Leak Before Break

On February 3, 2017, the NRC issued Indian Point a Green finding for a violation of Appendix B to 10 CFR Part 50. Specifically, the owner failed to perform an adequate operability review per its procedures after workers discovered water leaking from a service water system pipe.

On April 27, 2016, workers found water leaking from the pipe downstream of the strainer for service water (SW) pump 31. As shown in Figure 1, SW pump 31 is one of six service water pumps located within the intake structure alongside the Hudson River. The six SW pumps are arranged in two sets of three. Figure 1 shows SW pumps 31, 32, and 33 aligned to provide water drawn from the Hudson River to essential (i.e., safety and emergency) components within Unit 3, while SW pumps 34, 35, and 36 are aligned to provide cooling water to non-essential equipment within Unit 3.

Fig. 1 (Source: Nuclear Regulatory Commission Plant Information Book) (click to enlarge)

Each SW pump is designed to deliver 6,000 gallons per minute of flow. During normal operation, one SW pump can handle the essential loads while two SW pumps are needed for the non-essential loads. Under accident conditions, two SW pumps are needed to cool the essential equipment. The onsite emergency diesel generators can power either set of three pumps, but not both simultaneously. If the set of SW pumps aligned to the essential equipment isn't getting the job done, workers can open or close valves and electrical breakers to reconfigure the second set of three SW pumps to the essential equipment loops.

Because river water can have stuff in it that could clog some of the coolers for essential equipment, each SW pump has a strainer that attempts to remove as much debris as possible from the water. The leak discovered on April 27, 2016, was in the piping between the discharge check valve for SW pump 31 and its strainer. An arrow points to this piping section in Figure 1. The strainers were installed in openings called pits in the thick concrete floor of the intake structure. Water from the leaking pipe flowed into the pit housing the strainer for SW pump 31.

The initial leak rate was modest—estimated to be about one-eighth of a gallon per minute. The leak was similar to other pinhole leaks that had occurred in the concrete-lined, carbon steel SW pipes. The owner began daily checks on the leakage and prepared an operability determination. Basically, “operability determinations” are used within the nuclear industry when safety equipment is found to be impaired or degraded. The operability determination for the service water pipe leak concluded that the impairment did not prevent the SW pumps from fulfilling their required safety function. The operability determination relied on a sump pump located at the bottom of the strainer pit transferring the leaking water out of the pit before the water flooded and submerged safety components.

The daily checks instituted by the owner had workers record the leak rate and assess whether it had significantly increased. But each check compared against the previous day's leak rate rather than the initial leak rate. By September 18, 2016, the leakage had steadily increased by a factor of 64, to 8 gallons per minute. Yet the daily incremental increases were small enough that workers never judged the overall increase to be significant.
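
A back-of-the-envelope calculation shows why the day-over-day comparisons missed the trend. The roughly 144-day span below is inferred from the dates in the text, and the steady compounding is an illustrative assumption, not a claim about the actual leak's behavior:

```python
# If a leak grows 64-fold over ~144 days at a steady compounding rate,
# each daily increase is tiny even though the cumulative change is huge.
days = 144
daily_factor = 64 ** (1 / days)    # ~1.029: roughly a 3 percent rise per day
rate = 0.125                       # initial leak rate, gallons per minute
for _ in range(days):
    rate *= daily_factor           # each day-over-day bump looks negligible
print(f"{rate:.1f} gpm")           # ~8.0 gpm, the rate observed on September 18
```

Comparing each day's rate against the initial rate, rather than against the previous day's, would have flagged the trend long before September.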

The daily check on October 15, 2016, found the pump room flooded to a depth of several inches. The leak rate was now estimated at 20 gallons per minute. And the floor drain in the strainer pit was clogged (ironic, huh?), impairing the ability of its sump pump to remove the water. Workers placed temporary sump pumps in the room to remove the flood water and cope with the insignificantly higher leak rate. On October 17, workers installed a clamp on the pipe that reduced the leakage to less than one gallon per minute.

The operability determination was revised in response to concerns expressed by the NRC inspectors. Still, the inspectors were not satisfied with the revision. It continued to rely on the strainer pit sump pump removing the leaking water, even though that sump pump was not powered from the emergency diesel generator and thus would not remove water should offsite power become unavailable. Step 5.6.4 of procedure EN-OP-14, "Operability Determination Process," stated: "If the Operability is based on the use or availability of other equipment, it must be verified that the equipment is capable of performing the function utilized in the evaluation."

The operability determination explicitly stated that no compensatory measures or operator manual actions were needed to handle the leak, but the situation clearly required both compensatory measures and operator manual actions.

The NRC inspectors found additional deficiencies in the revised operability determination. They calculated that a 20-gallon-per-minute leak, coupled with an unavailable strainer pit sump pump, would flood the room to a depth of three feet in three hours. There are no flood alarms in the room, and the daily checks might not have detected flooding until the level reached three feet. At that level, water would submerge and potentially disable the vacuum breakers for the SW pumps. Proper vacuum breaker operation could be needed to successfully restart the SW pumps.

The NRC inspectors calculated that the 20 gallon per minute leak rate without remediation would flood the room to the level of the control cabinets for the strainers in 10 hours. The submerged control cabinets could disable the strainers, leading to blocked cooling water flow to essential equipment.

The NRC inspectors calculated that the 20 gallon per minute leak rate without remediation would completely fill the room in about 29 hours, or only slightly longer than the daily check interval.
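
A rough consistency check of those three estimates is possible. The room's floor area isn't given in the text, so the sketch below back-derives it from the "three feet in three hours" figure; treat that area as an assumption:

```python
# Back-derive the implied floor area from "3 feet in 3 hours at 20 gpm",
# then see what depth the same leak produces after 10 and 29 hours.
GAL_PER_FT3 = 7.48                       # gallons per cubic foot
leak_gpm = 20

gallons_3h = leak_gpm * 60 * 3           # 3,600 gallons in the first 3 hours
area_ft2 = gallons_3h / GAL_PER_FT3 / 3  # ~160 sq ft implied floor area

def depth_ft(hours):
    return leak_gpm * 60 * hours / GAL_PER_FT3 / area_ft2

print(f"implied floor area: {area_ft2:.0f} sq ft")
print(f"after 10 h: {depth_ft(10):.0f} ft")   # ~10 ft (strainer control cabinets)
print(f"after 29 h: {depth_ft(29):.0f} ft")   # ~29 ft (room completely filled)
```

Under this simple constant-rate, constant-area model, the inspectors' three figures are mutually consistent.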

Flooding to depths of 3 feet, 10 feet, and the room's ceiling would affect all six SW pumps. Thus, the flooding represented a common-mode threat that could disable the entire service water system. In turn, all the safety equipment shown in Figure 2 that would no longer be cooled by the disabled service water system could also be disabled. The NRC estimated the flooding risk at about 5×10⁻⁶ per reactor-year, solidly in the Green finding band.

Fig. 2 (Source: Nuclear Regulatory Commission Plant Information Book) (click to enlarge)

UCS Perspective

“Leak before break” is a longstanding nuclear safety philosophy. Books have been written about it (well, at least one report has been written and may even have been read.)  The NRC’s approval of a leak before break analysis can allow the owner of an existing nuclear power reactor to remove pipe whip restraints and jet impingement barriers. Such hardware guarded against the sudden rupture of a pipe filled with high pressure fluid from damaging safety equipment in the area. The leak before break analyses can provide the NRC with sufficient confidence that piping degradation will be detected by observed leakage with remedial actions taken before the pipe fails catastrophically. More than a decade ago, the NRC issued a Knowledge Management document on the leak before break philosophy and acceptable methods of analyzing, monitoring, and responding to piping degradation.

This incident at Indian Point illustrated an equally longstanding nuclear safety practice of “leak before break.” In this case, the leak was indeed followed by a break. But the break was not the failure of the piping but failure of the owner to comply with federal safety regulations. Pipe breaks are bad. Regulation breaks are bad. Deciding which is worse is like trying to decide which eye one wants to be poked in. None is far better than either.

As with the prior Columbia Generating Station case study, this Indian Point case study illustrates the vital role that the NRC's enforcement efforts play in nuclear safety. Even after NRC inspectors voiced clear concerns about the improperly evaluated service water system pipe leak, Entergy failed to properly evaluate the situation, thus violating federal safety regulations. To be fair to Entergy, the company was probably doing its best, but in recent years, Entergy's best has been far below nuclear industry average performance levels.

The NRC’s Reactor Oversight Process (ROP) is the public’s best protection against hazards caused by aging nuclear power reactors, shrinking maintenance budgets, emerging sabotage threats, and Entergy. Replacing the NRC’s engineering inspections with self-assessments by Entergy would lessen the effectiveness of that protective shield.

The NRC must continue to protect the public to the best of its ability. Delegating safety checks to owners like Entergy is inconsistent with that important mission.

Xi’s China

UCS Blog - All Things Nuclear (text only) -

What’s happening in China? The US consensus seems to be that President Xi Jinping is upending the place. Yet, midway through an expected ten-year term, China’s communist party general secretary delivered a report to the 19th Party Congress that reiterated all the language, ideas, and policies the Chinese communists have used to govern the country since the mid-1980s. The most remarkable thing about Xi’s China is that it hasn’t changed at all.

Chinese Communist Party General Secretary Xi Jinping addresses the 19th Party Congress

China remains a socialist country. Xi’s not only proud of that, he’s confident that continuing to follow the socialist road will put China on the right side of history. What makes his tenure at the top seem different is that he’s unapologetically elevated ideology over policy. In Chairman Mao’s parlance, Xi is a little more red than expert.

But that doesn’t mean he’s changed Chinese policy. Internationally, Xi reported China remains open to the outside world. Domestically, his government remains committed to economic and political reform. It may not be the kind of openness or the type of reform US officials hoped for, but US expectations for China have always been based on a different view of history. Even after the Chinese leadership used lethal military force to suppress nationwide public demonstrations in June of 1989, most US observers still believed that international engagement, market economics and the rise of the Chinese middle class would eventually lead to the fall of the Chinese Communist Party (CCP) and the emergence of a multi-party Chinese democracy. Instead, if Xi’s report is to be believed, Chinese socialism has emerged from the crucible of Tiananmen Square stronger than it was before.

Continuity and Change in Communist China

The last time China really changed was when Mao died. Mao believed that global revolution was right around the corner and that China was ready for a rapid transformation to communism. The leaders who inherited the party in Mao’s wake, especially Deng Xiaoping, saw the world and China’s place within it very differently. At home, China was only in the beginning stages of a transformation to socialism that would take a very long time. And as the party set about engineering that incremental transformation, China would need to engage the world as it was rather than imagining it could remake it. Deng told his comrades they needed to be humble as they worked to fulfill their Chinese socialist dream of modernizing the country and restoring Chinese influence in the world.

Xi Jinping’s report does not stray too far from that advice. China’s made a lot of progress since Deng died twenty years ago, but it is still, according to Xi, in the early stages of a long-term transformation to socialism. China’s progress may have elevated its position in the world, and given China a greater say in international governance, but there is nothing in Xi’s report about China leading a movement to upend the global status quo.

Xi does believe that Chinese socialism can set an example for the rest of the world to follow, and that more active Chinese participation can help transform the international order. As a committed Marxist, Xi should believe an eventual transition to a socialist global order is inevitable. But in the short term, Xi’s China appears squarely focused on the fifth of humanity that lives within its borders, where good governance is at a crossroads, crippled by endemic corruption rooted in the attitudes and behavior of party cadres who’ve lost the faith. Xi’s project, if you take his party congress report at face value, seems to be to save Chinese socialism and consolidate its gains, not to change it.

Implications for the United States

Is a consolidated and internationally persuasive Chinese socialism a threat to the United States? Unfortunately, that’s a question many US analysts and officials are no longer inclined to address. During the Maoist era, when China was “more red than expert,” there was greater US interest in the content of Chinese socialism. Today, US observers tend to view the CCP leadership’s repeated recitations of its socialist principles and practices as propaganda masking personal or national ambitions.

US commentaries on Xi’s speech reflect this. Most of them interpret Xi’s campaign against corruption as a personal quest to consolidate power rather than a campaign to save Chinese socialism. Instead of taking Xi and his recent predecessors at their word and seeing the principal aim of their post-1980s efforts as the achievement of a “moderate level of prosperity” for China’s 1.4 billion, many US observers see this as an attempt to hide the CCP’s real aim, which they believe is kicking the United States out of Asia and supplanting US dominance of the region. For Americans, the contest between the United States and China is perceived as an historic struggle between rising and falling national powers rather than competing ideologies.

If Xi is a budding dictator leading a nationalist political organization focused on replacing the United States at the top of a global hierarchy, then US policy makers should be concerned. But what if the Chinese dream articulated in Xi’s report to the 19th Party Congress is a fair representation of the CCP’s ambitions? Should the United States be alarmed? The answer is not obvious, and the question deserves greater consideration.

Michael Dourson: A Toxic Choice for Our Health and Safety

UCS Blog - The Equation (text only) -

When it comes to conflicts of interest, few nominations can top that of Michael Dourson to lead the EPA’s Office of Chemical Safety and Pollution Prevention (OCSPP). Time after time, Dourson has worked for industries and against the public interest, actively downplaying the risks of a series of chemicals and pushing for less stringent policies that threaten our safety.

In short, Dourson pushes counterfeit science, is unfit to protect us from dangerous chemicals, and is a toxic choice for our health and safety.

A litany of toxic decisions

Consider the 2014 Freedom Industries chemical spill into the Elk River in Charleston, West Virginia, which contaminated drinking water supplies for 300,000 people with MCHM and PPH—two chemicals used to clean coal.

After the spill, Dourson’s company, TERA, was hired by the state to put together a health effects panel, which Dourson chaired. He did not disclose, however, that he had also been hired by the very companies that manufactured those chemicals, a fact that came out only later, when a reporter questioned him.

Dourson was also involved in helping set West Virginia’s “safe” drinking water level for perfluorooctanoic acid (PFOA, or “C8”), a chemical used to manufacture Teflon. That level was 2,000 times higher than the standard set by the EPA.

In 2015, Dourson provided testimony for DuPont in the case of a woman who alleged that her kidney cancer was linked to drinking PFOA-contaminated water from the company’s Parkersburg, WV, plant. Just this year, DuPont settled for $670 million with plaintiffs from the Ohio Valley who had been exposed to PFOA from the same plant, after the independent C8 Science Panel of epidemiologists found “probable links” between PFOA and kidney and testicular cancer, as well as high blood pressure, thyroid disease, and pregnancy-induced hypertension.

From 2007 to 2014, Dourson and TERA also worked closely with Michael Honeycutt and the Texas Commission on Environmental Quality to loosen two-thirds of the already weak protections for 45 different chemicals.

The list of toxic decisions made by Dourson and his team goes on and clearly makes him an unacceptable choice for a leadership role at the agency charged with protecting public health and the environment.

A worst-case choice

During Dourson’s hearing before the Senate Committee on Environment and Public Works (EPW), his answers to questions about recusing himself from decisions regarding chemicals on which TERA has worked closely with industry were cagey at best. It appears that because much of his work was done through the University of Cincinnati, he will not be expected to recuse himself from decisions about chemicals that industry paid his research team to assess in the past. His ethics agreement confirms this. So much for the Trump administration’s draining of the swamp.

If Dourson is confirmed, I have no doubt that his appointment will be repeatedly cited as a worst-case scenario of the revolving door between industry representatives and the government.

His work over the past few decades has been destructive enough, even from a position with little power to help the chemical industry directly skirt tough regulations. Putting him in charge of the office that is supposed to protect the public from toxins would be a grave mistake with national ramifications.

In the coming years, Dourson’s office will be making decisions about safety thresholds and key regulatory actions on chlorpyrifos, neonicotinoids, flame retardants, asbestos, and the other priority chemicals under the Toxic Substances Control Act. There is no room for error, and unfortunately, error is likely with someone like Dourson in charge.

We join with many other members of the scientific community to oppose Michael Dourson for Assistant Administrator of OCSPP and to ask senators to vote no in upcoming committee and confirmation votes.

Pruitt Steps Up His Attack on Biofuel Policies

UCS Blog - The Equation (text only) -

Molecular biologist Z. Lewis Liu (foreground) and technician Scott Weber add a new yeast strain to a corncob mix to test the yeast’s effectiveness in fermenting ethanol from plant sugars. Photo: U.S. Department of Agriculture (USDA) Agricultural Research Service (ARS) CC-BY-2.0 (Flickr)

It was just six weeks ago that I last posted on how Pruitt’s EPA Undermines Cellulosic Biofuels and Transparency in Government, and I had hoped to shift my attention to other topics. But in late September, EPA Administrator Scott Pruitt stunned the biofuels world by releasing a rulemaking document (called a Notice of Data Availability, or NODA) suggesting he planned to cut the Renewable Fuel Standard (RFS) 2018 targets for advanced biofuels and biodiesel more deeply than previously indicated.

The NODA linked the changes to tariffs recently imposed on imports of soy-based biodiesel from Argentina and palm oil biodiesel from Indonesia, but citations in the NODA make it plain that the request comes directly from the oil refiners.

There are also rumors that EPA may count ethanol that is already being exported toward compliance with the standard, which would also reduce the obligations for refineries to blend ethanol or other biofuels into the fuel they sell.  Overall, these changes upend the basic understanding of the goals and requirements of the RFS and seem intended primarily to reduce costs for refineries.

UCS does not support the approach the NODA suggests. This might seem odd, since we have been arguing against the increased use of both corn ethanol and vegetable oil-based biodiesel for many years. But while there are plenty of problems with food-based biofuels, ignoring the law and considering only how to reduce costs for oil refiners is not the way to fix them.

UCS has opposed discretionary enlargement of biodiesel mandates beyond statutory levels

Some parts of the RFS offer more benefits than others.  Cellulosic biofuels can expand biofuel production with greater climate benefits and lower environmental costs than food-based biofuels like corn ethanol and vegetable oil biodiesel.  But cellulosic biofuels have not scaled up nearly as fast as the RFS envisioned, which left the EPA to decide whether to backfill the shortfall of cellulosic biofuels with other biofuels, especially biodiesel.
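
To make that backfill decision concrete, here is a minimal sketch using purely hypothetical volumes (not actual RFS figures). Because cellulosic gallons count toward the advanced biofuel mandate, when cellulosic production falls short the EPA can either lower the advanced mandate by the shortfall or hold it steady and let other advanced fuels, mostly biodiesel, fill the gap:

```python
# Hypothetical volumes in billion gallons; for illustration only.
statutory_cellulosic = 7.0   # assumed statutory cellulosic target
actual_cellulosic = 0.3      # assumed realized cellulosic production
statutory_advanced = 9.0     # assumed advanced biofuel mandate

shortfall = statutory_cellulosic - actual_cellulosic

# Option 1: reduce the advanced mandate by the cellulosic shortfall.
advanced_if_reduced = statutory_advanced - shortfall

# Option 2: hold the advanced mandate steady; other advanced fuels
# (mostly biodiesel) must backfill the remaining gap.
backfill_if_held = statutory_advanced - actual_cellulosic

print(f"Cellulosic shortfall: {shortfall:.1f} billion gallons")
print(f"Advanced mandate if reduced: {advanced_if_reduced:.1f} billion gallons")
print(f"Backfill needed if held steady: {backfill_if_held:.1f} billion gallons")
```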

Since 2012 we have argued that the EPA should not make discretionary enlargements to the advanced biofuel mandate to replace the shortfall of cellulosic fuels without careful consideration of potential unintended consequences.

Even without a discretionary enlargement, the minimum statutory levels of advanced biofuels that Congress specified in the RFS are ambitious and are drawing heavily on available sources of vegetable oil and waste oils (called feedstocks) to make biodiesel and renewable diesel. As Scott Irwin and Darrel Good at FarmDocDaily have explained, these fuels have for several years provided the marginal gallon of biofuel used to meet the mandates for conventional and advanced biofuel under the RFS.

Analysis we commissioned in 2015 and more recent analysis from the International Council on Clean Transportation suggest there are not sufficient feedstocks to support higher levels of production.  As I have explained in previous posts and a technical paper, the indirect effect of large expansion of biodiesel is to expand demand for palm oil, which has environmental harms that outweigh the benefits of offsetting diesel use in the U.S.

But we don’t support Pruitt’s effort to cut mandates below statutory levels

It might seem logical that if expanding mandates is a bad idea, then cutting them must be a good idea.  One can certainly make a logical argument that cutting the RFS advanced biofuel mandate will reduce demand for vegetable oil which could result in lower overall demand for palm oil and hence reduce deforestation in Southeast Asia. But there are two big problems with this approach.

First, what Pruitt is proposing is clearly inconsistent with the law.  Despite repeated claims that he will follow the law, the administrator’s actions are subverting the basic goal of the Renewable Fuel Standard, which is to expand the market for biofuels.

Until Congress updates it, the Renewable Fuel Standard is the law, and UCS’ input to the EPA has always focused on how EPA can maximize climate benefits consistent with the law.  We explained why exceeding the minimum statutory levels for food-based biofuels would have unintended consequences, but have not argued that EPA should go below these levels because this is clearly inconsistent with the law.

When corn prices spiked back in 2012, we supported a temporary RFS waiver, which was both consistent with the waiver provisions of the law and supported by the circumstances.  But today we are not facing a crisis in grain, vegetable oil or fuel markets.  Jonathan Coppess and Scott Irwin at FarmDocDaily have evaluated legal and economic grounds to waive the standard, and found no compelling case. Rather, we have a crisis in leadership – in the White House and at the EPA, where Administrator Pruitt is hostile to the basic goals of the agency he leads.  In that context, Pruitt’s proposed actions seem less like an opportunity to reduce the harms of food-based biofuels than a clear subversion of the basic goals of the law in the service of oil industry profits.

Second, political games are risky, and in the present context, climate advocates have a lot more to lose than to gain.  President Trump made repeated promises to protect ethanol, which stands in stark contrast to his position on protecting the United States from climate change.

Pruitt has been hinting, not very subtly, at a deal whereby the Trump administration promotes ethanol exports and treats ethanol favorably in upcoming fuel economy standards in exchange for the biofuel industry’s acquiescence to a weakened RFS. Trading the RFS for loopholes in fuel economy standards would be a bad deal for the future of the biofuels industry and a terrible deal for the environment.

A previous loophole added to fuel economy regulations to promote ethanol sales was a failure, which ultimately did much more to increase gasoline use by making cars less efficient than to expand ethanol use.  A long-term future for the biofuels industry depends on avoiding counterproductive outcomes and helping to cut oil use, and Pruitt is clearly not headed in this direction.  While there is some similarity between UCS’s specific guidance on biodiesel targets and Pruitt’s latest pivot on the RFS, we strongly object to his approach to cellulosic biofuels, his narrow vision for the RFS that focuses solely on current fuel prices, and the direction Pruitt is taking the EPA.

Blowing up current biofuels policy is not much of a plan

Some who support climate policy espouse the idea that the RFS is a failed policy, and that it is mostly just a giveaway to agricultural interests, so letting it collapse is not much of a loss. I disagree. The RFS is certainly shaped by the political power struggle between the oil industry and the biofuels industry and agriculture, but it also includes important environmental protections. For example, the RFS requires that future biofuel expansion come from advanced fuels that cut emissions at least 50% compared to gasoline. But with the environmental goals of the policy sidelined by Administrator Pruitt’s hostile takeover of the EPA, the current battle comes down to a stark choice between working with the oil industry to undermine the basic structure of the RFS, or keeping that framework intact until we have an opportunity to meaningfully improve it.

New laws generally build upon existing legal frameworks, and, if it survives, the RFS is likely to be the foundation on which future fuels policies are built.  If the RFS dies under the knife of the Pruitt EPA, the concessions the Trump administration offers the ethanol industry will not include the environmental protections in the RFS, however imperfect.  Moreover, the RFS and state fuel policies support one another, and if the RFS is weakened it will make the California and Oregon clean fuel policies more challenging and expensive.

UCS is not lending our support to Pruitt’s lawless approach to rewriting our vehicle and fuel policies.  Instead we will defend existing laws and build upon them once we have an administrator who understands that the core mission of the Environmental Protection Agency is to protect the environment rather than doing the bidding of the oil industry and other polluters.

New UCS Report Finds High Health Risks in Delaware Communities from Toxic Pollution

UCS Blog - The Equation (text only) -

Refineries, such as the Delaware City Refinery shown here, can emit toxic chemicals that can increase risks for cancer and respiratory disease.

For decades, residents of communities in Wilmington, Delaware’s industrial corridor have dealt with high levels of pollution. People in these communities, which have higher percentages of people of color and/or higher poverty levels than the Delaware average, are also grappling with health challenges that are linked to, or worsened by, exposure to pollution, such as strokes, heart disease, sudden infant death syndrome, and chronic childhood illnesses such as asthma, learning disabilities, and neurological diseases. These are some of Delaware’s environmental justice communities.

To assess the potential link between environmental pollution and health impacts in these communities, the Center for Science and Democracy at UCS collaborated with the Environmental Justice Health Alliance, Delaware Concerned Residents for Environmental Justice, Community Housing and Empowerment Connections, Inc., and Coming Clean, Inc. Using Environmental Protection Agency (EPA) data, we analyzed the following health and safety issues: the risk of cancer and the potential for respiratory illnesses stemming from toxic outdoor air pollution; the proximity of communities to industrial facilities that use large quantities of toxic, flammable, or explosive chemicals and pose a high risk of a major chemical release or catastrophic incident; the proximity of communities to industrial facilities with major pollution emissions; and the proximity of communities to contaminated waste sites listed in the EPA’s Brownfield and Superfund programs.

The seven communities analyzed—Belvedere, Cedar Heights, Dunleith, Marshallton, Newport, Oakmont, and Southbridge—were compared to Greenville, a predominantly White and affluent community located outside the industrial corridor, and to the population of Delaware overall. The findings from this analysis have been published in a new report titled Environmental Justice for Delaware: Mitigating Toxic Pollution in New Castle County Communities.

Proximity to major pollution sources and dangerous chemical facilities

TABLE 5. Sources of Chemical Hazards and Pollution in Environmental Justice Communities Compared with Greenville and Delaware Overall. Note: All facilities are located within 1 mile of communities. SOURCE: Environmental Protection Agency (EPA). No date. EPA state combined CSV download files. Online at www.epa.gov/enviro/epastate-combined-csv-download-files, accessed May 18, 2017.

Dunleith and Oakmont have several Brownfield sites and are in close proximity to facilities releasing significant quantities of toxic chemicals into the air. Southbridge has, within its boundaries or within a one-mile radius, two high-risk chemical facilities, 13 large pollution-emitting industrial facilities, four Superfund sites, and 48 Brownfield sites; it is home to more than half of all Brownfields in Delaware. Cedar Heights and Newport also have several large pollution-emitting facilities within one mile, as well as two EPA Superfund contaminated waste sites nearby.

Effects of toxic air pollution on cancer risks and the potential for respiratory illnesses

TABLE 2. Cancer Risks for Environmental Justice Communities Compared with Greenville and Delaware Overall. Note: Cancer risk is expressed as incidences of cancer per million people. For the respiratory hazard index, a value of 1 or less indicates a level of studied pollutants that the EPA has determined not to be a health concern, while a value greater than 1 indicates the potential for adverse respiratory health impacts, with concern increasing as the value increases. SOURCE: Environmental Protection Agency (EPA). 2015. 2015 National Air Toxics Assessment. Washington, DC. Online at www.epa.gov/national-air-toxics-assessment, accessed May 18, 2017.

Of the seven environmental justice communities studied, people in Marshallton face the highest cancer and respiratory health risks. Cancer and respiratory health risks there are 33 and 71 percent higher, respectively, than for the comparison community Greenville, and are 28 and 55 percent higher than for Delaware overall.

The communities of Dunleith, Oakmont, and Southbridge, whose residents are predominantly people of color and have a poverty rate approximately twice that of Delaware overall, have cancer risks 19 to 23 percent higher than for Greenville and 14 to 18 percent higher than for Delaware overall. Respiratory hazard in these three communities is 32 to 43 percent higher than for Greenville and 20 to 30 percent higher than for Delaware overall.

For Newport, Belvedere, and Cedar Heights, which have a substantial proportion of people of color and poverty rates above the Delaware average, cancer risks are 21, 15, and 12 percent higher than for Greenville, respectively, and are 16, 10, and 7 percent higher than for Delaware overall. Respiratory hazard in Newport, Belvedere, and Cedar Heights is 44, 30, and 24 percent higher than for Greenville, respectively, and 31, 18, and 13 percent higher than for Delaware overall.
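
For readers who want to reproduce these comparisons, the arithmetic is a simple relative difference. The sketch below uses hypothetical per-million risk values, not figures from the report, to show the calculation; the same formula applies to the respiratory hazard index values:

```python
# Minimal sketch of the "percent higher" comparisons used above.
# Cancer risk is expressed as incidences per million people; the
# values below are hypothetical, for illustration only.

def percent_higher(community_risk: float, reference_risk: float) -> float:
    """Relative difference of a community's risk over a reference, in percent."""
    return (community_risk / reference_risk - 1.0) * 100.0

greenville_risk = 30.0    # hypothetical reference value
marshallton_risk = 39.9   # hypothetical community value

print(f"{percent_higher(marshallton_risk, greenville_risk):.0f} percent higher")
# prints: 33 percent higher
```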

Children at risk

Kenneth Dryden of Delaware Concerned Residents for Environmental Justice, a former Southbridge resident, leads a tour of toxic facilities to teach scientists and community members about the dangers of local air pollution.

Children are especially vulnerable to the effects of toxic air pollution. Particularly concerning is that seven schools within one mile of Southbridge, with a total of more than 2,200 students, are in locations with substantially higher cancer risks and potential respiratory hazards than schools in all other communities in this study.

In addition to having daily exposure to toxic pollution in the air, children in these communities are at risk of being exposed to toxic chemicals accidentally released from hazardous chemical facilities in or near their communities. For example, the John G. Leach School and Harry O. Eisenberg Elementary School near Dunleith, with a total of 661 students, are located within one mile of a high-risk chemical facility.

Achieving environmental justice for vulnerable communities

Based on multiple EPA databases, the findings of this study indicate that people in the seven communities along the Wilmington industrial corridor face a substantial potential cumulative health risk from (1) exposure to toxic air pollution, (2) proximity to polluting industrial facilities and hazardous chemical facilities, and (3) proximity to contaminated waste sites. These health risks are substantially greater than those faced by residents of a wealthier, predominantly White Delaware community and by Delaware as a whole.

This research provides scientific support for what neighbors in these communities already know—that they’re unfairly facing higher health risks. We need to listen to communities and the facts and enact and enforce the rules to protect their health and safety. Environmental justice has to be a priority for these and other communities that face disproportionately high health risks from toxic pollution.

Ron White is an independent consultant providing services in the field of environmental health sciences. Mr. White currently is a Senior Fellow with the Center for Science and Democracy at the Union of Concerned Scientists, and also holds a part-time faculty appointment in the Department of Environmental Health and Engineering at the Johns Hopkins Bloomberg School of Public Health. He earned his Master of Science in Teaching degree in environmental studies from Antioch University, and a Bachelor of Arts degree in environmental science from Clark University.  

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.
