UCS Blog - The Equation (text only)

EPA Eliminates Vital Protection to Keep Air Clean of Toxics, Threatening Our Health

1973 EPA photo of a smokestack of the DuPont chemical plant on the Houston Ship Channel.

For decades, the Clean Air Act has protected us from the dangerous health effects of hazardous air pollutants (HAPs). Many of these are toxic: breathing or otherwise ingesting them can cause cancer, as well as respiratory and degenerative neurological diseases that can lead to death. Some, like chlorine and hydrochloric acid, can inflame the lungs and airways. Workplace exposure to styrene, a solvent frequently used to manufacture plastics and synthetic rubber, is linked to degenerative diseases like multiple sclerosis and others similar to Parkinson’s disease. Thanks to strong federal rules that protect us from 187 toxic air pollutants, the EPA estimates that we have avoided emitting 1.5 million tons of these pollutants every year since 1990.

But the EPA recently—and quietly—eliminated these protections. To the already long list of EPA actions that have undermined public protections, and the extensive conflicts of interest of many of the agency’s political appointees, we can now add a new assault on our health and environment: the withdrawal of a long-standing policy known as “once in, always in” (OIAI). The change was made with no public comment and only a muted announcement of a “reinterpretation” of the law. Eliminating OIAI will allow major sources of hazardous air pollutants, like mining smelters and petrochemical manufacturing plants, to discontinue the use of the maximum achievable control technologies (known as “MACT”) that control toxic air emissions.

In a previous post, written before UCS completed our analysis of potential impacts, I warned that the rule change would increase emissions of cancer-causing toxic air pollutants. My colleague Dr. Gretchen Goldman also explained how environmental justice (EJ) communities—typically low-income communities of color already overburdened with environmental hazards in their neighborhoods—would be the most affected by this rule change. Sure enough, our study found that many of the communities that already face high levels of toxic contamination will become even more exposed.

Let’s take the communities of Galena Park and Manchester along the Houston Ship Channel in Houston, TX as an example. Together with our partners—residents of these communities and activists of the EJ organization TEJAS—we showed some time ago that their health and livelihoods are already under assault due to the clustering of multiple industrial facilities that emit large quantities of toxic air pollutants. Congressional District 29 (TX-29), where these communities are located, currently has 15 facilities that have kept emissions under 25 tons per year through the use of MACT; eleven of these facilities could emit an additional 205 tons per year of toxic air pollutants as a result of the EPA’s new policy, an increase of nearly 70 percent!

One major source of hazardous air pollutants, the Deer Park chemical manufacturing plant in Houston, TX (a subsidiary of OxyChem), could increase its emissions of hazardous air pollutants from 0.64 to nearly 25 tons per year if it stops using MACT to control emissions.
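As a quick back-of-the-envelope check of the figures above (a sketch in Python, not an EPA calculation; the district baseline is inferred from the article’s own numbers rather than reported directly):

```python
# Back-of-the-envelope check of the TX-29 figures above. The baseline
# is inferred from the article's own numbers (205 additional tons is
# described as "nearly 70 percent"); it is not a figure reported by EPA.

additional_tons = 205        # potential new emissions, tons per year
reported_increase = 0.70     # "nearly 70 percent"

implied_baseline = additional_tons / reported_increase
print(f"Implied current district emissions: ~{implied_baseline:.0f} tons/year")

# Per-facility view, using the Deer Park example above:
current_tons, threshold_tons = 0.64, 25.0
print(f"Deer Park could emit ~{threshold_tons / current_tons:.0f}x its current level")
```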

The EPA’s new guidance will affect states very differently because some states have their own stringent toxic emissions limits, while others follow federal guidelines. The 21 states that only follow federal guidelines will likely be most affected by the new guidance, as shown in our web feature. But all states rely on the federal designation to some degree, so this problem will occur across the country.

How can you find out how impacted your area may be? You can check out a map we created showing how many facilities in your congressional district could increase toxic emissions. For example, if you click on Montana’s At-Large congressional district, a pop-up window will reveal that 9 out of 17 MACT-subject facilities in the district could emit 212 tons per year of hazardous pollutants, and that the state does not have any additional protections in place to limit toxics. If you scroll the pop-up a bit further down, you will find the name and office phone number of that district’s congressional representative. We encourage you to contact your representative and ask how they will demand that the EPA and your state’s environmental agency protect public health from this dangerous new change.

Take action

There are many ways you can speak up about the concerns you have about the potential for the facilities in your community to release hazardous air pollutants due to this reduction in public protections.

  • If you live in a state where toxic air pollution might increase, push your state legislators to enact stronger state-level laws to protect your community from toxic air pollutants. Below are some ways to engage – consult these tips on communicating with policymakers.
  • Inquire with your state air agency how your area might be affected by the changes to the “once in, always in” mandate. Find your state agency on the EPA site.
  • Utilize the media to bring attention to the issue. Write a letter to the editor, Op-Ed, or meet with local journalists or editorial boards about your concerns. See these tips on best practices.
  • Directly contact the company that operates the facility near you, and ask them to commit to maintaining their classification and use of MACT technology and requirements. Follow up if you don’t hear back in a week and utilize their response (or lack thereof) to hold them accountable for their actions, and/or to show the media, your policymakers, and the public what they are saying.
  • Tell EPA Administrator Scott Pruitt to do his job of carrying out the EPA’s mission of protecting public health and the environment by rescinding the new guidance.
    • Tweet at EPA Administrator Scott Pruitt, the EPA, and tag your members of congress.

Want to receive the latest information on federal attacks to our health, safety, and environmental protections, and customized opportunities for you to push back and advocate for science? If you’re a scientist or technical expert, you can do so by joining the Science Network’s watchdogging initiative. If you’re an advocate within your local community, join the cadre of Science Champions.

Science — The Hidden Gem at the Heart of the EPA and Why You Should Support It

Photo: skynesher/iStockphoto

The role of science in EPA decision-making might, in the vocabulary of former President George W. Bush, be the most “misunderestimated” part of the EPA’s job. Although their work is the foundation of virtually every EPA decision—from regulatory protections to reviews of new chemicals to Superfund cleanups—agency scientists have labored for years under the radar. Career and political professionals appreciate and routinely rely on their work, but their invaluable contributions remain largely invisible.

Not any longer. Scott Pruitt’s obvious distaste for science has pushed EPA science into the headlines. We’ve gone from a norm in which the administrator kept a top-ranking EPA science policy official on virtual speed dial to a multi-faceted Pruitt-era attack on the scientists themselves. Pruitt initially ignored them, then tried to defund them, and is now attempting to hobble their work.

An attempt to slash the EPA’s Science and Technology Account

Let’s start with the 2019 Trump/Pruitt proposed budget, which is virtually identical to their 2018 proposal, one that Congress fortunately and soundly rejected. The 2019 budget would cut the EPA’s Science and Technology budget by 49%. The specifics—all programs funded by the S&T Account that President Trump proposes cutting—illustrate the insult to the health and safety of the American public.

  • A 67% cut to the “Air and Energy” account that looks at how air pollution damages our health and well-being. This is essential information as the EPA decides which pollutants must be reduced and at what levels, and prepares the country to respond to climate change. The Trump budget would ax these programs despite analysis showing that since 1990 the American public has reaped clean air benefits to the tune of $2 trillion compared to estimated costs of $65 billion. Need one good example? This program helps advance the development and use of lower-cost, portable, and user-friendly monitoring devices individuals can use in their own communities to find out what they and their families are breathing. Why cripple one of the EPA’s biggest public health success stories when we actually need to be investing more in our clean air?
  • A 37% cut in “Safe and Sustainable Water Resources” research to protect the lakes, streams, and rivers across our country from which we get drinking water and where we fish, swim, and boat. This research is an essential part of making sure water bodies are healthy and that valuable water isn’t overwhelmed by pollution from factories and other industrial processes. You only have to look at the role of EPA scientists in monitoring and evaluating the algae blooms in Toledo, Ohio, that endangered drinking water for millions of people, or Flint, Michigan’s problems with lead in drinking water, to understand why this account should be funded at even greater levels than it is today.
  • A 61% cut in “Research on Sustainable Communities.” Cities and states across the country rely on the research and planning tools developed in this program as they go about their jobs assuring good environmental and health outcomes. This research also develops and demonstrates new and improved techniques for environmental protection. For example, program researchers found a way to estimate how drinking water, food, dust, soil and air contribute to blood lead levels in infants and young children.
  • A 34% cut in “Research on Chemical Safety and Sustainability” to evaluate how thousands of chemicals, existing and under development, might affect people’s health and the environment. This research allows the EPA to develop the scientific knowledge, tools, and models to conduct integrated, timely, and efficient chemical evaluations. Getting a bit wonky here—because I was personally involved in this effort and watched it grow from a concept into the international leader in innovative approaches to chemical hazard evaluation: this account supports the EPA’s work in computational toxicology, which in turn helps the EPA take on the herculean task assigned to it by the 2016 Lautenberg amendments to the bipartisan Toxic Substances Control Act: analyzing possibly thousands of chemicals for potential risk. These tools integrate knowledge from biology, biotechnology, chemistry, and computer science to identify important biological processes that may be disrupted by the chemicals, and thereby set priorities for their review based on potential human health risks. Putting that program at risk should be a non-starter.
  • Completely eliminating Science to Achieve Results (STAR) Grants that support outside researchers doing cutting-edge work in all of the areas above.

Political takeover of science

Scott Pruitt’s most recent attack on science at the EPA is a back-door attempt to institutionalize a very damaging idea that Congress has failed to enact over multiple years. Pruitt is trying to railroad through a regulation that would throw out scientific studies used in setting EPA rules and other requirements unless the raw data on which the studies are based are made publicly available.

Why is this a terrible idea and threatening to public health? For five decades, EPA’s regulatory protections have relied on many thousands of health-related studies of pollutants, including epidemiological, human, and animal studies. Many examine the relationship between concentrations of various pollutants and their impacts on people’s health.

The raw data on which these studies are based often includes names, dates of birth and death, health and lifestyle information, and subjects’ locations—data that is personally damaging if released and has almost nothing to do with public understanding or the validity of the study’s results. Ethical and legal considerations rightly keep scientists from releasing such personal data. Restricting the use of the data cuts two ways, as it is often submitted by industry in support of its activities as well as by groups arguing for more stringent regulatory controls. There are certainly ways to confirm the validity of the studies independently, as has been done with two keystone air quality studies by the Health Effects Institute (HEI), an organization jointly funded by the EPA and industry.

The Pruitt proposal creates other problems as well: so much time has passed since the leading studies of the impacts of air pollution were compiled in the early 1990s that it would be logistically difficult to retrieve and redact all of the underlying data; this would effectively prevent the use of the most authoritative data available on the impacts of air pollution.

Put in plainer language, Pruitt’s latest attack on science is not good for anyone—industry or the general public.

Attacks on science hit locally

There are many practical, local examples of why cutting science funding is so pernicious and bad for everyone. The EPA, for example, plays a role in the cleanup of Anchorage, Alaska’s Elmendorf Air Force Base, and in that work has had the assistance of another federal body, the Agency for Toxic Substances and Disease Registry (ATSDR), which examined blood samples taken from residents at the base. ATSDR concluded on the basis of science that lead exposure there did not pose a health hazard. There is no way to make such determinations without science and data, in this case medical data such as blood sample test results, which are critical in drawing valid conclusions as to whether regulated facilities, such as Superfund cleanup sites, cause health effects in nearby communities.

Another notable example is the remote native village of Kivalina in northwestern Alaska, which is downstream from the Red Dog Mine. Unsurprisingly, Kivalina residents are worried about how mining activities might threaten their health by contaminating the subsistence foods they hunt and gather. Personal medical data from Kivalina residents was analyzed to determine that they and their food are unlikely to be at risk. The same form of analysis of environmental impacts has been used at other sites, such as the long-running cleanup of asbestos contamination in Libby, Montana.

The EPA’s seminal achievements over almost 50 years include removing lead from gasoline; reducing acid rain to improve water quality; reducing second-hand smoke exposure; improving vehicle efficiency and emission controls; and encouraging a shift to rethinking of wastes as materials.

Evaluating and acting on science—the best available science—and having the funding to ensure science expertise is on tap at the EPA is the linchpin of every one of these achievements. Congress and the American public should tell Pruitt to back off from his attacks on EPA science and make sure the agency has the funding it needs to do its job.

Robert Kavlock is the former Acting Assistant Administrator for the EPA Office of Research and Development and an EPA Science Advisor (retired). He is currently a member of the Environmental Protection Network, a nonprofit organization of EPA alumni working to protect the agency’s progress toward clean air, water, land and climate protections.

What Happened During the Hasty White House Review of EPA’s Science Restriction Rule?

We already know that the production of Administrator Scott Pruitt’s rule to restrict science at the EPA was purely political, but it’s possible that there’s a whole new layer of politics that went on at the White House level as well.

Source: reginfo.gov

On Thursday, April 19, the White House Office of Management and Budget’s (OMB) Office of Information and Regulatory Affairs (OIRA) received a draft proposed rule from the EPA titled “Strengthening Transparency and Validity in Regulatory Science.” It was signed promptly by Administrator Pruitt on Tuesday. At first, the OMB’s website showed the review as completed on Wednesday (meaning that Pruitt had signed the document before it was cleared by the White House), but later in the week OMB backdated its review completion date to Monday. That means not only that there is likely some funny business going on between the EPA and OIRA, but that OIRA had four days (and little more than one to two full work days) to review a proposed rule that would dramatically impact the way the EPA uses science in future rulemakings.

In just a couple days of OIRA review, a UCS analysis of the rule before and after review shows that it grew by four pages and that its scope was narrowed to rules considered to be “significant regulatory actions” and to the dose response data and models that underlie “pivotal regulatory science.” While the docket does not currently include details on who made those changes, if OIRA staff was responsible for changing the scientific basis of this rule, there is certainly reason to be concerned. White House review under Executive Order 12866 is supposed to be limited to cost-benefit analysis and overlap with other agencies, and should in no way change the scientific content of the agency’s work. Interference from the White House in this area doubles down on the already implicit affront to scientific integrity at the EPA that this rule represents.

We compared the start and conclusion documents from OIRA’s EO 12866 review of EPA’s science restriction policy and noticed that post-review changes (those in blue) included narrowing its scope to cover the dose response data and models underlying “pivotal regulatory science.” Source: regulations.gov


The policy post-OIRA review also included definitions for “dose response data and models” and “pivotal regulatory science.” Source: regulations.gov

Not only are there questions about OIRA’s role in changing the content of the rule, but the rule’s mad dash through White House review is not normal even by Administrator Pruitt’s standards. OIRA review of proposed rulemakings, required under President Bill Clinton’s Executive Order 12866, is supposed to take under 90 days, with the possibility of extending to 120 days if absolutely needed. If we operate under the assumption that a 90-day review is an adequate amount of time for OIRA to review a rule, make sure the costs and benefits have been thoroughly analyzed, allow time for interagency review and meetings with stakeholders, and then suggest changes to the agency, exactly how inadequate is a one-to-two-day review? According to OIRA’s regulatory review data, in the time Pruitt has been at the EPA, OIRA has reviewed 41 EPA rules that were not economically significant (including the policy in question). The average review time for those rules? 52 days. In fact, only 6 other EPA rules have gone through review in less than a week in this period, several of which were addenda to rules (like definitions, delays, or stays).
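For readers who want to check figures like these themselves, here is a minimal sketch of the calculation involved, using made-up dates in the style of reginfo.gov records (illustrative placeholders only; the real analysis would pull every EPA review since Pruitt took office):

```python
from datetime import date

# Hypothetical (received, concluded) pairs in the style of reginfo.gov
# entries -- illustrative placeholders, not actual OIRA records.
reviews = [
    (date(2017, 5, 1), date(2017, 7, 10)),
    (date(2017, 8, 14), date(2017, 9, 29)),
    (date(2018, 4, 19), date(2018, 4, 23)),  # a four-day review like the one above
]

durations = [(concluded - received).days for received, concluded in reviews]
print(f"Average review time: {sum(durations) / len(durations):.0f} days")
print(f"Reviews completed in under a week: {sum(d < 7 for d in durations)}")
```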

So what’s the problem with such a quick turnaround from the White House?

UCS has in the past taken issue with extensive delays in OIRA review, especially under the Obama administration, that held up important science-based public health protections in regulatory limbo. While an overly long OIRA review period bogs down the regulatory process, a dramatically swift review may allow rules to be proposed without the proper analysis to back them up. This is precisely what we’re now seeing with the EPA’s proposal to restrict science. In it, the EPA claims the rule is not economically significant, citing no analysis of its potential costs and benefits. It calls for a system to make scientific data publicly available but cites no existing database that would be able to handle all of the EPA’s “pivotal regulatory science.” It does not include protections for privacy or confidential business information, and it gives the EPA administrator the ability to exempt rules from the requirements on a case-by-case basis. And finally, it reveals just how feeble the rule is by posing 25 substantive questions (almost four pages’ worth) for commenters to answer in 30 days—the shortest comment period window possible for a rule.

UCS has submitted a comment to the EPA asking for an extension of this woefully insufficient comment period for such a sweeping rule, and we are joined by many other organizations doing the same. House Science Committee Ranking Member Eddie Bernice Johnson and Energy & Commerce Environment Subcommittee Ranking Member Paul Tonko were joined by 63 Democratic colleagues on a letter calling for a 90-day comment period because “regardless of viewpoint, there is agreement that the proposed rule would be a significant change in how the agency considers science in policymaking.” It is imperative that Administrator Pruitt heed this call to ensure all stakeholders have a chance to meaningfully participate in this process, which has been bungled in a variety of ways since this rule began as just a twinkle in Lamar Smith’s eye.

Photo: Matthew Platt/CC BY-SA (Flickr)

What is the Connection Between New Mobility and Transportation Equity?

My name is Richard Ezike, and I work at the interface between new mobility and transportation equity. When I talk about “new mobility” in my research, I refer to what is arguably the most disruptive technology in transportation in the last century: the autonomous vehicle (AV). Already these cars are being tested on America’s roadways in Chandler, Arizona; Pittsburgh, Pennsylvania; and Silicon Valley. Companies like Uber, Lyft, Waymo, Ford, and General Motors are investing billions of dollars to bring this technology quickly to market, touting widespread adoption within the next 5 to 10 years.

However, more discussion is needed on the impacts of these cars on transportation equity because this nexus is often ignored in the spaces where AVs are being debated and discussed. The million-dollar question is: Will AVs help or hurt the mobility of low-income people and people of color? The pursuit to tackle that question has led me here to the Union of Concerned Scientists (UCS).

My project works to address this question from two angles. First, we are working with a transportation consulting firm to study the potential impact of self-driving technology on access, equity, congestion, and transit utilization in the DC Metro Area, where I personally live and work. The firm is using a travel demand model developed by the area metropolitan planning organization (MPO), the National Capital Region Transportation Planning Board, to predict the impacts of AVs on vehicle miles traveled, vehicle trips, and transit trips in 2040. By modifying the inputs to the model, we can simulate the impacts of self-driving cars on future transportation network performance. The detailed nature of the model gives us insight into specific neighborhoods that may gain or lose under a variety of future scenarios.
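To give a feel for what “modifying the inputs” means in practice, here is a toy illustration (emphatically not the TPB’s travel demand model): one common way to represent AVs in such models is to lower the perceived cost of driving and let travel demand respond. Every number below is a placeholder assumption.

```python
# Toy illustration of representing AVs in a travel demand model --
# NOT the National Capital Region TPB model. All values are placeholders.

base_vmt = 100_000_000   # baseline daily vehicle miles traveled (assumed)
cost_change = -0.30      # AVs cut the perceived cost of driving by 30% (assumed)
elasticity = -0.4        # VMT elasticity with respect to driving cost (assumed)

av_vmt = base_vmt * (1 + cost_change) ** elasticity
print(f"Simulated AV-scenario VMT: {av_vmt:,.0f}")   # ~15% higher than baseline
```

A real regional model distributes such responses across thousands of travel zones, which is what lets us see which specific neighborhoods gain or lose.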

Second, we are engaging stakeholders to learn their thoughts and concerns about AVs. To date, I have interviewed over 40 stakeholders including local government officials, car dealers, community leaders, and policy makers. I have asked them about the potential impacts of AVs on traffic, labor, the environment, and the economy. In early 2019, we plan to convene stakeholders to discuss our research findings, get feedback, and generate policy recommendations to share with local leaders and community groups.

Using this two-pronged approach will provide our community with both technical and community-based knowledge that will assist in the planning of how AVs can be deployed safely and equitably.

Defining transportation equity

Historically, members of disadvantaged groups (low-income residents, minorities, children, persons with disabilities, and older adults) have experienced the most negative impacts of the transportation system. These groups have lower car ownership levels, the longest commute times, and the highest transportation costs. These same groups also live near inadequate infrastructure, which creates unsafe conditions for cycling and walking and, in turn, an increased number of pedestrian and cyclist fatalities.

Low-income and minority communities are also more likely to be located near highways and other transportation facilities that produce local air pollution; to suffer from negative health effects such as asthma; and to have the least accessibility to key destinations such as parks, hospitals, and grocery stores selling healthy food. Addressing these issues requires a dedicated effort to address equity in the transportation system to provide equal access for all people.

Equity is defined as the fairness, impartiality, and justness of outcomes, including how positive and negative impacts are distributed. Within transportation and infrastructure, the decisions made in the planning stages can significantly affect the level of equity achieved in communities.

Depending on how it is deployed, autonomous vehicle technology could reduce transportation inequities; but without guidance, the detrimental effects on disadvantaged groups may only get worse. Moreover, solving these problems is not a purely technical challenge; it requires meaningful engagement and input from communities with a stake in the outcomes, so they can have a voice in the way their city is developed. Historically, public engagement has been a secondary consideration, although many MPOs are stepping up their efforts. Based on work by Dr. Alex Karner, effective engagement can be broken into three steps:

  1. Identify current unmet needs in the communities – This requires engaging with community groups to learn how MPOs can best serve residents.
  2. Provide funds to assist community groups in engagement – Engagement can be time-consuming and expensive, and often community groups do not have the bandwidth in time or in funding for outreach. Therefore, the MPOs should provide resources to assist. Karner suggested raising money through state taxes or allocating from available transportation funds.
  3. Measure progress of outreach using relevant metrics – MPOs must track how effectively they are engaging communities. They need to know how many people they talked with and whether those people understood the material being discussed. By tracking that information, MPOs will know if their message is getting across.

Throughout my fellowship I have had the opportunity to interview several stakeholders to learn how they see autonomous vehicles impacting equity. Across the board, there is a definite interest in how the broad impacts of AVs will manifest themselves in society, and at UCS my research will help to bring these various groups together. My engagement with these groups is helping to identify unmet needs, identify relevant metrics from stakeholders, and stress the importance of safe and equitable AV deployment.

Why new mobility and equity must function together

I have talked with transit advocates who worry that AVs will replace public transit while failing to meet the needs of transit-dependent communities and eliminating thousands of transit worker jobs.

I have spoken to business owners who believe the benefits of autonomous vehicles, such as increased access to the transportation system for people with disabilities and senior citizens, outweigh any potential pitfalls. I have heard viewpoints from local government officials ranging from very concerned to “we have not thought about AVs yet,” and some state departments of transportation are taking a hands-off approach.

These discussions make clear that the paradigm shift is happening. Autonomous car technology is here, and billions of dollars are being spent to put these cars on the roads as fast as possible. However, the conversations that are most needed – about potential impacts on transportation equity and accessibility, the effects on public transit, and environmental considerations – are not happening quickly enough. They need to happen more often, and soon. Through my fellowship at UCS, I aim to increase this awareness and provide new research, analysis, and recommendations to advance equitable transportation outcomes.

Regulators Should Think Twice Before Handing Out Pollution Credits for Self-Driving Cars

A new report out by Securing America’s Future Energy (SAFE) suggests that automakers should get credits toward meeting emission and fuel economy standards for connected and automated vehicles (AVs) and related advanced driver assist systems—technologies that may or may not save any fuel. Doing so would not only increase pollution and fuel use, but would seriously undermine the integrity and enforceability of regulations that have delivered enormous benefits to our environment, our pocketbooks, and our national security. The tens of thousands of traffic-related fatalities every year in the U.S. demand that automakers and regulators continue to make our cars safer. But trying to encourage greater deployment of safety technologies by undermining pollution standards is the wrong approach.

Here’s why regulators should reject giving emissions credits to manufacturers for deploying safety and self-driving technologies.

Including emissions credits for safety and self-driving technologies in 2022-2025 vehicle standards would be a windfall for automakers, resulting in less deployment of proven efficiency technologies and more pollution.

There are more questions than answers about the potential impacts of various safety technologies and self-driving capabilities on vehicle and overall transportation system emissions, which I’ll get into more below.  But for now, let’s just take a big leap of faith and assume that some safety technologies actually do lower an individual vehicle’s emissions.

One example is adaptive cruise control.  This technology automatically adapts a vehicle’s speed to keep a safe distance from a vehicle ahead and theoretically could perform more efficiently than a human driver.  It is widely available and featured on vehicles like the Toyota Camry, Honda Accord and Ford Fusion.  One study examined this technology and found changes in efficiency could range from +3 to -5 percent during various types of driving. While there is some evidence that under certain conditions there might be a slight fuel economy benefit from this technology when it is in use, that same evidence indicates that increased fuel use and emissions are also possible.
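A tiny sketch of why a range like that matters: the fleet-wide outcome depends on the mix of driving conditions, and plausible mixes produce a net loss. The shares below are made-up placeholders, not data from the study.

```python
# Why a "+3 to -5 percent" range can mean a net fuel penalty.
# Condition shares and efficiency changes are illustrative placeholders.

conditions = [
    ("smooth highway cruising", 0.40, +0.03),    # efficiency gain
    ("dense, stop-and-go traffic", 0.60, -0.05), # efficiency loss
]

net = sum(share * change for _, share, change in conditions)
print(f"Net efficiency change: {net:+.1%}")   # -1.8% in this mix
```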

In another recent study of self-driving cars, researchers found that while eco-driving capabilities could potentially provide savings, the increase in electric power demand, added weight, and aerodynamic impacts of sensors and computers would increase fuel use and emissions.  Both of these examples demonstrate the importance of testing and verifying any assumed change in emissions from the deployment of safety and self-driving technology as emissions reductions are anything but certain.

But even if credible testing and data were available, giving off-cycle credits for this technology within existing standards would be a giveaway to the auto industry.

Why? Adaptive cruise control is already being deployed on millions of cars – 1 in 5 new vehicles produced for the US market in model year 2017 were equipped with adaptive cruise control. Automatic emergency braking is another example, where automakers have already made commitments to make it standard on nearly all cars by 2022. Giving credits for these technologies would be a windfall for manufacturers and result in less deployment of proven fuel efficiency technologies.

In its review of the current off-cycle credit program, the ICCT identified this same issue of providing credits for technology deployment that is already occurring, and concluded that the program greatly reduces the deployment of other efficiency technology. The ICCT also identified, as another big problem, the lack of empirical evidence to validate the claimed fuel economy and emissions benefits of several technologies already included in the program. And currently there is little empirical data to validate any efficiency benefits of safety and self-driving technologies.

Providing credits for emissions and fuel consumption impacts that are difficult to measure and not directly related to a vehicle – like possible impacts on traffic congestion—would increase pollution and undermine the standards.

Expanding the off-cycle program for safety technologies that might directly impact a vehicle’s emissions is just the tip of the iceberg. The off-cycle credit program, like the vehicle standards in general, is limited to emissions directly related to the performance of a vehicle. But some automakers, and SAFE, are interested in allowing credits based on potential changes in emissions from the transportation system as a whole. For example, automakers could earn credits toward compliance with vehicle standards for some future changes in traffic congestion that might result from the deployment of improved vehicle safety technologies. This would be a major change to the per-vehicle basis of the fuel economy regulations that were established in the 1970s.

There are several serious problems with including speculative, indirect emissions impacts in existing vehicle standards.

1. Providing credits for emissions reductions that may or may not ever happen in the future will increase pollution in the short term and may never result in emission reductions in the long term

We only need to look back at the flex fuel vehicle (FFV) loophole to find an example of this kind of failed policy. Automakers were given fuel economy credits for selling cars capable of running on fuel that is 85 percent ethanol (known as E85), under the theory that this would help drive E85 to market and we would use less oil. Several automakers used it as a compliance strategy and avoided investing in other fuel efficiency technologies. But the cars almost never actually used E85, which means instead of getting more efficient vehicles, we got more oil use. The increased fuel consumption resulting from the FFV loophole is estimated to be in the billions of gallons.

Crediting future emissions reductions based on hopes and dreams has been tried before and doesn’t work.

2. Ignoring the potential negative impacts from self-driving technologies is a HUGE problem.

Self-driving cars have the potential for both positive AND negative impacts on pollution and energy use.

The biggest X-factor is how drivers will respond to these new technologies, which make vehicles safer but also make them easier to drive (or not drive at all, as the case may be). A paper by Wadud et al. examined a range of direct and indirect impacts self-driving vehicles could potentially have on emissions. There are several possibilities, some of which could reduce emissions while others could increase emissions dramatically (see figure). Increased emissions could result from higher highway speeds enabled by increased vehicle safety; from larger vehicles with more features, as drivers come to expect more amenities while their car drives them around; and, most importantly, from increases in the amount of vehicle travel overall. Combined, these effects could increase emissions by more than 100 percent, according to the study.

Automated vehicles could have both positive and negative impacts on energy consumption and emissions. Wadud et al.

We’ve already experienced increased highway speeds as vehicles have become safer with seatbelts, air bags, and a host of other safety technologies. And it’s not hard to imagine increases in vehicle miles traveled as cars take over the task of driving so we can do other things. Just think about it for a minute: what different choices might you make if you didn’t have to drive your own car? Living farther from work or taking that extra trip during Friday rush hour might not seem so bad anymore when you can read a book or watch a movie while your car chauffeurs you to wherever you want to go.
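To see how individually modest effects can stack up to the study’s “more than 100 percent” figure, consider how multiplicative factors compound. The percentages below are illustrative placeholders in the spirit of the Wadud et al. ranges, not the paper’s exact numbers.

```python
# How separate emissions effects compound multiplicatively.
# All percentages are illustrative placeholders.

effects = {
    "higher highway speeds": 0.10,
    "larger, feature-laden vehicles": 0.15,
    "new users and induced travel": 0.60,
}

combined = 1.0
for name, change in effects.items():
    combined *= 1 + change

print(f"Combined emissions change: {combined - 1:+.0%}")   # +102% in this sketch
```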

Based on the current scientific literature, SAFE’s estimate of potential efficiency improvements from automated vehicles is misleading at best. Their analysis ignores any possible disbenefits, like increased vehicle travel, even while specifically acknowledging that AVs “can also give drivers one thing of tremendous value to most Americans – an increase in personal or productive time.” The analysis also takes the upper range of efficiency benefits from a handful of studies that estimated them over limited driving situations, and inappropriately applies those benefits to all driving. The conclusion that a handful of safety technologies could reduce emissions 18-25 percent across the entire vehicle fleet is not supported by current evidence, ignores the other effects of self-driving cars, and is not a sound basis for policymaking decisions.

My point isn’t that we should prevent self-driving technology and the many potential benefits it could deliver if done responsibly.

But vehicle standards aimed at reducing emissions and fuel consumption shouldn’t include credits for potential positive changes to transportation system emissions while ignoring the negative ones.

3. Finally, regulatory enforceability and accountability—the key to the success of today’s vehicle standards—would be severely undermined

The effectiveness of vehicle standards, or any standards for that matter, depends on enforcement that ensures regulated entities all participate on a level playing field and that the actual benefits of the standards are realized. We’ve seen the importance of enforcement over the decades as automakers have been held accountable for the performance of their products. Think ‘VW diesel scandal’ for one, and the numerous examples of erroneous fuel economy labels (Ford and Hyundai-Kia, to name just two). These enforcement actions have one important thing in common: regulators were able to perform tests on the vehicles to determine whether they were performing as the automakers claimed, and to demonstrate that they were not.

Current vehicle standards are robust because they are predicated on direct emissions and fuel savings benefits that are verifiable at the vehicle level. An automaker makes a car, the car is tested, and the automaker is held accountable for the results. How might a regulator, or an automaker, test and verify the congestion impacts of an individual Cadillac CT6 with Super Cruise?

Providing credits to automakers for emission reduction benefits that cannot be verified or attributed to an individual manufacturer, never mind an individual vehicle make or model, would be a massive change in approach to the program, introduced through a mechanism – the off-cycle credit provisions – that was never intended to be more than a small part of automaker compliance.

Where’s our insurance policy?

SAFE makes the case that giving away credits to automakers now, even without proof that these technologies reduce fuel use and emissions, is worth it because it would allow EPA and NHTSA to run a research program to understand the fuel economy impacts of self-driving technology. But why should we accept increased pollution as the price of collecting information? A better path forward for regulators is to indicate their intention to consider the direct vehicle emissions and fuel economy impacts of safety and self-driving technology when setting post-2025 vehicle standards, and to implement a testing program now to collect the data needed to decide whether giving credits for these technologies is appropriate. This would motivate automakers to do their own testing and to work with EPA and NHTSA to develop appropriate test procedures for ensuring the claimed benefits actually occur.

If safety and self-driving technology off-cycle credits are a proposed solution to the current impasse over 2022-2025 vehicle standards between federal regulators, the auto industry, and California, then we all need to be clear about the costs. They would provide windfall credits to auto companies for something they are already doing, while stalling deployment of proven efficiency technologies and increasing emissions.  If indirect changes in transportation system emissions and fuel consumption are included, such as some theoretical impacts on congestion sometime in the future that may or may not happen, the move would risk undermining the foundation of the standards themselves.

We should not be forced to make a choice between improving vehicle safety and reducing emissions. We need to protect the public from vehicle crashes and protect the public from pollution. If there is proven safety technology that is saving lives, automakers should deploy it and safety regulators should require it. But moving from a regulatory structure that is built on verifiable and enforceable emission reductions to one that is based on speculation and indirect impacts is a dangerous move that should be avoided.


The Health and Safety of America’s Workers Is at Risk

Saturday, April 28, may have seemed like just another Saturday. Some of us likely slept a little later and then got on to those household chores and tasks we couldn’t get to during the week. Some of us enjoyed some leisure time with family and friends. Many of us got up and went to work—maybe even to a second or third job.

But April 28 is not just another day. Here in the US and around the world, it’s Workers’ Memorial Day—the day each year that recognizes, commemorates and honors workers who have suffered and died of work-related injuries and illnesses. It is also a day to renew the fight for safe workplaces. Because too many workers lose their lives, their health, their livelihoods or their ability to fully engage in the routine activities of daily living because of hazards, exposures and unsafe conditions at work.

Unless you know someone who was killed or seriously injured on the job, you probably don’t give workplace safety much thought. Perhaps you think work-related deaths, injuries and illnesses are infrequent, or only affect workers in demonstrably risky jobs—like mining or construction. The actual statistics, however, tell a different story. (For a more detailed and visual look, see this Bureau of Labor Statistics [BLS] charts package.)

Fatalities: In 2016 the number of recorded fatal work injuries was 5,190. On average, that’s 14 people dying every day. In the United States. It’s also 7 percent more than the number of fatal injuries reported in 2015 and the highest since 2008. Most of these deaths were the result of events involving transportation, workplace violence, falls, equipment, toxic exposures, and explosions. And the 2016 data reveal increases in all but one of these event categories. That’s not going in the right direction.

Non-fatal cases: According to the BLS, private industry employers reported 2.9 million non-fatal workplace injuries and illnesses in 2016, nearly one third of which were serious enough to result in days away from work—the median being 8 days. For public sector workers, state and local governments reported another 752,600 non-fatal injuries and illnesses for 2016.

Costs: And then there’s the enormous economic toll that these events exact on workers, their families, and their employers. According to the 2017 Liberty Mutual Workplace Safety Index, the most serious workplace injuries cost US companies nearly $60 billion per year.

But that’s just a drop in the bucket. The National Safety Council estimates the larger economic cost of fatal and non-fatal work injuries in 2015 at $142.5 billion. Lost time estimates are similarly staggering: 99 million production days were lost in 2015 due to work injuries (65 million of them from injuries occurring that year), with another 50 million days expected to be lost in future years due to on-the-job deaths and permanently disabling injuries that occurred in 2015.

And even these costs don’t come close to revealing the true burden, as they do not include the costs of care and losses due to occupational illness and disease. A noteworthy and widely cited 2011 study estimated the number of fatal and non-fatal occupational illnesses in 2007 at more than 53,000 and nearly 427,000, respectively, with cost estimates of $46 billion and $12 billion, respectively.

Who foots the bill and bears these enormous costs? Primarily injured workers, their families, and taxpayer-supported safety net programs. Workers’ compensation programs cover only a fraction. See more here and here.

The other part of the story

As sobering as these data and statistics are, they tell only part of the story; the true burden of occupational injury and illness is far higher. Numerous studies find significant under-reporting of workplace injuries and illnesses (see here, here, here, here, and here). Reporting of occupational disease is particularly fraught, as many if not most physicians are not trained to recognize or even inquire about the hazards and exposures their patients may have encountered on their jobs.

Nor do the statistics reveal the horror, loss, pain, and suffering these injuries and diseases entail. In the words of Dr. Irving Selikoff, a tireless physician advocate for worker health and safety, “Statistics are people with the tears wiped away.”

Just imagine having to deal with the knowledge that a loved one was suffocated in a trench collapse; asphyxiated by chemical fumes; shot during a workplace robbery; seriously injured while working with a violent patient or client; killed or injured from a fall or a scaffolding collapse; or living with an amputation caused by an unguarded machine.

Or the heartache of watching a loved one who literally can’t catch a breath because of work-related respiratory disease. Or is incapacitated by a serious musculoskeletal injury. Or has contracted hepatitis B or HIV because of exposure to a blood-borne pathogen at work.

And here’s the kicker: virtually all work-related injuries and illnesses are preventable. There’s no shortage of proven technologies, strategies and approaches to preventing them. From redesign, substitution and engineering solutions that eliminate or otherwise control hazards and exposures to safety management systems, worker training programs, protective equipment, and medical screening and surveillance programs, there are multiple paths to prevention. And, as a former assistant secretary of labor for occupational safety and health, David Michaels, recently wrote in Harvard Business Review, safety management and operational excellence are intimately linked.

Historic progress now at risk

The Good News: It’s important to note and remember that workplace health and safety in the US is a lot better than it was before Congress enacted the Occupational Safety and Health Act of 1970, and has improved even since 2000. This progress has resulted in large measure from the struggles of labor unions and working people, along with the efforts of federal and state agencies. Workplace fatalities and injuries have declined significantly, and exposures to toxic chemicals have been reduced.

It is also a testament to the effectiveness of health and safety regulations and science-based research. We can thank the Occupational Safety and Health Administration (OSHA), the Mine Safety and Health Administration (MSHA), and the National Institute for Occupational Safety and Health (NIOSH) for many of these protections and safeguards. We must also acknowledge and thank the persistence, energy, and efforts of the workers, unions, researchers, and advocates that have pushed these agencies along the way.

The Red Flags: There are numerous indications that this progress will be slowed or even reversed by a Trump administration intent on rolling back public protections and prioritizing industry interests over the public interest. For example:

  • Right off the bat, the president issued his two-for-one executive order requiring agencies to rescind two regulations for each new one they propose. So, to enact new worker health and safety protections, two others would have to go.
  • OSHA has delayed implementation or enforcement of several worker protection rules that address serious health risks and were years in the making—namely silica, the cause of an irreversible and debilitating lung disease, and beryllium, a carcinogen and also the source of a devastating lung disease.
  • OSHA has left five advisory committees to languish—the Advisory Committee on Construction Safety and Health; the Whistleblower Protection Advisory Committee; the National Advisory Committee on Occupational Safety and Health; the Federal Advisory Council; and the Maritime Advisory Committee—thus depriving the agency of advice from independent experts and key stakeholders. Earlier this week, a number of groups, including the Union of Concerned Scientists, sent a letter to Secretary of Labor Acosta asking him to stop sidelining the advice of independent experts.
  • President Trump signed a resolution that permanently removed the ability of OSHA to cite employers with a pattern of recordkeeping violations related to workplace injuries and illnesses. Yes, permanently, because it was passed under the Congressional Review Act. And Secretary Acosta recently seemed hesitant to commit to not rescinding OSHA’s rule to improve electronic recordkeeping of work-related injuries and illnesses.
  • Having failed in efforts to cut some worker health and safety protections and research in his FY18 budget proposal, the president is going at it again with his FY19 proposal. He is calling for the elimination of the US Chemical Safety and Hazard Investigation Board and OSHA’s worker safety and health training program, Susan Harwood Training Grants. There is, however, a tiny bit of good news for workers in President Trump’s proposed budget for OSHA; it includes a small (2.4 percent) increase for enforcement, as well as a 4.2 percent increase for compliance assistance. Of note, employers much prefer compliance assistance over enforcement activities.
  • The president’s budget also proposes to cut research by 40 percent at the National Institute for Occupational Safety and Health (NIOSH)—the only federal agency solely devoted to research on worker health and safety—and eliminate the agency’s educational research centers, agriculture, forestry and fishing research centers and external research programs.
  • He has also proposed taking NIOSH out of the CDC, perhaps combining it later with various parts of the National Institutes of Health. Never mind that NIOSH was established as a distinct entity by statute, the Occupational Safety and Health Act of 1970.
  • The Mine Safety and Health Administration (MSHA) has also jumped on the regulatory reform bandwagon. The agency has indicated its intent to review and evaluate its regulations protecting coal miners from black lung disease. This at a time when NIOSH has identified the largest cluster of black lung disease ever reported.
  • EPA actions are also putting workers at risk. Late last year, the EPA announced that it will revise crucial protections for more than two million farmworkers and pesticide applicators, including reconsidering the minimum age requirements for applying these toxic chemicals. Earlier in the year, the agency overruled its own scientists when it decided not to ban the pesticide chlorpyrifos, thus perpetuating its serious risk to farmworkers, not to mention their children and users of rural drinking water. And the agency has delayed implementation of its Risk Management Plan rule to prevent chemical accidents for nearly two years.
  • The Department of the Interior is following up on an order from President Trump to re-evaluate regulations put into place by the Obama administration in the aftermath of the 2010 Deepwater Horizon accident, which killed 11 offshore workers and created the largest marine oil spill in US drilling history.
  • And then there’s a new proposal at the US Department of Agriculture that seeks to privatize the pork inspection system and remove any maximum limits on line speeds in pig slaughter plants. Meatpacking workers in pork slaughterhouses already have higher injury and illness rates than the national average. Increasing line speeds would only increase their risk.

Remember and renew

The Trump administration makes no bones about its (de)regulatory agenda. The president boasts about cutting public safeguards and protections, and his agency heads are falling right in line. Our working men and women are the economic backbone of our nation. They produce the goods and services we all enjoy, depend on, and often take for granted. They are our loved ones, our friends, and our colleagues. They deserve to come home from work safe and healthy.

Workers’ Memorial Day is a time to pause and remember workers who have given and lost so much in the course of doing their jobs. It is also a time to renew our vigilance and be ready to use our voices, votes, and collective power to demand and defend rules, standards, policies, and science-based safeguards that protect our loved ones at work. Let’s hold our elected leaders and their appointees accountable for the actions they take—or don’t take—to protect this most precious national resource.

This post originally appeared in Scientific American.

How Important is it for Self-Driving Cars to be Electric?

A Waymo self-driving car on the road in Mountain View, CA, making a left turn. CC-BY-2.0 (Wikicommons).

The rapid development of self-driving technology has raised many important questions such as the safety of automated vehicles (AVs) and how they could radically alter transportation systems. These are critical questions, but AVs also have the potential to result in significant changes to the global warming emissions from personal transportation.

An interesting recent study from the University of Michigan and Ford Motor Company lays out the details of the likely changes in emissions from using an AV system on both electric and gasoline cars. The main takeaway from the study is that adding AV equipment to a car adds weight, aerodynamic drag, and electrical power consumption, all of which increase fuel consumption. More efficient driving by connected and automated vehicles could offset some of these emissions, but by far the largest influence on emissions is the choice of fuel: gasoline versus electricity.

Direct emissions versus behavioral and usage changes

Switching from human control to fully automated driving will have direct effects on emissions as well as changes to the amount we use vehicles. Direct emissions changes include reductions in efficiency from factors like increased drag from sensor equipment and the power consumption of required computing and communications equipment. Positive direct impacts could include more efficient driving, such as smooth and precise acceleration control in an automated system.

Automation will also change how we use cars and how much we use them, indirectly affecting emissions, though the effect of AVs on these indirect emissions is much more speculative. While some changes, like “right-sizing” (for example, having smaller one- or two-occupant cars available for solo trips), could decrease emissions, many of the usage changes considered would increase vehicle usage and therefore emissions. Making long distance driving easier or more productive could encourage people to live farther from their jobs. Having fully automated vehicles will mean more people can use a car. The elderly, the blind, the young, and people with disabilities could switch from transit to a car, or simply add trips that otherwise would not have happened. While many of these uses of AVs would be beneficial, it’s important to understand the potential emissions from AVs and how we could minimize the total contribution of global warming pollution from personal transportation.

That’s why this new study is important: it lets us at least estimate the direct, short-term implications of AV technologies on emissions. While it doesn’t examine the potential impacts of driving more, it does shed light on the direct effects of adding these new features to cars.

AV equipment increases fuel consumption, especially for gasoline vehicles

Focusing on the physical changes to the vehicle, the addition of self-driving and sensor equipment changes the fuel consumption (and therefore emissions) of the AV in three major ways. First, the additional weight of the equipment decreases efficiency. Second, AVs that have sensor equipment like cameras and LiDAR (laser-based imaging) often require side bulges and roof-mounted equipment pods. Like a conventional cargo rack, these additions are detrimental to fuel economy because they increase the vehicle’s aerodynamic drag. Lastly, the sensors and computing equipment that enable self-driving require additional electrical power beyond that of a conventional vehicle. For a gasoline car, this means added load on the engine to power an alternator (and therefore higher gasoline consumption), while a battery electric car will have reduced overall driving efficiency (and therefore shorter range between recharges).
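A minimal sketch of those three mechanisms for a gasoline car at steady highway speed appears below. Every parameter is a rough assumption for illustration, not a number from the Michigan/Ford study.

```python
# Sketch of the three AV fuel-consumption mechanisms described above,
# for a gasoline car at steady highway speed. All values are rough
# assumptions, not figures from the Michigan/Ford study.

RHO = 1.2           # air density, kg/m^3
SPEED = 29.0        # ~65 mph, in m/s
ENGINE_EFF = 0.25   # fraction of fuel energy delivered as useful work

def extra_fuel_power(added_kg, added_cda_m2, electrical_w):
    """Extra fuel power (watts) needed for AV weight, drag, and electronics."""
    rolling = 0.009 * added_kg * 9.81 * SPEED      # added rolling resistance
    drag = 0.5 * RHO * added_cda_m2 * SPEED ** 3   # added aerodynamic drag
    return (rolling + drag + electrical_w) / ENGINE_EFF

# A hypothetical "large" roof-pod system: +20 kg, +0.07 m^2 of drag area,
# +500 W of computing and sensor load.
print(f"Extra fuel power at highway speed: {extra_fuel_power(20, 0.07, 500):,.0f} W")
```

The drag term grows with the cube of speed, which is why the roof-mounted pods on large AV systems dominate the penalty.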

Waymo’s AV minivan adds sensors and computing systems that increase weight, drag, and electrical power consumption. This model was used as an example of a ‘large’ sized AV system in the referenced study. Image source: Waymo

The researchers from Michigan and Ford examined three sizes of AV systems that could be added to vehicles: a small system with integrated sensors, similar to a Tesla Model S; a medium-sized system with smaller external sensors, similar to a Ford AV prototype; and finally a large AV system modeled after Waymo's modified Chrysler Pacifica AV. While all AV systems have a negative impact on fuel consumption and emissions, the largest impact comes from the increased drag of the large AV system.

AV systems can increase the global warming emissions attributed to driving. The largest impact comes from larger AV systems, due to drag from the sensor units.

Improved driving behavior and other savings from AVs are possible in the long run

The study also points out the possibility of fuel savings from having self-driving and connected cars. These savings could come from several sources. For example, AVs could have more efficient acceleration and braking ("eco-driving"), especially if they are communicating with other cars to anticipate speed changes in traffic. AVs could also communicate with infrastructure like traffic signals to reduce idling and stop-and-go driving. On highways, groups of connected AVs could drive much closer together than a human driver could. This 'platooning' technology can increase fuel efficiency by reducing aerodynamic resistance, similar to the drafting that competitive cyclists and NASCAR drivers use to save energy. AV technology could also increase fuel consumption, however: automated cars might be able to drive safely on the highway at higher speeds, and higher speeds reduce efficiency.

These factors are currently harder to quantify than the impact of the AV equipment, and some of the potential benefits require that most or all cars on the road be at least connected, if not fully automated. For example, platooning would require multiple AVs traveling on the same roadway at the same time, which would require a critical mass of deployed AVs. The researchers estimate an average potential emissions savings of 14 percent from these technologies if fully implemented. However, they do not consider vehicle changes that already deliver some of these benefits, such as improved aerodynamics (which provides some of the same benefits as platooning) or stop-start systems (which already reduce some of the adverse impacts of stop-and-go traffic and intersections).

Early AV models are more likely to have higher emissions

The study also considered the impact of the much more power-hungry equipment used in early developmental AV systems. For example, early prototypes have been reported to require in excess of 2,000 W of power, mostly for on-board computing. Going from the roughly 200 W assumed for a production system to the 2,000 W of these early prototypes would substantially increase emissions from these early AVs (see table). This is especially true for the less-efficient gasoline-engine vehicles, where the increased electric power requirement would raise emissions by over 60 grams CO2 equivalent per mile. That's equal to reducing the fuel economy of a 35 MPG car to 29 MPG, or like adding the emissions from running 10-25 iMac computers on a gasoline generator for every car. Since early AVs will not be on the road in large enough numbers to take advantage of platooning and connected-vehicle savings, it is very likely that in the near term AVs will contribute higher net emissions than a conventionally driven vehicle using the same fuel.


Emissions from the AV system's electricity use (gCO2eq/mi). The baseline AV system uses a 200 W computer system; the prototype uses a 2,000 W computing system.

AV system size | Baseline, battery electric | Baseline, gasoline | Prototype, battery electric | Prototype, gasoline
small          | 3.0                        | 8.0                | 25.9                        | 70.3
medium         | 3.2                        | 8.6                | 26.1                        | 71.0
large          | 4.3                        | 11.8               | 27.3                        | 74.1


Switching from gasoline to electricity is by far the most important factor in reducing emissions


The choice of fuel (gasoline versus electricity) is the most important factor in reducing emissions. Emissions estimates are based on Ford Focus gasoline and battery-electric models and include 'well-to-wheel' emissions from fuel production, distribution, and use in the vehicle. Emissions related to vehicle or AV system production are not included in this chart.

The most important determinant of a vehicle's direct emissions is not the AV system but the choice between gasoline and electricity. Choosing the electric vehicle instead of the gasoline version in this analysis reduces global warming emissions by 20 to over 80 percent, depending on the emissions from electricity generation. The addition of AV equipment only increases this difference, making it clear that electric drive is required for AVs that maximize emissions reductions.

What will the future hold? Some AV companies, like Waymo (spun off from Google) and Cruise Automation (partnered with General Motors), are using EVs and plan to continue using electric drive in their AVs. Others have been less progressive: Ford, for example, has announced that it anticipates using gasoline-only hybrids for its AVs. If AVs have the transformative effect on mobility and safety that many predict, it will be vital to encourage the use of cleaner electricity instead of gasoline in these future vehicles.


7 Times Scott Pruitt Stole an Idea from the Villains of Captain Planet

Captain Planet and the Planeteers (you can be one, too!) was a staple cartoon of the early nineties that I watched when I was a child. It gave me unrealistic expectations that there was indeed a goddess named Gaia who would one day bestow upon me a magical ring with which I would protect the environment.

Only later would I find out that Gaia was actually a drag queen who would not bestow upon me a magical ring but rather the power of fierceness. A quality that is surely as important in environmental protection.

I hadn’t watched the show for a very long time, until recently, when I was reminded of it by Scott Pruitt—the man in charge of environmental protection in the United States. After re-watching many episodes, it has become clear to me that Pruitt’s actions to undermine scientists, deny climate change science, and increase pollution across our country have clearly been taken from the series’ villains’ playbook.

So, Administrator Pruitt…

It’s time that someone gave credit to the Captain Planet villains. Here are at least seven times that Scott Pruitt has stolen their ideas.

1. Administrator Pruitt’s plan to heat up the Earth’s atmosphere by trapping the sun’s rays under a thick layer of air pollution—um, that was villain Dr. Blight’s idea in the episode Heat Wave, where she literally traps the sun’s rays under a thick layer of toxic smog, Mr. Pruitt.

2. Allowing pollutants into streams and tributaries, which will affect people’s drinking water. You stole that idea from Captain Planet villain Looten Plunder in the episode Don’t Drink the Water in which he leads efforts to pollute the Earth’s water supplies with various contaminants.

3. Remember that time Administrator Pruitt hijacked the Environmental Protection Agency’s websites and deleted and altered a ton of scientifically backed information so the public remains ignorant of it? The Captain Planet villain Verminous Skumm sure does, from the episode Who’s Running the Show, in which he hijacks a national environmental television show and spreads misinformation about environmental protection.

4. When villain Looten Plunder took advantage of a disadvantaged community to continue the use of a toxic pesticide in the episode The Fine Print. Ring a bell, anyone?

Yes, Administrator Pruitt really did that when he decided not to ban the toxic pesticide chlorpyrifos, which affects the neurological development of children (mostly children of color).

5. Assigning conflicted individuals to scientific positions? Clearly, Administrator Pruitt watched the episode Greenhouse Planet that covered this when Dr. Blight (one of the villains) is assigned to be the President’s science advisor and she incorrectly informs him about the harm of greenhouse gas emissions. It’s ironic that in the episode it’s the president’s science advisor with a conflict of interest – our president doesn’t even have an official science advisor.

Any words for that one, Neil?

6. Quashing the work of scientific experts? Yeah, Captain Planet and the Planeteers covered that too, in the episode A Perfect World, where Dr. Blight attempts to trash the work of a scientist.

Even when I try to make one of my posts humorous, it gets sad as all of this terrible adds up. I wish Administrator Pruitt could feel my sad.

7. One more and I promise that’s it. Basically, just the Trump administration destroying the environment and public health protections like when all the villains came together in Mission to Save Earth to create Captain Pollution.

Even though the Trump administration is borrowing its playbook from the villains of a nineties children’s cartoon, we all know what happens at the end of each one of these episodes—Captain Planet becomes a boss and saves the planet.

Unfortunately, Captain Planet isn’t real. That’s why we at the Union of Concerned Scientists are working so hard to ensure that science remains in its rightful place when your government leaders are making important policy decisions that protect (literally) our lives.

If you’re interested in helping, check out how to do that here! Science has afforded us a lot of protections including clean air, clean water, better health care, and the conservation of many of our critically important plants and animals. I’m not about to give that up. This sassy scientist is still in this fight!

Do Local Food Markets Support Profitable Farms and Ranches?

Local produce, sold through direct-to-consumer channels like farmers markets and community supported agriculture programs, is often sold at a price premium. But does that premium impact farmers’ bottom line? Photo: Todd Johnson/ Oklahoma State University.

How many times have you heard that when you shop locally, farmers win? Families shop at farmers markets, school districts procure locally-grown and raised items, and restaurants curate seasonal menus at least in part because they believe they are supporting the economic viability of local producers. But do we have evidence that these local markets actually provide economic benefits to farmers and ranchers?

For the past decade, we have seen growing evidence that household and commercial buyers are willing to pay a premium for local products, and that farmers capture a larger share of the retail dollar through sales at local markets. But until recently, there was little evidence of the impact of these markets on farmers’ and ranchers’ bottom line.

To better understand the potential of local food markets, we evaluated the financial performance of farmers and ranchers selling through local markets compared to those selling through traditional wholesale markets, which may pool undifferentiated grains, animals, or produce from hundreds of producers to sell to large food manufacturers or retailers. We used data from the U.S. Department of Agriculture’s Agricultural Resource Management Survey (ARMS), a nationally representative survey providing annual, national-level data on farm and ranch businesses. ARMS targets about 30,000 farms annually, of which about 1,000 report some local food sales.

For this research, we define local markets in two distinct categories: direct-to-consumer sales (such as farmers’ markets; community supported agriculture, or CSAs; and farm stands) and intermediated sales to local food marketing enterprises that maintain the product’s local identity (such as restaurants, grocery stores, or food hubs).

Local food can spur rural development

The first notable difference between farms and ranches that sell through local food markets and those that do not is that, on average, farms selling through local food markets spend a higher percentage of their total expenditure on labor (8% compared to 5%). Even more interesting is that as local food producers get larger, their share of expenditure on labor increases! (See the green bars in figure 1). This stands in contrast to the ‘efficiency’ story we have long heard in agriculture. Conventional wisdom dictates that as farms scale up, they substitute capital for labor, becoming more efficient and producing more with less. But in the case of local markets, it appears that as the volume of direct and intermediary sales grows, the hours, skills, and expertise needed to manage buyer-responsive supply chains increases, as well. This finding supports the argument that local food can serve as a rural economic development driver; farms selling through local markets require more labor per dollar of sales, thus creating jobs.

Figure 1. Share of Variable Expenses, Local Food Producers, by Scale (Bauman, Thilmany, Jablonski 2018)

 

Do these additional labor expenditures affect the profitability of local producers? To answer this question, we categorized farms and ranches that sell through local markets by size, or sales class: the smallest reporting less than $75,000 in sales, and the biggest reporting $1,000,000 or more. We then broke down each sales class by performance, using return on assets as our performance indicator, and organized farms and ranches into quartiles (see Figure 2). This categorization allowed us to zero in on the highest performing producers of every sales class.

Though performance varies widely, we found that of all producers with more than $75,000 in sales, at least half were break-even or profitable. In every sales class, even the smallest, farms in the top quartile reported returns over 20 percent: very strong profitability for the agricultural sector, where profit margins are generally slim.

What makes a local farm succeed?

To explore patterns in profitability a little bit further, we can compare how various financial measures vary across those with low vs. high profits. Among the top performing quartile, farms and ranches that sell through intermediated channels only or a combination of direct and intermediated channels performed much better than those using direct markets only. This may signal the importance of intermediated markets, and justify support for intermediated market development through grant programs such as the Local Food Promotion Program. Further, using more in-depth statistical analysis of local and regional producers, we found that farms and ranches selling only through direct-to-consumer markets may be struggling to control their costs, and that strategic management changes to these operations could result in significant improvements in profitability.

Figure 2. Local Food Producers’ Return on Assets by Sales Class and Market Channel (quartile 4 is the most profitable) (Bauman, Thilmany, Jablonski 2018)

In summary, we see that local food markets provide opportunities for profitable operations at any scale, but that sales through intermediated markets are correlated with higher profitability when compared to producers that use only direct channels.

To learn more about the economics of local food systems (including more about this research), we encourage you to visit localfoodeconomics.com, where we have compiled a number of fact sheets on this topic. We started this community of practice in conjunction with the U.S. Department of Agriculture’s Agricultural Marketing Service and eXtension. The website and listserv serve as a virtual community in which academic, nonprofit, and policy professionals can engage in conversations about the economic implications of the many activities that fall under the umbrella of local food. For the broader food system community and for consumers, insight into the underlying economics of how food markets work may inform decisions about how to spend food dollars in ways that benefit their communities. We hope to see you there!

Becca B.R. Jablonski is Assistant Professor and Food Systems Extension Economist at Colorado State University.

Dawn Thilmany McFadden is Professor of Agricultural and Resource Economics and Outreach Coordinator at Colorado State University.

Allie Bauman is Research Assistant in the Department of Agricultural and Resource Economics at Colorado State University.

Dave Shideler is Associate Professor of Agricultural Economics at Oklahoma State University.

This research is supported through the U.S. Department of Agriculture’s National Institute of Food and Agriculture (award number 2014-68006-21871).

 

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

EPA Chief Pruitt Even Violates His Own Principles

With Environmental Protection Agency Administrator Scott Pruitt’s job now hanging in the balance, it is a good time to recall that, just after his Senate confirmation, he gave a speech at the Conservative Political Action Conference (CPAC) that emphasized the three principles he said would stand at “the heart of how we do business at the EPA”: process, rule of law, and federalism.

A little more than a year into his tenure, he has violated all of them.

Subverting Process

“Number one,” Pruitt told his CPAC audience, “we’re going to pay attention to process.”

In fact, as we now know, Pruitt has a long track record—going back to his days in Oklahoma—of flouting official procedures when it suits him.

Most troubling is Pruitt’s disdain for EPA policy procedures, which have a considerable impact on public health. Just this week, Pruitt undercut the EPA’s long-established process for drafting strong, protective regulations by proposing that the agency no longer accept studies unless all of their data are publicly available. That would mean the agency would have to ignore most epidemiological studies, which rely on private medical information that cannot and should not be shared.

Polluter-funded members of Congress have tried to pass bills instituting this restriction for years, despite the fact that it would violate the EPA’s obligation to use the best available science to protect public health. Sure enough, emails obtained by the Union of Concerned Scientists show that political appointees, not career staff or scientists, were behind the proposal, and they only considered its potential impact on industry. In response, nearly 1,000 scientists sent a letter to Pruitt asking him to back off.

Pruitt also packed the EPA’s Science Advisory Board (SAB) with industry scientists, overturning four decades of precedent by banning scientists who have received EPA grants from serving on the SAB or any other agency advisory panel. Why? Pruitt claims they have a conflict of interest. Pruitt did not renew terms for a number of respected members and dismissed several independent scientists before their terms were up, shrinking the SAB from 47 to 42 participants and more than doubling the number of its polluter-friendly members.

Likewise, Pruitt clearly has little use for standard EPA administrative procedures. The Government Accountability Office, for example, recently found that he violated federal law by ordering a $43,000 soundproof phone booth. Political appointees, it turns out, have to clear office improvement purchases over $5,000 with Congress. Unlike his predecessors, he has routinely flown first class, and so far it has cost taxpayers more than $150,000. He tripled the size of the administrator’s security team to 19 agents, and according to CNN their annual salaries alone cost at least $2 million. He has a 24-hour-a-day bodyguard. He rented a condo for $50 a night—well below market value—from the wife of an energy lobbyist who met with Pruitt last July and lobbies EPA on behalf of his clients. The list of Pruitt’s ethical infractions goes on and on.

Breaking the Rule of Law

“When rule of law is applied it provides certainty to those that are regulated,” Pruitt explained during that CPAC speech. “Those in industry should know what is expected of them. Those in industry should know how to allocate their resources to comply with the regulations passed by the EPA.”

It’s hard to argue with that. Of course industrial facility owners should be clear about their responsibility to curb emissions. Under Pruitt, however, polluters can be certain about at least one thing: There’s a good chance they won’t be prosecuted. For Pruitt, the rule of law is made to be broken.

In its first year in office, the Trump administration resolved only 48 environmental civil cases, about a third fewer than under President Barack Obama’s first EPA administrator and less than half as many as under President George W. Bush’s over the same time period, according to a February report by the Environmental Integrity Project. The Trump administration recovered just $30 million in penalties from these cases, nearly 60 percent less than the $71 million the Obama administration recovered in its first year.

A December analysis by The New York Times comparing the first nine months of the Trump regime with previous administrations also found a marked decline in enforcement. It determined that the EPA under Pruitt initiated about 1,900 enforcement cases, about a third fewer than during the Obama administration and about a quarter fewer than during the Bush administration over the same time frame.

Meanwhile, Pruitt—who sued the EPA 14 times to block stronger air, water and climate safeguards during his tenure as Oklahoma attorney general—is now trying to roll back environmental protections from the inside. Since taking office, he has moved quickly to delay or weaken a range of Obama-era regulations, including ones that protect the public from toxic pesticides, lead paint and vehicle emissions.

Ironically, Pruitt’s cavalier attitude about following procedures has thus far blunted his wrecking-ball campaign. “In their rush to get things done, they’re failing to dot their ‘I’s and cross their ‘T’s, and they’re starting to stumble over a lot of trip wires,” Richard Lazarus, a Harvard environmental law professor, told The New York Times. “They’re producing a lot of short, poorly crafted rulemakings that are not likely to hold up in court.”

Federalism for all but California

“So process matters, rule of law matters, but let me tell you this: What really matters is federalism,” Pruitt told the CPAC faithful. “We are going to once again pay attention to the states across the country. I believe people in Oklahoma, in Texas, in Indiana, in Ohio, and New York and California, and in all the states across the country, they care about the air they breathe, and they care about the water they drink, and we are going to be partners with these individuals [sic], not adversaries.”

California? He must have forgotten that when he lashed out at the state for embracing stronger vehicle fuel economy standards than what he and the auto industry would prefer. “California is not the arbiter of these issues,” Pruitt said in an interview with Bloomberg TV in mid-March. California sets state limits on carbon emissions, he said, but “that shouldn’t and can’t dictate to the rest of the country what these levels are going to be.”

California, which has a waiver under the 1970 Clean Air Act giving it the right to set its own vehicle emissions standards, reached an agreement with the Obama administration and the auto industry that established the first limits on tailpipe carbon emissions. The next phase of the standards calls for improving the average fuel efficiency of new cars and light trucks to about 50 miles per gallon by 2025 in lab tests, corresponding to a real-world performance of about 36 mpg. By 2030, that would reduce global warming pollution by nearly 4 billion tons, akin to shutting down 140 coal-fired power plants over that time frame.

California wants to stick with the standards. Pruitt, echoing the specious claims of auto industry trade groups, announced in early April that he wants to roll them back. Putting aside the fact that the auto industry’s own analysis concluded that carmakers can meet the 2025 targets primarily with conventional vehicles, what happened to Pruitt’s “cooperative federalism” ideal, especially since California is not acting alone?

Thirteen states, mostly in the Northeast and Northwest, and the District of Columbia have adopted California’s stricter emissions standards. Together they represent at least a third of the U.S. auto market. And in response to Pruitt’s roll-back announcement, 12 state attorneys general and 63 mayors from 26 states released a declaration supporting the stronger standards. “Such standards are particularly appropriate given the serious public impacts of air pollution in our cities and states and the severe impacts of climate change…,” the declaration reads. “If the administration attempts to deny states and cities the basic right to protect their citizens, we will strongly challenge such an effort in court.”

That declaration sounds a lot like what Pruitt endorsed at the conclusion of his CPAC speech, but of course he was referring to state efforts to weaken federal environmental safeguards, not strengthen them. “We are going to restore power back to the people,” he said. “We are going to recognize the regulatory uncertainty and the regulatory state needs to be reined in, we’re going to make sure the states are recognized for the authority they have, and we are going to do the work that’s important to advance freedom and liberty for the future. It’s an exciting time.

“The folks in D.C. have a new attitude,” Pruitt continued. “It’s an attitude that no longer are we going to dictate to those across the country and tell them how to live each and every day. It’s an attitude that says we’re going to empower citizens and the states. It’s an idea of federalism and believing in liberty.”

The CPAC crowd gave him a standing ovation, but the reception he’s now getting from Democrats and Republicans alike is considerably cooler. At this point, Mr. Pruitt may soon find himself out of a job.

Six Things You Should Know About The EPA’s New Science Restriction Draft Policy

Yesterday, the EPA unveiled a long-awaited policy changing how the EPA can use science in its decision making. Ostensibly about “transparency”, the policy will actually restrict the agency’s ability to use the best available science—and thereby weaken protections for public health and the environment. While many of the provisions were known in advance, there are important details worth pointing out now that we know what the policy looks like:

1. It’s clear: This is about attacking soot and smog protections

It has always been evident that current EPA leadership is keen on dismantling the 2015 ozone standard, but the new proposal makes it abundantly clear that the administration is targeting ozone and particulate pollution protections specifically. The language is prescriptive—focused narrowly on the scientific studies that link air pollutants to adverse health effects, like premature death, heart attacks, and respiratory illness. In other words, the administration is restricting use of the very studies that show the need for air pollution protections. (In a break with scientific community nomenclature, the proposal refers to these studies as “dose response studies”.) This directly undermines the EPA’s ability to set air pollution standards at a level that is protective of public health—as the agency is legally required to do under the Clean Air Act. The proposal even calls out the 2015 ozone rule specifically, raising the idea of having the new policy retroactively applied to the rule. To be clear, the real-world implications of this policy are likely to extend far beyond ozone and particulate matter—everything from chemical safety of consumer products to pesticide use to water quality—but the administration has shown its hand when it comes to the motivations behind this policy.

2. The EPA administrator has absolute power

Tucked into the policy proposal is a provision that gives the EPA administrator notable power over how this would be implemented. He or she gets to decide what information counts and what does not. The administrator can “exempt significant regulatory decisions on a case-by-case basis.” This is a loophole big enough to drive a tractor-trailer through (perhaps one that doesn’t meet modern emissions standards?). By allowing the EPA administrator to exempt entire decisions, or just singular studies, the provision lets political appointees pick and choose what science is “acceptable” when making (what should be) science-based policy decisions. It politicizes which evidence can be considered, allowing studies to be excluded for arbitrary reasons and making it easier for the EPA administrator to insert politics into existing science-based processes at the EPA.

3. A lot of talk, not a lot of evidence

The proposal provides little justification for the move. In addition to being a solution in search of a problem, the proposal includes no analysis of how this will affect the EPA’s mission-related work, and no accounting of the costs or benefits of such a move. In a proposed rule for public comment, the agency is supposed to present its analysis of the effect of the rule. Does it expect some types of public protections to be harder to implement? Does it expect certain standards to change? Why? What benefits would the public gain or lose? What are the costs in time and resources to the agency and the public?

This lack of clarity on the implications of the proposal is especially concerning in light of a past Congressional Budget Office analysis that found that the similar HONEST Act would cost upwards of $250 million to implement. It is clear that the current proposal would create a tremendous time and resource burden on the EPA and on scientists outside the agency, who would scramble to (needlessly) track down, collect, compile, and post (including any statistical methods needed to conceal private information) all the data, models, code, etc. for each of the hundreds of studies the EPA cites in any given decision. It is easy to see how this will add administrative red tape, not remove it.

As Senator Tom Carper and his colleagues point out, this policy will likely violate several laws, including the recently updated, bipartisan Toxic Substances Control Act, which requires the use of “best available science” in policymaking. The EPA fails to address this concern in the proposed rule.

4. But how, though? The policy is lacking in details.

The proposal is vague about how all of this would happen. How would the administrator decide which studies would be exempt? Where would the data reside? How would the resources to manage all the data be obtained? Ensuring all this extra information is received, stored, and made publicly accessible would be no small undertaking. The proposal references a system used by the National Institutes of Health to collect health data, but there are barriers to, and questions about, how such a system could apply to the EPA. While the administration asserts that this is about transparency, the NIH system isn’t publicly accessible. There are all kinds of concerns related to disclosing personal health and other data. How will privacy be protected, and privacy laws complied with? Or will this policy simply block the agency from using any study that has relied on protected, private data? It isn’t clear that such processes have been seriously considered by the administration; they’re hoping the public can figure it out for them.

5. Good laboratory practices, bad idea

The policy makes vague reference to “good laboratory practices” (GLP). This sounds, well, good. But it is anything but. This particular standard has long been used to ensure that industry studies are weighted more heavily than independent studies conducted at universities, for example. Because the scientific community has its own ways of ensuring appropriate practices, many academic institutions do not meet the GLP standard. Historically, this has allowed chemicals with adverse health impacts, like BPA, to stay on the market because the EPA could rely only on industry studies that (conveniently) found no health concerns, while academic studies did find them. Applying these particular standards more broadly than they are currently applied could restrict the number of independent health studies the EPA can consider in its decision making.

6. The EPA claims support for its proposal; it doesn’t hold up.

Notably, the EPA’s own officials don’t agree that this is a good idea. In emails released to the Union of Concerned Scientists last week through an open records request, assistant administrator and former chemical industry rep Nancy Beck flagged prior EPA positions on this issue, established in court cases:

While the EPA therefore strives to ensure that data underlying research it relies upon are accessible to the extent possible, it does not believe that it is appropriate to refuse to consider published studies in the absence of underlying data … If the EPA … could not rely on published studies without conducting independent analyses of the raw data underlying them, then much relevant scientific information would become unavailable for use in setting standards to protect public health and the environment.

Externally, people don’t agree with this proposal either. The EPA policy proposal cites Science magazine and reports by the Bipartisan Policy Center and the Administrative Conference of the United States as aligning with its efforts, but that’s not what those sources said.

Yesterday, Jeremy Berg, editor-in-chief of the Science family of journals, pushed back on the claim:

It does not strengthen policies based on scientific evidence to limit the scientific evidence that can inform them; rather, it is paramount that the full suite of relevant science vetted through peer review, which includes ever more rigorous features, inform the landscape of decision making. Excluding studies that do not meet rigid transparency standards will adversely affect decision-making processes.

Today, Rob Meyer of The Atlantic talked with Wendy Wagner, University of Texas School of Law professor and author of both the BPC and ACUS reports. She too challenged the claim that the reports support the EPA’s proposal:

They don’t adopt any of our recommendations, and they go in a direction that’s completely opposite, completely different. They don’t adopt any of the recommendations of any of the sources they cite. I’m not sure why they cited them.

A new source, but still dishonest

Some parts of the legislation on which the EPA proposal is based were so indefensible that the agency didn’t even bother to propose them. Large portions of the HONEST Act suggest that studies should only be considered in rulemaking if they are reproducible—as if we should expose kids repeatedly to lead or mercury to make sure that the initial studies hold up. We will need to ensure that a final rule does not further limit the science the EPA can use through bogus reproducibility arguments.

In the end, this policy will have broad implications for the EPA’s ability to fulfill its mission of protecting public health and the environment. By restricting the science the EPA can use to make decisions, it forces the agency to protect us with its hands tied and blinders on. This doesn’t serve transparency. It doesn’t serve efficiency. And it certainly doesn’t serve the public interest.

Hundreds of Leading Scientists Stand Up for Science Integrity and Plead for Climate Action

Scientists have been justifiably alarmed since the early days of the Trump administration. Many have voiced concerns about the removal of climate change information from websites, the disregard of science on pesticides and air quality, and more. Enter the latest salvo: yesterday, hundreds of members of the National Academy of Sciences (NAS) signed a letter calling for the administration to reverse its decision to withdraw the United States from the Paris Climate Agreement and to restore scientific integrity to decision making.

Here are three reasons why we should pay attention:

  1. The NAS signers are among the top minds in their fields. Elected by their peers as leaders in their respective fields of science, all signers are members of the National Academy of Sciences (NAS). President Lincoln signed the Act of Incorporation for the NAS in 1863 to provide a service to the nation. These scientists provide voluntary service, usually at the request of Congress or agencies in the Executive Branch, and are used to thinking about the implications of their science and providing advice to the U.S. government. In this case, they got together to provide independent advice on matters of grave concern to the fate of our nation and its people.
  2. These leading scientists understand that addressing climate change is urgent. They call on the administration to reverse the decision to withdraw the U.S. from the Paris Climate Agreement. Our intent to withdraw matters, since the U.S. is the world’s second-largest emitter of carbon dioxide from fuel combustion. The United States originally supported the Paris Agreement and committed to “achieve an economy-wide target of reducing its greenhouse gas emissions by 26-28 per cent below its 2005 level in 2025.” The Trump administration has reneged on that promise. As a result, forecasts for emissions trajectories went up, placing in jeopardy the Paris Agreement goal of keeping the global average temperature rise this century well below 2 degrees Celsius above pre-industrial levels. This is a threshold scientists agree we don’t want to cross if we hope to avoid some of the worst consequences for people and other life on this planet.
  3. Considering scientific and technical input is essential for a safe and prosperous nation. In addition to noting that the U.S. is the only nation to have initiated withdrawal from the Paris Agreement, the central core of the statement is a call to restore science-based policy in government. The statement outlines the problem: “The dismissal of scientific evidence in policy formulation has affected wide areas of the social, biological, environmental and physical sciences.” Evidence abounds. Clean air protections have been weakened, likely placing an unequal burden of injustice on those who live closest to sources of hazardous pollutants. Centers for Disease Control staff should not have to restrict the words they use to describe important public health matters, and should feel free to respond to reasonable public requests for information. Experts have been reassigned to positions outside their areas of expertise. The statement also calls for the administration “to appoint qualified personnel to positions requiring scientific expertise.” There is an erosion of confidence in leadership at the highest levels of several agencies. The list is long…

It takes guts to step out and speak up when you see a wrong. Take note, nation: our top scientists are loudly ringing the alarm bells.

Flint, Michigan Still Waiting for Justice Four Years On

Photo: Lance Cheung/USDA

In April of 2014, Flint, Michigan residents noticed that there was something wrong with their water. As UCS Senior Fellow and noted Boston Globe opinion writer Derrick Jackson recounts in his lengthy report, only a month after the city of Flint switched to Flint River water instead of Lake Huron water, community members noticed the difference. Bethany Hazard told reporter Ron Fonger of the Flint Journal that her water was murky and foamy. LeeAnne Walters, recent winner of the Goldman Environmental Prize, noticed rashes on her children. Other residents, too, noticed that the water was discolored, smelled or tasted bad, or that their families were experiencing health problems.

But state and local officials wouldn’t admit the problem or act…for a long time. In fact, it took two years before the state began delivering bottled water to residents, and then only under court order. It took another year for the EPA to provide a grant to upgrade the water system, while a federal judge ordered the state to provide another $97 million. Just this month, the Governor of Michigan, Rick Snyder, announced that delivery of bottled water to residents is at an end. Ask yourself: would you drink the city water now?

An avoidable crisis

It’s easy to ask, why did all this occur? Did the city NEED to switch water supplies? How could the health and safety of so many residents be ignored? Why did it take so long to respond to residents?

The crisis and disaster occurred because of a decision by Flint and state officials to switch from the Detroit water system to the new Karegnondi Water Authority, which was under construction, ostensibly to save money. There was no pressing need to switch systems except the promise of future savings. And since the new system wasn’t completed, and the contract with Detroit wasn’t renewed, a short-term “fix” was chosen: use Flint River water, but without appropriate treatment with anti-corrosion additives.

In other words, it was a business decision, and it didn’t need to happen. The contract with Detroit could have been extended.

But why were so many residents’ concerns ignored? The answer lies in the long history of environmental injustice in our country. Flint is a city with a majority of people of color, many of them living in poverty. Time and again, health and safety impacts on these communities are ignored, as are the voices of the residents. In Flint, community members were the first to report that there was an issue with their water, and they began to organize to provide clean, bottled water well before any state intervention. And it is these most impacted communities who continue to speak up for themselves. But it was and is hard to get anyone to listen. It is easy, and appropriate, to single out the government officials from the city and state, including the governor. But what about the rest of us?

Scientists and reporters—where are you?

The residents knew what they were experiencing, but not necessarily why. Scientists like Dr. Mona Hanna-Attisha, a Flint pediatrician, EPA whistleblower Miguel Del Toral, and Marc Edwards of Virginia Tech did provide critical diagnostic evidence that helped explain what was going on.

But there is a big university community in Michigan and throughout the Upper Midwest, and many other environmental scientists around the country work on water quality. If ever a community needed our help, it was and is Flint.

And, as Derrick Jackson reports, the story was slow to get traction in the media. Ask yourself if that would be the case if the crisis were in a wealthier, whiter community. If you have any doubts, see this report that scientists from the EPA published in February about environmental racism—it’s behind a paywall, but you can read this news article on it, and another study here.

The Flint community is organized and speaking out. They still need and deserve our attention, our support and most of all, they deserve environmental justice.

Ask yourself how you can become a part of the solution; how you can help to bring environmental justice.

Better Data Are Needed to Dismantle Racism in Policing

Photo: Tony Webster/CC BY-SA 2.0 (Flickr)

The institutionalized killing of black and brown people in the United States is not a new phenomenon. The government’s role in the overt harming of black bodies goes as far back as slavery, when patrollers (paid and unpaid) stopped enslaved people in public places, entered their quarters without warrant, and assaulted and harmed them. In the late 19th and early 20th centuries, the government further sustained public devaluation of black lives through tolerance of lynching and by failing to pass anti-lynching legislation.

Today, this institutionalized killing is illustrated by countless racist police shootings—which should be enough to prioritize police brutality on the public policy agenda. However, as we have seen through the (almost complete) failure on the part of the justice system to indict police officers involved in these murders, institutional action is not being taken to address state violence directed at black and brown people.

Dismantling racism in policing, and in its other institutionalized forms, rests in part on better collection and maintenance of data. Nationally representative data on exposure to various dimensions of police brutality can be linked with individual and population health indicators to paint a clearer picture of the impact of police brutality. They can also provide more insight into the causes of racial health inequities and inform the formulation of specific policy interventions. We can and must do better at collecting data.

The effects of historical institutional racial oppression cut across several sectors of contemporary American life: health, criminal justice, civic engagement, education, and the economy. I teach Introduction to Public Health at Lehigh University. When I talk about racial inequities in health, I must frame them in the context of racism. Nor can I talk about contemporary forms of racism, such as police brutality, without implicating slavery and its horrors. Invariably, students ask questions such as “Why did it take the government so long to abolish slavery?” or “Why was it not until 2005, more than a century after lynching began, that the United States Senate apologized for not passing anti-lynching laws?” or “Why were these laws not passed to begin with?” I typically respond by asking if we are a better society now than we were three centuries ago. Responses range from listing the benefits of the civil rights movement to framing mass incarceration and police brutality as the “new” forms of state-sanctioned structural racism.

But our collective response to police brutality should help us answer questions about why lynching lasted as long as it did. Police brutality, which disproportionately targets and kills black and brown people in America, is modern-day lynching. As with lynching, there are perpetrators, unconcerned onlookers, and active resisters. There is also a government that fails to take comprehensive action. In this piece, I aim to focus on what government can do.

The importance of collecting data

Collect data. Data provide necessary evidence for understanding the scope of the problem and for taking informed action. Fortunately, the Bureau of Justice Statistics is leading federal efforts to collect more comprehensive data about arrest-related deaths. While this is a step in the right direction, there are still gaps. For example, no government-led efforts mandate active surveillance and reporting of police-related incidents at the local and state levels, whether these incidents lead to death, physical injury, or disability. Real-time data from non-governmental sources such as The Counted and The Washington Post help fill this gap but indicate a lack of federal commitment to active surveillance of police brutality—a social determinant of health that disproportionately harms communities of color.

Data are the bread and butter of public policy. In addition to understanding the scope of police brutality, data are relevant for assessing its impact on health, the economy, and other sectors. My current research seeks to identify the mechanisms through which police brutality affects health. The lack of nationally representative data is a problem. In the absence of these data, though, I am conducting a qualitative case study to better understand the extent to which stress and poor mental health among Black people residing in the “inner city” might be grounded in experiences or anticipation of police intimidation and violence.

Moving from collecting data to implementing solutions

Collecting data is important. But our government institutions must also take responsibility for their past and current role in state-sanctioned public harming of black and brown bodies. Real action at the local, state, and federal levels is required. One step is mandating active surveillance of police actions that dehumanize individuals, such as stop-and-frisk practices. Another is funding research that seeks to understand the consequences of police brutality. A third is prioritizing and financing programs and interventions that specifically reduce police brutality and that dismantle the racist systems that oppress communities of color more generally.

 

Sirry Alang is an Assistant Professor of Sociology and Health, Medicine and Society at Lehigh University. Her current research explores the connection between structural racism and racial inequities in health.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

A List of Scientific Organizations That Have Supported and Opposed Limiting What Research EPA Can Use to Make Decisions

Photo: Adam Baker/CC BY 2.0 (Flickr)

The EPA today will announce a politically motivated draft policy to restrict the use of science in agency decisions. The draft policy is based on legislation that has repeatedly died in Congress over the past several years.

The mainstream scientific and academic organizations that have opposed new restrictions on EPA’s use of science, in alphabetical order:

The mainstream scientific and academic organizations that have supported new restrictions on EPA’s use of science, in alphabetical order:

¯\_(ツ)_/¯

The former tobacco industry-paid PR men who support new restrictions on EPA’s use of science, in alphabetical order:

California’s Next Climate Change Challenge is Water Whiplash

Today the journal Nature Climate Change published the results of a groundbreaking paper that explores the changing character of precipitation extremes in California. The eye-opening results indicate that while overall precipitation levels will not change significantly in the coming decades, the state has already entered a period of increased extreme precipitation events that will continue to present tremendous challenges to ensuring stable water supplies. My colleague Dr. Geeta Persad, Western States Senior Climate Scientist at UCS, reflects on the meaning of the results below.

***

The hills on the east side of the San Francisco Bay are a lush, perfect green at this time of year. I drink them in greedily during my daily commute to and from work. Especially because, like any Californian, I know that this greenness will leave with the wet season.

As climate change transforms the water landscape for our state, the greenery feels particularly precious. A new paper out of UCLA’s Institute for Environment and Sustainability today suggests that volatility in California’s water resources is only going to get worse. Its findings drove home for me how much smarter we need to get about managing climate change impacts on water in California.

Water whiplash

Climate change is not just a slow and steady trend affecting water conditions in our state. One of the key findings of the paper, led by Dr. Daniel Swain, is that “precipitation whiplash” – a rapid transition from very dry conditions to very wet conditions – is likely to increase with climate change.

Why does this matter? This kind of whiplash is exactly what we’ve experienced over the last few years. We’ve transitioned from the worst drought in California’s recorded history to two wet seasons that produced around $1 billion in flood-related damage and repairs. These projections could mean an increase in extreme wildfire risk and weakened, dried-out soil followed by extreme rainfall and runoff events—the perfect storm of ingredients to produce mudslides like the ones that devastated Montecito earlier this year.

Simulations in this new paper also indicate that, between now and 2060, almost the entire state has at least a 66.6% chance of experiencing a precipitation event like the one that created the 1862 California flood, which transformed the Central Valley into an inland lake. This means that in the next 40 years, our largest urban centers are more likely than not to experience unprecedented flood events that our modern water management systems and infrastructure have never had to deal with.

Climate change means more than a change in averages

This paper is especially important because it quantifies climate change impacts on California precipitation—like precipitation whiplash and extremes—that really matter for water management. As long as we keep planning only for averages, we won’t be managing many of the catastrophic risks that climate change creates for water management in California.

In Swain and his coauthors’ simulations, the tripling of extreme precipitation risk, sharp increases in precipitation whiplash, and uptick in extreme dry seasons happen even while average precipitation barely changes. Plus, their projections show a strong shortening of the wet winter season and expansion of the dry summer season statewide. That could create a need for more water storage to bridge between the wet and dry seasons, even without a change in the total amount of water we get each year.

We have a very long way to go to adequately plan and build a safe and reliable water system for the reality of how climate change will impact our water resources and infrastructure.  Studies like this one and others that have come out over the past several years highlight that we have to fundamentally transform how we think about the role of climate in water management in California. Climate change’s influence on the character of California’s water is complicated, but the more of that complexity we integrate into our decision-making, the more likely we are to develop management strategies that avoid the worst outcomes. Luckily, papers like this one show that we have the science to do so.

***

Geeta’s reflections highlight the need to accelerate and intensify work that has recently begun in California. We are just at the beginning of figuring out how to manage water for changing climatic conditions. Since 2015, UCS has been highlighting the ways our current infrastructure is not built for increased drought and flooding in conjunction with more precipitation falling as rain rather than snow. We have also shown that climate change is underscoring the critical importance of increased groundwater management to meet our needs. The UCLA paper’s findings on the increase in extreme events further underscore the urgent need for change.

Our highly engineered water systems in the western states are built for a seasonal regime that is fast disappearing. They are designed to store melting snowpack in the spring for farms and cities to use in the summer and fall. As the drought of 2012-2016 showed, we can no longer count on enough precipitation and snowpack to store adequate water in dry years. Furthermore, during the last two wet years we found that our systems were in some cases inadequate to deal with extreme events, most dramatically demonstrated by the near-failure of Oroville Dam in February 2017.

Rethinking how we build for a new normal of extremes

The need to change how we build and manage water infrastructure is only one example of how we need to think differently about the built environment now that dramatic changes from a warming world are taking hold. Yet few of the people responsible for how we build roads, dams, canals, bridges, and buildings know what to do with the science that is emerging. That is why UCS sponsored legislation in 2016, AB 2800 (Quirk), to bring scientists and engineers together to come up with recommendations (due to be released late this summer) on how science can better inform our building decisions.

Global warming is going to necessitate a wholesale rethinking of how we approach building and maintaining our communities far into the future. Translating science into a form that can be used by engineers and architects is only one facet of the problem, however. It will require us to rethink everything from updating building codes and standards, to improving coordination between local, state, and federal levels of government, to approaching cost/benefit calculations of projects with climate change factors included, to fixing chronic underinvestment in disadvantaged communities so that they can better deal with future challenges.

A future of “whiplash” weather in California is but a microcosm of, and a warning about, the far more uncertain and hazardous conditions in store as the world gets warmer. We are fortunate that science is improving our ability to forecast these changes, but we need policies and programs to make changes based on what we are learning. Time is very short: this is now a problem of our present, not our future.

Photo: Zack Cunningham / California Department of Water Resources

Here are the “Transparency” Policy Documents the EPA Does Not Want You to See

Photo: US Department of Defense

On April 17th, the Union of Concerned Scientists obtained EPA records through three separate Freedom of Information Act (FOIA) requests demonstrating that a proposed Trojan horse “transparency” policy that would restrict the agency’s ability to use the best available science in decision-making is driven by politics, not science. The records also showed, embarrassingly, that EPA officials were more concerned about the release of industry trade secrets than about sensitive private medical information.

Three days later, EPA officials removed the records from an online portal where anyone could review them.

Today, UCS restored public access by posting almost all of the documents (more than 100 of the 124 responsive records) related to the so-called “secret science” policy.

The documents obtained by UCS provide insight into how political appointees and industry interests, not science, are driving EPA’s pursuit of administratively implementing failed anti-science legislation.

EPA removed the documents after extensive reporting on their contents, including by reporters at POLITICO, The Hill, E&E News, Reuters, and Mother Jones. In each of those stories, which I encourage you to read, journalists highlighted how EPA officials are attempting to administratively implement failed anti-science legislation advanced by House Science Committee Chairman Lamar Smith, to the benefit of the industries the agency is charged with regulating and at the public’s expense.

The documents provided a window into the considerations of many agency officials, and they showed that a policy representing a fundamental shift in the way EPA uses science was driven exclusively by political appointees, not scientists.

The responsive records also included a number of documents on other topics, which we are still reviewing. As a result of EPA’s actions, however, public access to those documents was denied.

My colleague and I spent much of our Friday afternoon trying to figure out why the documents were taken down. We repeatedly reached out to the agency and were informed that the records were removed because of concerns about “privacy information” and “attorney-client communication.” Before posting the documents online, we went through all of the records and removed any that could be considered private in nature (family pictures, for example) or that could represent such privileged communication.

The irony here is not lost on me: EPA is trying to hide records that are critical to understanding the policy development process even as officials develop a policy about “transparency” in the agency’s use of science.

The agency sent a proposed policy to the White House for review on Thursday. This means that a policy to restrict independent science could be announced any day now. These documents are critical to reporting on the motivation for the policy and to evaluating EPA Administrator Scott Pruitt’s claims of improving transparency in policymaking at the agency.

So, in the spirit of the presumption of openness doctrine under FOIA, we believe that it is our responsibility to restore public access to these documents. It is up to us, the public, to watchdog EPA and hold agency officials accountable.

You can find the documents here.

Science on Wheels: Meeting a Scientist Right in Your Hometown

I moved to Columbia, Missouri, home of the University of Missouri (Mizzou), five years ago, and I was impressed with the number of science engagement activities available to the public. Any time of any day of the week there seemed to be something going on: Saturday Morning Science, Science Café on Monday nights, and Science on Tap on Tuesday evenings. An incredible variety of settings to pick and choose from, from auditoriums to cafés to breweries. Topics to satisfy all interests, from chemistry to astronomy to biology. Professors, grad students, undergrads—they were all involved in outreach. I couldn’t believe what a big role science played in the state.

Except it isn’t in the state; it’s confined to the city. And you don’t have to go very far out of it to realize that it is a thin bubble. Drive 30 minutes south of Columbia to Hartsburg, population 101, home of the renowned Pumpkin Festival, and things look quite different. Science is a distant high school memory. There are no outreach programs readily available in town, and no one is driving into Columbia to seek them out. Access is indeed a big challenge in science outreach.

As a land-grant university, the University of Missouri has a mission to serve all the citizens of the state. Those living in college towns already have access to science, whereas those living in rural areas do not. Hence rural communities are the ones where science outreach could be most impactful. Hartsburg is not that far from Columbia, but there are thousands of communities just like it that are over two hours away from the closest city. And after a long work day, chances are you don’t feel like driving two hours to get to a science talk. So for a change I decided to be the one to drive those two hours, to bring the science to people—right where they are.

As humans we distrust things we don’t know, and often people don’t know science. In rural areas there are typically no opportunities to meet scientists. People living there don’t necessarily know what we actually look like or what we do. I set out to change that: to show that science isn’t just something that happens in ivory tower labs—it’s used in everyday life. I decided to focus my outreach program on the relevance of research rather than on the research itself. Every scientific pursuit has the potential to transform our lives, and we need to communicate that clearly.

Part of the problem is that after K-12, science disappears altogether from the picture. Ask most adults about the last time they thought about science—“What do you mean? Like in high school?” is the probable reaction. Adults are often left behind in the science outreach effort. Programs often focus on K-12 (the science pipeline!), but we forget about lifelong learning. That is a glaring omission given that over 3 in 4 US citizens are over the age of 18, and it motivated me to focus on this age group.

Science on Wheels members

Over the past summer I developed a program that would meet the needs I had identified. Science on Wheels travels to rural areas in any county of the state that requests it. Four to six graduate students give a five-minute overview of the relevance of their research to everyday life, and then mingle with the adult audience to chat more about science. So far we have reached seven counties, mainly in the central and southeastern parts of the state.

Our crowds are small: we have had audiences as small as one person and only as large as 30. But we don’t consider that a failure. It takes more time and capacity than hosting an event in Columbia, but the people we are reaching in rural areas are exactly the ones we need to be reaching. They are the ones who are not typically engaged with science.

Here’s an example of our experience: it was 5:50 pm on a Thursday evening last spring. We had driven over an hour to hold an event, but no one had come in yet. Ten minutes from the official start, things weren’t looking up. There was a passerby, and we were quite forward in trying to convince him to join in. He wasn’t having it: science was not his thing, and besides, his wife was expecting him for dinner. Finally, we somehow convinced him, and we ran the program with only him in the audience. It was transformative. Over the space of an evening, he relaxed, started asking questions, and eagerly discussed science. That night we changed someone’s perception of science, and that is most definitely worth our time and effort.

The relationship that adults have with science is often reflected in their voting choices. Nurturing that relationship is therefore key to ensuring that research can thrive in our country. Someone who understands the value of science may be more likely to vote for legislators who do as well. The tangible outcome? Increased science funding, attention to issues such as climate change and the conservation of endangered species, and data-driven policy decisions—for the benefit of society at large.

Where to next? This summer I will work on expanding Science on Wheels at the state level. I plan to involve the other three University of Missouri campuses in order to cover a larger territory and hold more events. A few years down the road, I would like to see other institutions nationwide, especially land-grant universities, take the Science on Wheels model and tailor it to their needs. 90% of Americans can’t name a living scientist. My vision for Science on Wheels is for every resident of the state of Missouri—and one day of the U.S.—to have met one.

 

Arianna is a Ph.D. Candidate in Volcanology at Mizzou. When she is not sampling molten lava in the field, she is making her own lava in the lab by melting rock samples. She is also passionate about science communication and outreach, and never misses an opportunity to chat about her life as a scientist. Find her on Twitter at @AriannaSoldati 

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Internal EPA Emails Confirm that Scott Pruitt’s Secret Science Proposal Is Entirely Driven By Politics

Newly released documents obtained by the Union of Concerned Scientists under three separate Freedom of Information Act requests and first reported on by POLITICO demonstrate that the Trojan horse “secret science” proposal being floated by Environmental Protection Agency (EPA) Administrator Scott Pruitt is entirely driven by politics.

POLITICO writes:

“Since Pruitt announced plans for the new policy last month, researchers and public health proponents have raised alarms that it could restrict the agency’s ability to consider a broad swath of data about the effects of pollution on human health. But documents released under the Freedom of Information Act show that top EPA officials are more worried the new restrictions would prevent the agency from considering industry studies that frequently support their efforts to justify less stringent regulations.”

Limiting the EPA’s ability to use vital public health studies

The documents also confirm that the anti-science chairman of the House Science Committee, Representative Lamar Smith, initiated a conversation with Administrator Pruitt about implementing his long-failed “secret science” legislation through administrative means on January 9, 2018. (See email below).

POLITICO continues:

“But Smith found an ally in Pruitt. The emails indicate that Smith met with Pruitt in early January and show that Pruitt’s staff quickly began working on a directive to “internally implement” the legislation.”

Chairman Smith also previously introduced and argued for legislation to limit the ability of independent scientists who receive agency grants to advise EPA on its decisions. While that legislation never passed Congress, EPA implemented a similar directive last fall.

While EPA has argued on a partisan website that the policy would be about transparency in science-based decisions, the documents obtained by UCS confirm that this is not the case. The resurrection of Chairman Smith’s misguided proposal is nothing but a political attempt to restrict the ability of EPA to use the best available science to fulfill its mission of protecting public health and the environment.

What the documents show

In the documents released by the EPA, no concerns are raised about the policy’s impacts on public health protections, nor is there any suggestion of seeking feedback from the broader scientific community, which has previously slammed this distorted effort.

However, emails between several EPA political appointees, including Nancy Beck, a former staffer for the American Chemistry Council (ACC), the chemical industry’s trade association; and Richard Yamada, a former staffer for House Science Committee Chairman Lamar Smith, show that the small group was grappling with how to incorporate loopholes and exemptions to limit the impact of the directive on industry data. (See below).

The emails also show that the concerns about confidential business information raised by Beck in her current job as Deputy Assistant Administrator of the Office of Chemical Safety and Pollution Prevention (which is in charge of protecting the public from the risks of toxic chemicals) are eerily similar to concerns she raised about data transparency last year before a Senate Homeland Security and Governmental Affairs subcommittee. At the time, she was testifying on behalf of her previous employer, the ACC, an ardent supporter of Chairman Smith’s ill-conceived legislative proposal.

Ultimately, what is crystal clear is that the EPA is still finding ways to abandon the tools that the agency needs to do its job. The proposal, if it is ever released, is not scientifically driven, and is simply a political ploy to undermine EPA’s ability to use independent scientific analysis. You can go through all the documents that were released to UCS here, here, and here.

Email between a staffer for Chairman Smith and an EPA official discussing a meeting between Administrator Pruitt and Chairman Smith in January 2018 about how best to implement “secret science” internally.

Nancy Beck shares ideas with other EPA colleagues on why a company’s scientific studies should be protected as confidential business information and potentially exempted from EPA’s secret science directive. Her comments are remarkably similar to language she used in testimony (below) before a Senate subcommittee while representing the American Chemistry Council just last year.

Competitive Enterprise Institute Counts Costs But Not Benefits of Safeguards—and Hopes You Won’t Notice

A shell game. Photo: emilykbecker/CC BY-NC-ND 2.0 (Flickr)

Today, the Competitive Enterprise Institute (CEI) released another misleading “study” about the “costs” of regulation (read: science-based safeguards and public protections) while virtually ignoring the benefits. They do this every year because some reporters fall for it and because it confirms what some elected officials and editorial boards want to believe. Policymakers and the public would be best served by ignoring the latest edition of this report, which is nothing more than propaganda to promote the rollback of science-based safeguards that protect public health, safety, and the environment.

CEI trots out an outrageous (and bogus) number for the cost of federal regulations. And of course, the report conveniently fails to look at the benefits. Instead, the anti-government think tank exaggerates numbers, takes data out of context, and tries to make the information it presents fit a predetermined anti-regulatory narrative.

CEI’s report is in denial about benefits of science-based safeguards and public protections.

The Washington Post’s fact-checkers found “serious methodological problems” in the 2015 version of the report. The latest version is just more of the same.

The author of the latest CEI analysis suggests that federal regulations cost Americans nearly $2 trillion in 2017. However, the report fails to take into consideration the many quantitative and qualitative benefits of regulations. In the minimal amount of time it does spend on the benefits of public protections, the paper casually dismisses a congressionally mandated draft report recently released by the Office of Management and Budget, which found that the benefits of major federal regulations from 2006 to 2016 were somewhere between $287 billion and $911 billion, while the costs were somewhere between $78 billion and $115 billion.

As the Coalition for Sensible Safeguards described, this distorted valuation of the cost of regulations is as if a couple deciding to have a baby considered only the estimated cost of raising a child from birth to age 18 but failed to consider the priceless benefits of parenthood. Without the context that a thorough conversation about benefits provides, it would be as if ESPN reported on Monday night’s game by saying only ‘Spurs 101,’ without mentioning that the Golden State Warriors scored 116 points.

The author also argues that science-based safeguards are a “hidden tax.” However, it all comes down to who pays for environmental and public health problems: taxpayers, or the companies that create these problems? Responsible businesses comply with commonsense public protections because that makes the most business sense—they don’t want to hurt their customers. If they do, who will consume their product? They understand that regulations help create a fair and predictable playing field for all. And when irresponsible businesses don’t comply with regulations, or when there are no regulations on the books (see Facebook), taxpayers are left with the bill, whether that’s for the cleanup of toxic rivers and dirty air, increased healthcare costs, or figuring out how to handle the release of personal information.

Further, what CEI and other opponents of regulation also fail to acknowledge is that the regulatory process is transparent. It’s not ideal, but it’s not a black box (at least historically). Under laws that govern rulemaking and the development of public protections, public input is frequent, and industry groups engage at every step of the process. And public comments are just that—public.

When developing rules, agencies are required to show their work and justify their conclusions with evidence and the best available science. Rules that are arbitrary and capricious are thrown out by the courts. (Some members of this administration may want to re-read those last couple of lines, because they apply to deregulatory actions as well.)

CEI’s approach to this study is fundamentally flawed. What the report does tell us is that cost-benefit analysis is not the most useful tool for understanding the impacts of science-based safeguards. Cost-benefit analysis consistently fails to adequately account for the benefits of public protections, many of which are unquantifiable. For example, how can we put a monetary value on a clean bill of health, or on a life? It is because of regulations that we have better public health outcomes in the United States (and yet there is much room for improvement, as those better outcomes are not always equitably shared).

What CEI is advocating for through this report would likely mean fewer science-based safeguards and less use of evidence in policymaking. Their approach to regulation is to make all decisions political, an approach that has not worked well when it comes to protecting public health, consumers, worker safety, and the environment.

Photo: emilykbecker/CC BY-NC-ND 2.0 (Flickr)
