UCS Blog - CSD (text only)

On Worker Memorial Day: Remember the Fallen and Continue the Fight for Workplace Safety and Health

Photo: Airman Connor J Marth/US Air Force

When you wake up and roll out of bed this Sunday (April 28, 2019), count your blessings. Sure, maybe it’s just another work day for you; it certainly is for the millions of people who will be working in the stores, restaurants, and gas stations you may visit. Or on the buses, trains, and planes you need to get to your destination. Or in the hospitals, nursing homes, and assisted living facilities caring for your loved ones. Or on the farms tending to the crops and livestock that will come to your table. Or in the power plants that will keep the lights on. Or on the beat and standing ready to respond to fires and other emergencies.

Or maybe for you it’s a day at home, chock full of chores, obligations, and family responsibilities. Or a day to catch up.  Or a day of leisure, fun, relaxation, or celebration.

Worker Memorial Day

Whatever Sunday has in store for you, it’s also Worker Memorial Day. In the US and around the world, it’s the one day each year dedicated to recognizing, honoring, and remembering workers who have suffered or died from work-related injuries and illnesses. It is also the day to renew the fight for workplace safety and health.

Far too many workers still lose their lives, their health, their livelihoods, their relationships, their ability to enjoy life and engage in the routine activities of daily living because of hazards, exposure, and unsafe conditions at work. This seems to be lost on the Trump administration in its fervor to roll back worker and other public health protections.

Data tell (part of) the story

Unless you know someone who has experienced a work injury or been disabled by one, or whose health and life have been upended by a toxic exposure on the job, or who died because protective measures were short-changed or lacking altogether, you may think that occupational illness, injury, and death are rare events. More about the past than the present. But data and statistics tell a different, though incomplete, story.

Fatalities: In 2017 the recorded number of fatal work injuries was 5,147. That’s 14 people dying every day on average. In the United States. Someone’s father, mother, child, sibling, friend, or co-worker who never made it home after work—like these workers in the Houston area or 21-year-old Kevin Hartley, a bathtub refinisher whose life was cut short by exposure to toxic methylene chloride. Fatal falls were at their highest level in the 26-year history of the Census of Fatal Occupational Injuries (CFOI). With everything we know about fall protection, how is this possible? The loss of a loved one because of a preventable incident at work is heartbreaking enough; thinking about the last moments of a fall, asphyxiation or suffocation in a collapsing trench, being crushed between the moving parts of heavy machinery, or a fatal assault by a violent patient, client, or customer is horrifying.

Non-fatal cases: According to the Bureau of Labor Statistics (BLS), private industry employers reported 2.8 million non-fatal workplace injuries and illnesses in 2017, nearly one third of which were serious enough to result in days away from work—the median being eight days. BLS reports another 561,400 non-fatal injuries and illnesses among state and local public sector workers in select industries.

Costs: And then there’s the enormous economic toll that these events exact on workers, their families, and their employers. According to the 2019 Liberty Mutual Workplace Safety Index, the most serious and disabling workplace injuries cost US companies more than $55 billion per year. That’s more than $1 billion per week! But that’s just a drop in the bucket. The National Safety Council estimates the larger economic costs of fatal and non-fatal work injuries in 2017 at $161.5 billion. Lost time estimates are similarly staggering: 104 million production days were lost in 2017 due to work injuries (34 million of them due to injuries sustained in prior years), with an estimated 55 million additional days to be lost in future years due to injuries that occurred in 2017.

And even these costs don’t come close to revealing the true burden, as they do not include the costs of care and losses due to occupational illness and disease. A noteworthy and widely cited 2011 study estimated that in 2007 there were more than 53,000 fatal and nearly 427,000 non-fatal occupational illnesses, with estimated costs of $46 billion and $12 billion, respectively.

Who bears the economic burden of these largely preventable events? Primarily injured workers, their families, and taxpayer-supported safety net programs. Workers’ compensation programs cover only a fraction. See more here, here, and here.

The other part of the story

With numerous studies revealing significant under-reporting of workplace injuries and illnesses (see here, here, here, here, and here), the true burden of occupational injury and illness is far higher. There are so many incentives not to report these events. Workers may fear reprisal, job loss, or retaliation, especially if they are immigrants or undocumented. Employers may seek to avoid inspection, citation, and increased workers’ compensation costs, as well as damage to reputation in the eyes of customers, shareholders, and the community. And as most medical professionals are not trained to recognize or even inquire about a patient’s workplace hazards and exposures, the illness or injury may not even be identified as work-related. The reporting of occupational disease, like cancer caused by a chemical exposure, is particularly fraught.

Worker health and safety in the Trump administration 

The federal government has a critical role to play in protecting the health and safety of our nation’s workforce, yet the Trump administration is taking us backwards (see here, here). Between repeals, delays, rollbacks, conflicted nominations, and executive actions, the president has clearly prioritized industry interests over worker health and safety.

It’s sometimes hard to keep up; below are just a few examples that signal bad news for working people. (The litany of bad news for communities and public health in general is considerably longer, but I leave that for another time – or you can see some of it here.)

Counting blessings

As sobering as this story is, there is a silver lining. Workplace health and safety in the US has improved markedly since passage of the Occupational Safety and Health Act in 1970, thanks to regulatory safeguards and the enduring efforts of labor unions, working people, and other advocates.

Kudos also to the dedicated staff in our federal and state agencies who work tirelessly (often against all odds) to establish and enforce workplace safeguards, to the researchers who study the causes and prevention of occupational disease and injury, and finally to those employers who value their workers as more than replaceable commodities and understand that cutting corners on safety is bad for business. Indeed, smart employers understand that attention to workplace safety and health is the hallmark of operational excellence and actually pays.

So, on Sunday, as I reflect on the past, present, and future of our nation’s precious workforce, I will be counting my own blessings and renewing my commitment to continue fighting this administration’s anti-regulatory fervor that takes us backwards on worker health and safety.

Four New (Old) Ways the White House is Trying to Restrict Science for Policymaking

Cartoon: Justin Bilicki

Yesterday, the Office of Management and Budget (OMB) in the White House issued new “guidance” for the Administration to “Improve Implementation of the Information Quality Act.” Unfortunately, it reads like a re-hashing of some of the worst ideas for restricting the use of science in policymaking from the last five years or so. Way back in 2015, when some members of Congress were trying some of these same tricks to tip the scales in favor of regulated industry, we summarized them in a Policy Forum article in Science. Here we go again—but this time, the Trump administration is trying to push these changes through unilaterally, the latest round in a long list of efforts to push science to the sidelines.

False transparency

For literally years now, my colleagues and I have been writing about proposals that supposedly are designed to increase the transparency of the regulatory process by requiring that all data and other information be made public. It sounds like a good idea, but it isn’t, because there are many justifiable reasons that some information used in a scientific study, such as personal medical information, must be kept private. In effect, requiring that all data and information be made public greatly restricts the scientific studies that agencies can rely on, because they can’t and shouldn’t make confidential information public. For example, if a study of the health impacts of a pollutant relies on health outcomes in a certain city or town, that health information for individuals may not be released publicly, but the study might be critical for understanding the population-level effects of the pollutant.

OMB takes another shot at this issue, but the results are murky. To be fair, they go to great lengths to assert the importance of privacy and to require agencies to protect privacy. But here’s the rub. If an agency does as required and protects privacy, can it use the analysis based on that private data in making decisions? The directive is unclear on this point, but it looks like it can’t, or is strictly limited in doing so. And if that is the case, then, for example, the EPA, which is charged with protecting public health, may be restricted from using public health studies in implementing rules – and that makes no sense at all.

Déjà vu all over again

The directive has a long section on reproducibility of scientific results. And once again this idea is based on an oversimplification of how science works. Maybe we all learned that doing an experiment in a lab many times over can give you confidence in the results and that is the “scientific method.” Made sense in grade school. But lots and lots of critical scientific information and even analyses are not “reproducible” in this sense. Take, for example, the impact of a toxic pollutant on a local community. Should we release it again to see if it is really harmful? Or the study of a natural disaster? Should we wait for it to happen again to reproduce the results? The Environmental Data and Governance Initiative illustrated the many real-world examples of scientific studies that are neither feasible nor ethical to reproduce.

The directive requires that important studies include sufficient descriptions of the data and analyses to allow them to be “reproduced by qualified third parties” to test them. Now, I am all for detailed descriptions of data and methods being made clear and public, but for most studies, that wouldn’t allow the reproduction of the analyses except by statistical simulation. And who are the third parties? Regulated industry. And the directive requires that all computer code be released, even though it may be proprietary. So if you can’t release the data because of privacy concerns and a third party can’t “reproduce the study” because they need the raw data, what happens? Does that mean you can’t use a critical analysis of health impacts because of these barriers? The directive doesn’t say.

Correction or delay?

Finally, the directive requires that agencies allow the “public” to challenge data and technical information. Agencies then must respond in writing and have independent experts review their response before it is finalized. And then the “public” can challenge the response, and around the circle we go again.

Why did I put “public” in quotes? Because essentially this is designed as an avenue for regulated industry to challenge and delay regulatory actions once again. Don’t get me wrong, the public should have the right to comment and raise issues on agency proposals, including the science basis for a decision. And an agency should be required to address those concerns in writing, as it is currently required to do under the Administrative Procedure Act and other legal requirements. That has been the case for a long time. But this directive takes the process to a whole new level of obfuscation. Essentially it allows a near endless and expensive round of challenges to every detail of the information an agency uses, including how well the agency complied with this directive itself, until everyone is satisfied – i.e. never. And who has the resources to make all these challenges? Really only the regulated industry. This bad old idea, dressed up as new, is an echo of the unlamented “Regulatory Accountability Act” proposed in earlier Congresses, which would have put a huge number of barriers in the way of agency rulemaking, stuffing the process with ever more bureaucracy for agency decision-making.

No consultation, no evaluation

This memo was issued yesterday without warning, even to those of us who follow science policy closely. It broadly affects the way federal agencies will use science in the regulatory process, changes the interpretation of existing guidance, and inserts a whole new set of requirements. So, you would think a memo of such importance would be accompanied by a careful rationale for what problems are being addressed and how it should be implemented, including the costs of doing so. You would be wrong. There is no such analysis. Other than saying decision-making depends on high-quality information (I agree), the directive doesn’t say how the requirements it contains will address current problems that agencies or the public face with the information that is the basis of regulation. Unfortunately, we have seen that time and again in the Trump Administration – new rules proposed with virtually no coherent justification or analysis of their impacts.

In the case of this directive, even though it doesn’t estimate the costs of implementation as it should, we do have some idea from a GAO report on one of the proposals it contains, on making data publicly available. For one agency alone (EPA), the GAO estimated that the cost would be in the hundreds of millions of dollars. Where is that money and the time of agency staff to come from? You guessed it: they will be diverted from doing the science that is the basis for public health, safety, and environmental protections.

Overall, OMB’s “new” directive on information quality isn’t new, but it is directive. It resurfaces bad ideas that were stopped because, you guessed it, they were bad ideas. But now, even though this memo was issued without any public input or analysis, it can direct agency action across the government. The Trump Administration has gone to great lengths to sideline science in decision-making. Congress needs to step in and tell the administration this is not the intent of the law and doesn’t serve the public. It benefits only big regulated industries, to the detriment of the rest of us.

From Profound Conflicts of Interest to a Blind Eye for Harassment, Barry Myers Is the Wrong Choice for NOAA

Photo: C-SPAN

When the founder and CEO of AccuWeather was nominated by President Trump to lead the National Oceanic and Atmospheric Administration (NOAA), it immediately raised serious concerns from me and many others. After all, Myers is not a scientist but would be leading a major science agency. His business, AccuWeather, is essentially built around the re-processing, re-packaging, and marketing of the weather data developed and routinely produced at public expense by NOAA. Therefore, much of the work of NOAA in scientific research, data collection, and forecasting of weather, climate, and severe storms directly impacts AccuWeather, which relies on those agency efforts for its business. That is the very definition of a conflict of interest for the putative NOAA administrator.

I worked at NOAA for 10 years as a scientist and a senior manager. It is a great agency that does outstanding science and provides vital services for the nation—weather forecasts, severe storm warnings, tsunami warnings, oceanography, climate science, charting, coastal management and marine resource management including fisheries, marine mammals, and endangered species and habitat. NOAA’s work on behalf of the public is deeply science-based.  Many companies utilize NOAA data (not just weather data), which makes the agency’s work critical to the nation’s economy as well as public health and safety.

Myers was first nominated way back in October 2017, and it has apparently taken nearly two years for him to come up with a way to try to address his conflicts of interest. His solution? Sell his shares in the family company to other family members at a reduced price, with a provision to buy them back when he leaves government. In other words, Myers and the White House believe that he can manage a large federal agency that is the basis of his family business by pretending that he doesn’t care about what his brothers, wife, and other relatives earn. And then at the end of his tenure at NOAA he can buy back the shares, again at a reduced price, and share in the earnings. Somehow in his view that’s not a conflict of interest? Right.

I for one am NOT comfortable that Myers will make decisions as the head of NOAA that will solely benefit the public and not his business and his family.  How about you?

As if that weren’t enough to halt all further consideration of Mr. Myers to lead NOAA, it gets worse. Recent reporting has revealed that even before the nomination went to the Senate back in 2017, the Department of Labor opened an investigation into allegations of “widespread sexual harassment” at AccuWeather. The reporting also revealed that Myers’ company was aware of the harassment while he was CEO and took no action for a long period of time. The report of that investigation was available to the administration more than a year ago, even though it was only revealed to the public by the press this month. Still, the White House re-nominated Myers in January of this year.

So, let’s review. Barry Myers is not a scientist. He would have deep and unresolved conflicts of interest if he were to lead NOAA. He is touted as an experienced manager as his primary qualification, but as a manager he led a company that fostered a culture of sexual harassment and workplace hostility.

Myers’ nomination is now approaching a vote on the Senate floor to confirm him for the position. NOAA and the nation don’t deserve an unqualified and conflicted nominee that turns a blind eye to sexual harassment. Under no circumstances should he be confirmed. Tell your senators: Myers? NO for NOAA!

 


It’s Earth Day and these 3 Unique (but Endangered) Species are Giving Me LIFE!

Photo: NASA

It’s Earth Day, and this year’s focus is to protect our species. That focus makes me incredibly happy for three reasons: 1) I get to return to my roots as an ecologist and tell you about some super cool species, 2) there are lots of endangered species that don’t receive a ton of attention BUT need attention, 3) this post is not about the Trump administration doing terrible stuff to science (although they haven’t exactly been great to endangered species; you can read about that here, here, and here).

Species #1 – The Ohlone Tiger Beetle

The Ohlone tiger beetle, probably waiting to chase some prey.

Some people don’t like insects, but they tap into my awesome nerdy side. As an undergraduate student, I took an entomology class and most of our labs were spent outside catching insects, which was so much fun! But about this cool beetle…

The Ohlone tiger beetle only emerges on land for about 2 months, and it spends that little time mostly hunting. It lurks in the shadows of trails that have been created by cattle and hikers until an unsuspecting passerby comes along and then, BOOM! Dinnertime. The beetle also has been observed chasing its prey in flight. And the larvae of these beetles are no different – the grubs will literally flip backwards to catch prey. Maybe it’s the little kid that is still inside of me, but I really want to see this beetle in action. I imagine if I did, I’d be all like “Wow, bro. That’s sooo cool.” Also, can we just take a minute to appreciate how gorgeous this beetle is?

Unfortunately, the beetle’s population is critically endangered due to loss of habitat to urban development and the impacts of toxic insecticides that come from urban runoff. The species is endemic to California.

Species #2 – The Mississippi Gopher Frog

The Dusky Gopher Frog, once known as the Mississippi Gopher Frog, has an average length of about three inches and a stocky body; its back ranges in color from black to brown or gray and is covered with dark spots and warts.

Who doesn’t love a little frog that’s covered in spots? Or one whose mating call reminds you of the snoring of your significant other (how endearing)? Quite the opposite of the tiger beetle, this critter is not ferocious – this gopher frog places its hands over its little eyes when threatened. I can vouch that this mode of defense is effective, especially when watching horror films.

While this frog used to hop around the Gulf Coastal Plain in Louisiana, Mississippi, and Alabama, a small population of about 200 frogs is all that is left in Mississippi. The species owes its most recent population bump to conservation efforts by US Fish and Wildlife Service (FWS) scientists. These scientists would like to expand their conservation efforts to Louisiana where the frog once lived, but setting aside critical habitat for the species in that state has proved difficult. The decision on whether or not FWS will be able to expand conservation efforts to Louisiana is currently tied up in the courts.

Species #3 – The Southern Bluefin Tuna

If there were a Guinness Book of World Records for fish, the southern bluefin tuna would be highlighted a lot. The bluefin are the largest tuna species and can live up to 40 years. They also can swim to depths of 2,500 meters (that’s about the length of 27.5 football fields). In fact, they can swim to a depth of 1,000 meters (the length of 11 football fields) in about 3 minutes. “But, Jacob, that’s crazy – the change in temperature from the surface of the water to 1,000 meters would be deadly!” You’d be correct for many species, but bluefin tuna are capable of elevating their body temperature up to 20°C above that of the surrounding water. Researchers also have found that adrenaline produced from a bluefin tuna’s quick and deep dive helps regulate the beating of its heart. Human hearts could not withstand such a temperature drop – our hearts would fail.

This tuna species is listed in the Guinness Book of World Records once – for being the most expensive single fish sold at a fish market at $3.1 million. Bluefin tuna are prized for their taste and used as sushi and sashimi. The species has been overfished to the point that 85% of its spawning population was lost between 1973 and 2009. The population is still decreasing.

Protect our species

I must admit that I’ve never seen any of these species in the wild, but I’d like to someday. Can you imagine seeing a little green beetle so ferocious that it tries to attack your giant foot along a trail, mistaking a frog’s mating call for the snoring of your tent mate, or seeing a school of bluefin tuna dive thousands of meters below the water surface in a matter of minutes?

While all these species are unique in some way, they all have another commonality: they are critically endangered because of humans. And once a species is gone, we cannot bring it back – we cannot bring back the benefits they bring to our ecosystems, the resources they provide to us, or the joyful experiences they may bring to our lives. Thankfully, scientists and conservationists are working around the clock to help these species populations bounce back. Take a minute on this Earth Day to learn about what you can do to protect our species, and maybe learn a fact or two about an endangered species in your very own backyard.

In the meantime, I’ll be listening to more audio clips of gopher frogs.

Photo: USFWS (Western Carolina University photo/ John A. Tupy)

Legislation to Modernize the California Public Records Act Improves, Advances

Photo: Asilvero/CC BY-SA 3.0 (Wikimedia)

UCS-supported legislation to modernize the California Public Records Act (CPRA) advanced through the California Assembly Judiciary Committee earlier this month, and will soon be heard by Assembly Appropriations. Assembly Bill 700 is intended to preserve the ability of researchers at public universities to pursue highly policy-relevant research without being harassed and attacked by companies and activists who are threatened by their work. The legislation has sparked spirited, productive, highly interesting conversations about how to protect researchers while also allowing for full accountability for public institutions and their staff.

As the legislative process plays out, there has been some confusion about the bill and misrepresentation of its intended scope. To provide clarity about the bill’s intent we created this Frequently Asked Questions document. In the paragraphs below, I reproduce some of the main questions in the document after putting them in the context of our current efforts.

Attacks on research have demonstrable harm

Commercial and industry interests and individuals across the political spectrum are increasingly using broad public records requests to disrupt research, attack and harass scientists, and chill professional discussion and debate. This can squelch, and has squelched, inquiry, discovery, and innovative research that would benefit the public. In California alone, tax preparation companies, the gun lobby, and the chemical industry have all used public records requests to undermine policy-relevant research. And attacks don’t need to be huge in number: an attack on one researcher can discourage inquiry in an entire field. I and others have written about this extensively, and you can find numerous examples in UCS’s Freedom to Bully report and the “Open Records, Shuttered Labs” UCLA Law Review article by Claudia Polsky.

Tobacco companies, for example, have used open records requests to gain access to records of scientists studying the impact of cigarette marketing on children and adolescents. There are harms to individual researchers, who suffer harassment, high legal and processing costs, and diversion away from their primary work. Some researchers have even left academic research positions altogether or moved to private universities. Yet the broader, more significant harm is to the public who no longer benefits from the results of important research when researchers abandon the field or stop investigating.

Another example: the California Rifle and Pistol Association Foundation sought all correspondence related to an environmental toxicologist’s work. The toxicologist found that lead in ammunition was poisoning endangered California condors, evidence that led the state to restrict lead ammunition. The requests took a toll on the researcher’s ability to pursue funding, put his unpublished data at risk of being disclosed and scooped by others, undermined his collaborations with colleagues, and discouraged graduate students from working with him.

Scientists who volunteered their time and helped plug the hole in the ocean during the Deepwater Horizon oil disaster expressed significant concerns about the public disclosure of deliberative scientific materials, whether it be through open records requests or subpoenas.  “Our concern is not simply invasion of privacy, but the erosion of the scientific deliberative process. Deliberation is an integral part of the scientific method that has existed for more than 2,000 years; e-mail is the 21st century medium by which these deliberations now often occur,” wrote the scientists.

“There remains inadequate legislation and legal precedent to shield researchers and institutions…from having to surrender pre-publication materials,” added their institution, the Woods Hole Oceanographic Institution.

An opportunity to modernize the CPRA

So how did this legislation come together? Last year, University of California Berkeley law professor and public interest advocate Claudia Polsky published a law review article on the increasing weaponization of open records laws. The New York Times wrote about the issue, profiling a UC-Davis tax policy expert who was attacked by the tax preparation industry after he spoke out against efforts to prevent the U.S. government from providing free tax filing services.

Soon thereafter, UCS began talking with Assemblymember Laura Friedman’s office about what legislation to address this problem might look like, and agreed we would work together, with input from numerous stakeholders, to craft a legislative solution. In short, our goal is to protect researchers’ scientific work while preserving public access to documents that could demonstrate sexual harassment, research misconduct, funder influence, misuse of funds, illegal activity, or any other conduct or business that is not explicitly part of the deliberative aspects of the research process. Where and how to draw that line is still up for discussion.

At the Judiciary Committee hearing, some committee members cautioned that without a careful approach, attempts to exempt academic materials could go too far and prevent access to information that rightfully should be in the public domain. For example, there was agreement that we would not want legislation to inadvertently make it more difficult to find out when human or animal study participants are mistreated. Thankfully, a majority of the committee recognized the significant risk to public university research from abusive CPRA requests and expressed confidence that a narrowly crafted bill could meet our twin goals of protecting transparency and quality research.

Seeking input to improve the bill

UCS is a strong proponent of transparency and accountability and has actively encouraged and reached out to other California and national pro-transparency organizations to weigh in on the legislation as it develops. We have met with the California Newspaper Publishers Association, the American Civil Liberties Union, the Electronic Frontier Foundation, the Reporters Committee for Freedom of the Press, and several other organizations to discuss how to narrowly tailor an exemption that protects the research process while allowing for discovery of misconduct or any improper influence on that process.

Every one of those meetings has been helpful in understanding the ways that the CPRA has helped uncover misbehavior at public universities, and in crafting language that would allow this kind of accountability to continue. While the organizations named above are not yet supportive of the legislation, I am hopeful that with the right amendments, they will be.

To me, it is encouraging to see thoughtful, reasoned dialogue about how we solve this problem. This is a conversation that has long been needed, and I’m gratified to see it happening in California. The more voices, the better. To that end, UCS welcomes input from a diversity of perspectives, including from those who support the legislation in its current form and from those who think it needs to be improved.

What should continue to be public

UCS believes it is critical to maintain public access to the vast majority of documents. The CPRA must continue to enable oversight of California’s public universities, including their funding, administration, and independence. Public access is crucial to understand whether university investigations into research misconduct and discrimination and harassment complaints are adequate. It’s also important to understand the potentially biasing influence of funders, not only on the research process but also on how academics use their research to communicate with the public or influence policy.

Toward this end, AB 700 makes explicitly clear that information regarding research funding agreements, communications among funders and researchers and other university staff, records related to governance or institutional audits of compliance, records related to disciplinary action taken against researchers, records that could demonstrate harassment or other improper behavior, or records not explicitly related to the research process will not be exempt.

What we do want to exempt from disclosure

AB 700 would protect a limited and narrow set of documents where more privacy encourages cutting edge research with little cost to public understanding of that research. This includes unpublished data, unfunded grant applications, and information that could compromise the privacy of research study participants. It also includes communications among researchers during the research process, to preserve their ability to have frank conversations with their peers about the research itself. Access to a researcher’s email correspondence with academic peers, used to test out and refine ideas, does little to help the public better understand the research itself.

Instead, it can and does chill conversation and discourage scientists from fully criticizing the work of others and from pursuing research questions they know will open them up to attacks from powerful interests. Public universities and professors are rightfully subject to open records, but that right should not be absolute. We all deserve the freedom to test new ideas, even and especially when they are contentious. Regardless of your line of work, can you imagine if every email you wrote, every comment you made, or every honest criticism of a colleague’s work was placed in the public domain?

It may be challenging to craft bill language that advances accountability while protecting researchers’ ability to pursue public interest research, but I’m confident that we can meet these twin goals.

Air Pollution Should be Monitored Using the Best Available Science: Meh, Says the EPA

Air pollution causes serious harm to our society – from coughing, to smog in the air, to a visit to the emergency room. And the only way to mitigate the threat of air pollution is to use the best available science and technology to measure it accurately. The Environmental Protection Agency (EPA) appears to disagree. The agency has quietly finalized a rule that ignores its mission to protect human health and the environment by instead focusing on saving industry money.

The change modifies a 21-year-old rule, called the Nitrogen Oxides State Implementation Plan Call, which was designed to curb emissions of nitrogen oxides (NOx) from industrial facilities in 20 states, mostly on the US East Coast, and the District of Columbia. Power plants, as well as large steel, aluminum, and paper manufacturers in these states, will now have the option to pick “alternate forms of monitoring” for NOx – none of which are specified – instead of the current standard of “continuous emission monitoring systems,” or CEMS. The EPA has justified modifying this rule as a potential cost-saving measure for industry.

CEMS are considered the “gold standard” for source monitoring and are just what they sound like: technology attached to industrial exhaust stacks that continuously measures air pollution levels. CEMS have proven to be incredibly effective at monitoring NOx pollution for the EPA’s Acid Rain Program (producing highly accurate data 95% of the time). As a result of this scientific evidence, CEMS were codified into regulation as a required monitoring tool for all large electrical and steam-producing industrial sources in 20 states and DC.

The new rule is designed to collect less high-quality data

In the rule’s text, the EPA admits that ditching the requirement for CEMS might allow these facilities “to perform less extensive data reporting or less comprehensive quality-assurance testing,” and that “monitoring approaches may be expected to provide less detailed monitoring data and require less rigorous quality assurance” (emphasis added). The rule could discontinue continuous monitoring of a dangerous pollutant in hundreds of the nation’s largest industrial facilities, leading to data gaps that could significantly challenge the ability to curb NOx emissions.

This is extremely problematic. Measuring air pollution gives us a warning when things go bad, like a canary in a coal mine. If we don’t measure air pollution using the best science available, how can we have enough high-quality information to protect the health and safety of communities living near these facilities, communities that are already at risk of breathing in high levels of toxic air?

Nitrogen oxide emissions need to be curbed, and the previous system was working really well

NOx is a family of poisonous gases that can cause you to cough and wheeze, sometimes badly enough to require a visit to the emergency room (especially if you have asthma). This pollutant has a bad habit of combining with other substances to form smog and acid rain. New research has even suggested that NOx is a likely cause of asthma and a risk factor for the development of lung cancer, low birth weight in newborns, and an early death. And the risks are heightened for asthmatics, children, and the elderly.

Fossil-fuel electric utilities are one of the primary sources of NOx pollution and, thanks in part to state and federal regulation, NOx pollution levels from these sources have dropped by 82% over a 20-year period, according to EPA data (1997 to 2017).

Where did this rule come from?

Impetus for the 2019 rule likely originated from the Association of Air Pollution Control Agencies, a shadow group of state air regulators that don’t recognize the federal government’s authority – upheld by the Supreme Court – to regulate greenhouse gases. In a 2017 filing, the group described the continuous monitoring requirement as “overly burdensome” and costly to businesses outside the power sector.

Clint Woods, who was the association’s executive director at the time, is now deputy chief of the EPA’s Office of Air and Radiation. Woods has been previously implicated in suppressing a study detailing the cancer risks from formaldehyde, a clear attack on science and public health. Woods has also influenced the selection of the EPA’s Clean Air Scientific Advisory Committee (CASAC), which is now staffed with individuals who lack the expertise to provide adequate scientific advice.

Part of a pattern of sidelining science in Trump’s EPA

This is not the first time that the EPA, under the Trump administration, has sidelined air pollution science. Industrial facilities now have the opportunity to use less stringent control mechanisms for hazardous air pollutants, like mercury and benzene. A scientific advisory board of experts that once provided valuable information on particulate matter air pollution has now been dissolved. And establishing an air pollution standard may soon require economic considerations instead of being solely focused on improving public health – again, an abdication of EPA’s mission to protect human health.

One of the most important benefits that science can bring to the federal arena is to ground policy in an evidence-based approach. Air pollution policy fundamentally depends on obtaining high-quality, accurate data in order to make any significant health improvements in our communities. By disrupting the collection of high-quality data on a dangerous pollutant, the EPA disregards best scientific practices, decreases its own ability to properly monitor NOx pollution from industrial sources, and undermines its mission to protect public health. If we don’t have an accurate measure of how much NOx pollution is escaping from these facilities, how on earth are we supposed to stop it from causing real harm to our people and our environment?


Administrator Wheeler is Hiding the Truth About Formaldehyde

Photo: Mike Mozart/Flickr

In a letter sent this week, the Union of Concerned Scientists, along with the Environmental Defense Fund, Natural Resources Defense Council, and Environmental Protection Network, asked EPA’s Scientific Integrity office to investigate what appears to be political interference in the agency’s recent suspension of the Integrated Risk Information System (IRIS) formaldehyde risk assessment. In his responses to senators’ questions about the assessment earlier this year, Wheeler claimed that “Formaldehyde was not identified as a top priority.” Political appointees at the agency gave the same answer when asked by the GAO for a recent report. But in documents obtained through a FOIA request, the Union of Concerned Scientists found evidence that EPA staff were not only interested in the formaldehyde risk assessment but that, as of 2017, the air office had a “strong interest in the review and are anxious to see it completed” and told EPA’s acting science advisor, Jennifer Orme-Zavaleta, that “we have consistently identified formaldehyde as a priority.” Thus, the glaring omission of formaldehyde from the EPA’s list of prioritized chemicals issued this month smells more like political interference than lack of importance to me.

In a 2017 email obtained through a FOIA request by the Union of Concerned Scientists, an EPA staffer from the air office writes that formaldehyde is a priority for the program.

What happened to the formaldehyde assessment?

We know that the formaldehyde assessment was done and ready for review in the fall of 2017, and then all movement of the draft mysteriously stopped. In July 2017, the head of the National Center for Environmental Assessment (NCEA), which houses IRIS, Dr. Tina Bahadori, wrote to the former Office of Research and Development (ORD) head, Richard Yamada, and other ORD staff about a briefing that would occur that month on formaldehyde, and to inform them that “we have contracted the [National Academy of Sciences] to peer review this assessment. As a part of that agreement, we have requested that they convene a PUBLIC workshop in which they also gather data on NCEA/IRIS activities to be responsive to the NRC 2011 and 2014 recommendations to improve IRIS assessments.” The latter part of this agreement occurred in February 2018. But if the NAS was contracted to peer review the assessment, and the assessment was ready and the path was set, why did the EPA fail to follow through with moving it to the next stage of the process?

Wheeler wrote this year that the EPA’s air and chemicals offices didn’t provide a list of priorities to him when he asked in 2018. But a non-response doesn’t mean the air office is lacking in priorities. A responsible administrator would follow up with the program office, pointing out the many hazardous air pollutants that have outdated risk values, including formaldehyde. Unless, of course, the administrator was actively working to keep formaldehyde off of the priority list to placate the chemical industry.

After all, the facts haven’t changed—formaldehyde is just as dangerous today as it was a year ago. It seems that political appointees at the EPA are playing a game of defeat-by-delay—willfully remaining ignorant of the facts by simply declining to listen to the scientific opinions of their own staff experts.

Unfortunately for these appointees, they have a job to do—protect public health, based on the best available science. And they can’t evade their duties by pretending science doesn’t exist. That’s why we have scientific integrity policies in place—to make sure political interests don’t overrule the clear facts and the public good.

Emails show experts’ concern—and political leaders’ indifference

In the aforementioned email to Jennifer Orme-Zavaleta, Erika Sasser, director of the Health and Environmental Impacts Division at the Office of Air Quality Planning and Standards (OAQPS), lays out how an updated risk assessment for formaldehyde would help the air office better protect public health. According to her, “having a current cancer unit risk estimate for formaldehyde is critical for the agency’s air toxics program, for use in 1) the National Air Toxics Assessment (NATA), 2) the Clean Air Act (CAA) section 112 risk and technology review (RTR) rulemakings, 3) evaluation of potential risks from on-road and nonroad mobile sources regulated under relevant sections of the CAA, and 4) regional and local-scale risk assessments.” Formaldehyde is not just an incidental air pollutant. Sasser wrote that “more than 1.3 million tons of formaldehyde are emitted each year. While these emissions are from both natural sources and from stationary and mobile anthropogenic sources, the [National Emissions Inventory] estimates that 42,000 industrial facilities emit formaldehyde. The National Air Toxics Assessments (NATA) shows that the entire US population is exposed to formaldehyde.” Sasser’s email was seen by political appointees at the agency; Bahadori forwarded it to ORD’s Yamada.

In other documents we received from the EPA, it is clear that Dr. Bahadori spent months trying to draw Yamada’s attention to the formaldehyde assessment and its release. In September, the American Chemistry Council wrote a letter to IRIS related to its draft formaldehyde assessment. NCEA’s Bahadori wrote back to the American Chemistry Council in October 2017, saying “we hope to complete the draft of this assessment as expeditiously as possible and make it available for public comment and peer review by the National Academy of Sciences (NAS)” and “the only way to demonstrate our commitment to a scientifically robust and transparent formaldehyde assessment is to present the document for public comment and rigorous peer review by the NAS.”

On December 7, 2017, Bahadori wrote to Orme-Zavaleta and Yamada, “Just checking to see if you have an update on path forward for formaldehyde?” She followed up on December 20, 2017: “I wanted to follow up on the path forward for formaldehyde.” After getting a non-committal response from Orme-Zavaleta, Bahadori followed up again on January 2, 2018: “I wanted to follow up and see what the timeline for next steps might be for formaldehyde.” Bahadori was clearly doing her best to push the study through the political roadblock and was ignored. Now, she is being moved away from the IRIS program through the Office of Research and Development reorganization, which, sources have told InsideEPA (paywalled), is likely a result of her “efforts to advance IRIS.” Only in today’s EPA is the penalty for defending one’s own scientific program to be moved far away from leading that very group.

Dr. Tina Bahadori tried multiple times to get the former ORD head, Richard Yamada, interested in moving the IRIS formaldehyde assessment through the publication process.

A risk assessment caught up in layers of interference

We know the draft is done and was completed using rigorous scientific review methods, so why not just move it to peer review and public comment? The answer is simple: industry doesn’t like the finding that formaldehyde is a carcinogen. This assessment has been held up for over a decade thanks to pushback from the American Chemistry Council, which we have documented as part of our Disinformation Playbook. And now, thanks to corporate capture of the current administration, top political officials appear to be doing the same thing from the inside to benefit their former employers and cronies. Former ORD head Richard Yamada was previously employed by longtime IRIS and formaldehyde-study critic Lamar Smith. Current ORD head David Dunlap is a former staffer with Koch Industries, of which a major formaldehyde emitter, Georgia-Pacific, is a subsidiary. He has recused himself from matters pertaining to formaldehyde, but the agency’s track record on sticking to ethics agreements doesn’t give me the utmost confidence in his pledge.

Bill Wehrum, assistant administrator for the Office of Air and Radiation at EPA, had a long list of industry clients (subscription required) at his law firm before joining the agency, and has been ignoring offers from his own scientists to brief him on the chemical. And let’s not forget Nancy Beck, a former American Chemistry Council staffer now responsible for implementation of the Toxic Substances Control Act (TSCA), who has spent her tenure at the EPA checking items off industry’s wishlist. Formaldehyde will now be taken on by her office, which will mean a longer timeframe and a less comprehensive risk evaluation.

EPA’s scientific integrity office must investigate

As we write in our letter, “The completion and release of the IRIS assessment on formaldehyde would help inform science-based EPA regulations to better protect public health from this chemical. Conversely, permitting the suppression of this study to persist unchecked normalizes political interference at the agency and sends a message to career staff that their knowledge and expertise is not valued.” The EPA’s Scientific Integrity Policy “prohibits all EPA employees, including scientists, managers, and other Agency leadership, from suppressing, altering, or otherwise impeding the timely release of scientific information.” The public has the right to know whether this has occurred in the suspension of the formaldehyde risk assessment at the EPA. Every day that goes by without the scientific information informing new technology and standards that could reduce formaldehyde exposure and related health risks is an egregious affront to the agency’s mission to protect public health.

Check out the rest of the documents we received from the EPA related to formaldehyde here.


Science and Transparency: Harms to the Public Interest from Harassing Public Records Requests

Photo: Bishnu Sarangi/Pixabay.

In my work as a professor and researcher in the Microbiology and Environmental Toxicology Department at the University of California, Santa Cruz, I investigate the basic mechanisms underlying how exposure to toxic metals contributes to cellular effects and disease. My lab explores how exposures to environmental toxins, such as lead, manganese, and arsenic, can cause or contribute to the development of diseases in humans. For example, some neurobehavioral and neurodegenerative disorders, such as learning deficits and Parkinsonism, have been linked to elevated lead and manganese exposures in children and manganese exposures in adults, respectively.

California condor in flight. Lead poisoning was a significant factor precluding the recovery of wild condors in California.

In my career spanning 25 years, I helped develop and apply a scientific method to identify environmental sources of the toxic metal lead in exposure and lead poisoning cases in children and wildlife. I helped develop laboratory methods for evaluating tissue samples, including a “fingerprinting” technique based on the stable lead isotope ratios found in different sources of lead that enables the matching of lead in blood samples to the source of the lead exposure.
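
To make the isotope-ratio “fingerprinting” idea a bit more concrete, here is a minimal sketch in Python of just the matching step: compare the lead isotope ratios measured in a sample against the characteristic ratios of candidate sources and flag the closest match. The source names and ratio values below are hypothetical placeholders, not data from our studies, and the real analysis involves far more careful measurement, uncertainty estimation, and statistics.

import math

# Hypothetical candidate sources, each characterized by a pair of
# (206Pb/207Pb, 208Pb/207Pb) isotope ratios. Values are illustrative only.
candidate_sources = {
    "lead-based paint": (1.200, 2.460),
    "lead ammunition": (1.160, 2.410),
    "background soil": (1.230, 2.490),
}

# Hypothetical ratios measured in a blood or tissue sample.
sample = (1.165, 2.415)

def ratio_distance(a, b):
    # Euclidean distance between two isotope-ratio pairs.
    return math.hypot(a[0] - b[0], a[1] - b[1])

# The source whose isotopic "fingerprint" lies closest to the sample is the
# most plausible match in this simplified picture.
best_match = min(candidate_sources, key=lambda name: ratio_distance(candidate_sources[name], sample))
print("Closest isotopic match:", best_match)

Run as written, this toy example prints “lead ammunition” as the closest match for the hypothetical sample.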

In the early 2000s, I collaborated with graduate students, other research scientists, and several other organizations to investigate the sources of lead poisoning that was killing endangered California condors. Our research showed that a primary source of lead that was poisoning condors came from ingesting lead fragments in animals that had been shot with lead ammunition, and that this lead poisoning was a significant factor precluding the recovery of wild condors in California.

Our work provided important scientific evidence of the harm that lead ammunition causes to non-target wildlife, and it supported the passage of AB 821 in 2007 and AB 711 in 2013, which led to partial and full bans on the use of lead ammunition for hunting in California.

Gun lobby attempts to discredit research

Because of our research, I and other collaborators received five public records requests under the California Public Records Act (CPRA) between December 2010 and June 2013 from the law firm representing the California Rifle and Pistol Association Foundation seeking, in summary: all writings, electronic and written correspondence, and analytical data, including raw data, related to my research on lead in the environment and animals spanning a six-year period. The very broad records requests asked for any and all correspondence and materials that contained the word “lead,” “blood,” “isotope,” “Condor,” “ammunition,” or “bullet.” The request essentially sought everything I had done on lead research for this time period.

One apparent goal of the requestors was to discredit our findings and our reputations, as made evident on a pro-hunting website that attacked our peer-reviewed and published findings. We initially responded that we would not release data and correspondence relating to unpublished research, because of our concern that we would lose control of the data and risk having it and our preliminary findings published by others. As a result, the California Rifle and Pistol Association Foundation sued us in California Superior Court. Ultimately, the court ruled in favor of the university and researchers by narrowing the scope of the CPRA requests and limiting them to published studies and the underlying data cited.

Impacts and harms from overly broad public records requests

These very broad public records requests have had a significant impact on my ability to fulfill my research and teaching duties as a faculty member at the University of California, Santa Cruz. I personally have spent nearly 200 hours searching documents and electronic files for responsive materials; meeting with university counsel and staff; preparing for and sitting for depositions and court hearings; and giving testimony. Our efforts to provide responsive materials are ongoing.

While these requests have had a personal and professional impact on me as an individual, they have caused broader harms to the university’s mission of teaching and production of innovative research that benefits students, California residents, and the public at large. Impacts include:

  • Interfering with my ability to pursue research funding, conduct research, analyze data, and publish my research, because the time required to search for and provide responsive materials takes away from time invested in those duties.
  • Squelching scientific inquiry and chilling research communications and collaborations with colleagues or potential colleagues at other research institutions.

By chilling research and discouraging graduate students and collaborators from pursuing investigations into topics that could put them at odds with powerful interests, these types of expansive records requests deprive the public of the benefits that such research can bring, such as helping wildlife and endangered species survive and thrive by removing sources of environmental lead contamination.

Why I support modernizing the California Public Records Act

I chose to testify in front of the California Assembly Committee on the Judiciary in support of AB 700 and the effort to modernize the California Public Records Act to protect the freedom to research and to help streamline the ability of California public universities to process and manage public records requests. This bill establishes very narrow exceptions for researchers to protect unpublished data and some peer correspondence, which would help prevent task diversion and reputational damage and encourage inquiry and knowledge production at public universities across the state. AB 700 would also reduce the serious burden that expansive and overly broad records requests place on researchers and on the courts, and help clear the long backlog of records requests. I think this bill strikes the right balance between public transparency and privacy for research. Ultimately, the public will be better served if the state provides more clarity about what information should be disclosable under the California Public Records Act.

 

Donald Smith is Professor of Microbiology and Environmental Toxicology at the University of California, Santa Cruz. He received his PhD in 1991 and joined the faculty at UC Santa Cruz in 1996. He has over 20 years of experience and has published over 100 peer-reviewed articles in environmental health research, with an emphasis on exposures and neurotoxicology of environmental agents, including the introduction, transport, and fate of metals and natural toxins in the environment, exposure pathways to susceptible populations, and the neuromolecular mechanisms underlying neurotoxicity.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Photo: Gavin Emmons Photo: Donald Smith

What to Expect When You’re Expecting the 2020-2025 Dietary Guidelines

Photo: Peter Merholz/Flickr

Pregnancy Advice: Caffeine’s ok. Some caffeine is ok. No caffeine.

Breastfeeding Advice: Start solids at 4 months. Start solids at 6 months. Exclusively breastfeed for one year.

First Foods Advice: Homemade baby food. Store-bought baby food. Spoon feeding. Baby-led weaning.

My experience of being pregnant and having a baby in modern times has meant getting conflicting advice from the different sources I consulted, specifically surrounding nutrition. Depending on the Google search or midwife I spoke to, I heard different daily amounts of caffeine deemed suitable while pregnant. Depending on the lactation consultant who popped into my hospital room, I heard different levels of concern about how much I was feeding my newborn. And now that I’m about to start solid foods with my six-month-old, I have heard conflicting information about when, how, and what to start feeding my child. How is it so difficult to find what the body of evidence says about these simple questions that parents have asked since the dawn of time? When I discovered that past editions of the Dietary Guidelines didn’t address the critical population of pregnant women and infants from birth to two years, I wondered how it was possible that there was this huge gap in knowledge and guidance for such an important developmental stage. That’s why I’m very excited that the Dietary Guidelines Advisory Committee (DGAC) will be examining scientific questions specific to this population that will inform the 2020-2025 Dietary Guidelines, and that it has recently begun that process.

In the meantime, I will be starting my daughter on solids this week and have been trying to find science-supported best practices. It has been shockingly hard to navigate, and I was reminded of the interesting world of the baby food industry that I became acquainted with while researching and writing about added sugar guidelines for the 2016 UCS report, Hooked for Life.

The history of baby food and nutrition guidelines

Amy Bentley’s Inventing Baby Food explains that the baby and children’s food market as we know it today is a fairly new construction, stemming from the gradual industrialization of the food system over the last century. Early in the history of baby food marketing, a strong emphasis was placed on convincing parents and the medical community of the healthfulness of baby food through far-reaching ad campaigns and industry-funded research. The Gerber family began making canned, pureed fruits and vegetables for babies in 1926 and in 1940 began to focus entirely on baby foods. At the time, introducing solid foods to babies before one year was considered a new practice. To convince moms of the wholesomeness of its products, Gerber commissioned research touting the health benefits of canned baby foods in the Journal of the American Dietetic Association and launched advertising campaigns in the Journal and in women’s magazines. Gerber’s popularity and aggressive marketing quickly correlated with a decrease in the age at which solid foods were introduced as a supplement to breast milk. Earlier introduction of foods meant an expansion of the baby food market, which meant big sales for Gerber.

All the while, there were no federal dietary guidelines for infants. Gerber took advantage of this gap in 1990 when it released its own booklet, Dietary Guidelines for Infants, which glossed over the impacts of sugar consumption by telling readers, for example, that “Sugar is OK, but in moderation…A Food & Drug Administration study found that sugar has not been shown to cause hyperactivity, diabetes, obesity or heart disease. But tooth disease can be a problem.” The FDA study that Gerber referred to was heavily influenced by industry sponsorship, and the chair of the study later went on to work at the Corn Refiners Association, a trade group representing the interests of high-fructose corn syrup manufacturers. In fact, evidence has since linked excessive added sugar consumption with the incidence of chronic diseases including diabetes, cardiovascular disease, and obesity.

Today, the American Academy of Pediatrics (AAP), the World Health Organization, and the American Academy of Family Physicians all recommend exclusive breastfeeding until six months, using infant formula to supplement if necessary. The AAP suggests that complementary foods be introduced around 4 to 6 months, with continued breastfeeding until one year. But what foods, how much, and when is a little harder to parse out. Children’s food preferences are predicted by early intake patterns but can change with learning and exposure, and flavors from the maternal diet influence a baby’s senses and early life experiences. Research shows that early exposure to a range of foods and textures is associated with their acceptance later on. And of course, not all babies and families are alike, and that’s okay! There are differences related to cultural norms in the timing of introduction of food and the types of food eaten. Infants are very adaptable and can handle different ways of feeding.

There’s a lot of science out there to wade through, but it is not available in an easy-to-understand format from an independent and reliable government source. That’s what the 2020 Dietary Guidelines have to offer.

2020-2025 Dietary Guidelines: What to expect

The Dietary Guidelines for Americans is the gold standard for nutrition advice in the United States and is statutorily required to be released every five years by the Department of Health and Human Services (HHS) and the U.S. Department of Agriculture (USDA). These guidelines provide us with recommendations for achieving a healthy eating pattern based on the “preponderance of the scientific and medical knowledge which is current at the time the report is prepared.” Historically, the recommendations have been meant for adults and children two years and older and have not focused on infants through age one or on pregnant women as a subset of the population.

The freshly chartered DGAC is charged with examining scientific questions relating not only to the diets of the general child and adult population, but also to nutrition for pregnant women and infants, which will be hugely beneficial to all the moms, dads, and caregivers out there looking for answers.

Credit: USDA

While I was pregnant, my daughter was in a lower percentile for weight, and I was told by one doctor to increase my protein intake and by another that it wouldn’t matter. I would have loved to know with some degree of certainty whether there was any relationship between what I was or wasn’t eating and her growth. One of the questions to be considered by the DGAC is the relationship between dietary patterns during pregnancy and gestational weight gain. I also wonder about the relationship between my diet while breastfeeding and whether there’s anything I should absolutely be eating to give my daughter all the nutrients she needs to meet her developmental milestones. The DGAC will be looking at that question (for both breast milk and formula), as well as whether and how diet during pregnancy and while nursing affects the child’s risk of food allergies. The committee will also evaluate the evidence on complementary feeding and whether the timing, types, or amounts of food have an impact on the child’s growth, development, or allergy risk.

At the first DGAC meeting on March 29-30, the USDA, HHS, and DGAC acknowledged that there are still limits in evaluating the science on these populations due to a smaller body of research. Unbelievably, there’s still so much we don’t know about breast milk and lactation, and in addition to government and academic scholarship, there are really interesting mom-led research projects emerging to fill that gap.

The Dietary Guidelines are not just useful for personal meal planning and diet decisions; they also feed directly into the types of food made available as part of the USDA programs that feed pregnant women and infants, like the Supplemental Nutrition Assistance Program (SNAP); the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC); and the Child and Adult Care Food Program (CACFP). Having guidelines for infants on sugar intake in line with the American Heart Association’s recommendation of no added sugar for children under two years old would mean some changes to the types of foods offered as part of these programs.

Nutrition guidelines will be a tool in the parent toolbelt

But if there’s one thing I’ve learned as I’ve researched and written about this issue and now lived it, it’s that while the scientific evidence is critical, there are a whole lot of other factors that inform decisions about how we care for our children. Guidelines are, after all, just that. As long as babies are fed and loved, they’ll be okay. What the guidelines are here to help us figure out is how we might make decisions about their nutrition that will set them up to be as healthy as possible. And what parent wouldn’t want the tools to do that?

As I wait anxiously for the report of the DGAC to come out next year, I will do what all parents and caregivers have done before me, which is to do the best I can. I have amazing resources at my disposal in my pediatrician, all the moms and parents I know, and local breastfeeding organizations. Whether my daughter’s first food ends up being rice cereal, pureed banana, or chunks of avocado, it is guaranteed to be messy, emotional, and the most fun ever, just like everything else that comes with parenthood.

Photo: Peter Merholz/Flickr

Uncharted Territory: The EPA’s Science Advisors Just Called Out Administrator Wheeler

EPA Administrator Andrew Wheeler Photo: USDA/Flickr

Yesterday the EPA Clean Air Scientific Advisory Committee (CASAC) published a letter to Administrator Andrew Wheeler making recommendations on the agency’s approach to updating the ambient air pollution standard for particulate matter (PM). Chiefly, the science advisors have now acknowledged the group has inadequate expertise to conduct the review.

We are now in uncharted territory with the EPA in a tough position on both its PM and ozone pollution standard updates. Here are some key highlights from the letter and their implications.

CASAC asks that the particulate matter review panel be reinstated

“The CASAC recommends that the EPA reappoint the previous CASAC PM panel (or appoint a panel with similar expertise) as well as adding expertise…,” the committee members write in their consensus comments.

This is significant. Last October, then Acting Administrator Wheeler left CASAC high and dry by disbanding the particulate matter review panel—a group of experts that broadened the range of expertise CASAC and the EPA had access to in their review. The ~20-person pollutant review panels have for decades augmented CASAC’s expertise, helping to review the EPA’s science assessment on particulate matter and its health and welfare effects. Since that time, I (and many, many, many others) have repeatedly called on the EPA to reinstate the panel.

In November, a letter from former ozone review panel members asked EPA leaders to reinstate the panel. In December, a letter from former PM panel members asked the same, and this group sent a second letter in March. A separate letter from former CASAC chairs asked for the panel as well. And an additional letter  from 206 air pollution and public health experts asked that the panel be brought back. This is on top of many other public comments echoing similar concerns from scientists, scientific societies, and other experts in the air quality and health arena. Since November, CASAC members themselves have been saying they need more expertise, but the CASAC Chair had ignored these pleas, until now.

The fact that the committee now agrees it needs the panel is important. It sends a clear signal to the EPA administrator that the process for the review of the science informing the PM standard is inadequate. And the committee lays this out in no uncertain terms, declaring that, “Additional expertise is needed for [CASAC] to provide a thorough review of the [PM] National Ambient Air Quality Standards (NAAQS) documents. The breadth and diversity of evidence to be considered exceeds the expertise of the statutory CASAC members, or indeed of any seven individuals.” This is of course what we’ve been saying all along.

This acknowledgement of needed expertise puts agency leadership in a tough spot, given that just last week EPA Administrator Andrew Wheeler claimed that CASAC had a “good balance of expertise” despite the disbanding of the panel. With an administrator who directly contradicts his agency’s science advisors, what’s the EPA to do? One thing is clear: this is an atypical process, and it is sure to face legal challenges.

Chair Cox’s views on causality have been cast aside

CASAC Chair Dr. Tony Cox released a draft of this letter on March 7, which included eyebrow-raising language asking the EPA to throw away the time-tested and scientifically backed weight-of-the-evidence approach it has long used to assess the links between air pollutants and health effects. (More on Cox’s proposal and why it’s problematic in my Science magazine piece here.)

In the final letter, this language on manipulative causality has thankfully been relegated to Cox’s individual comments and a few places noting that “some CASAC members think…” This is considerably toned down from the draft letter, where it appeared as a consensus recommendation for upending EPA’s weight-of-the-evidence process for developing a science-based PM standard.

The committee still cannot agree on scientific facts

The final letter has maintained language noting that CASAC could not come to agreement on the relationship between fine particle pollution (PM2.5) and early death, writing, “CASAC did not reach consensus on the causality determination of mortality from PM2.5 exposure.” This is striking given that the link between fine particulate matter exposure and early death is well-documented. It has been repeatedly demonstrated in different scientific studies, in different locations, and at different concentrations. Past CASACs and PM panels as well as some members of the divided current committee have acknowledged this relationship, and yet some members of the current committee are breaking with past science advisors and the greater scientific community.

CASAC criticizes the EPA’s science assessment

As in the draft letter, CASAC continues to be highly critical of the EPA’s science assessment, insisting on the document’s “Lack of comprehensive, systematic review.” (But who are they to judge if they’ve already admitted they aren’t the appropriate advisors?) To be clear, it is expected and desired that the committee would have suggestions and criticisms of the science assessment and would want to see a revised draft. (This, after all, is the hallmark of peer review.) However, the tone and extent of the criticism in this letter takes it up a notch.

By contrast, a group of 17 scientists from the disbanded panel, while detailing a number of revisions needed for improving the science assessment, stated, “We commend EPA staff for development of an excellent first draft of the ISA that provides comprehensive and systematic assessment of the available science relevant to understanding the health impacts of exposure to particulate matter.”

Given the committee’s own admission that the group is inadequate to conduct the review, this does raise questions about whether the group is qualified to offer some of the detailed technical criticisms it does, such as on the adequacy of non-threshold models to estimate health associations at low concentrations and the need for study exclusion criteria.

What’s next on PM and ozone reviews?

The EPA will now decide what to do with this science advice. Will it revise the science assessment and send it back to CASAC or simply declare it does not need a second review? Will the PM panel be reconvened to review a second draft science assessment? What about the timeline for the PM standard update? We know the administration is working on an expedited schedule. Administrator Wheeler has made this clear.

And what about the ozone process? If CASAC has concluded it has inadequate expertise for the PM review, it is difficult to imagine they will feel qualified to conduct the upcoming ozone review, given it relies on a similar breadth of scientific disciplines. (The EPA is set to release the ozone science assessment this spring). EPA leadership failed to convene an ozone review panel last October so CASAC is again poised to review a massive scientific assessment with one hand tied behind its back. The agency could decide to plow forward in the PM standard update process, ignoring CASAC’s advice.

Regardless of what the agency does next, it is clear the process is broken, and its science advisors know it too.

Photo: USDA/Flickr

Three Things EPA Administrator Andrew Wheeler Doesn’t Understand About Ambient Air Pollution Standards

Photo: Eltiempo10/Wikimedia Commons

Last week, EPA Administrator Andrew Wheeler talked to Congress. Members had questions about his recent changes to the National Ambient Air Quality Standards updates for particulate matter and ozone. Wheeler’s comments last week and earlier make clear that he either doesn’t understand or isn’t being honest about how the EPA is proceeding as it sets health-protective air pollution standards. Here’s the reality around three points that Administrator Wheeler isn’t clear on.

1. CASAC doesn’t have the expertise it needs

The Clean Air Scientific Advisory Committee (CASAC) concluded at its most recent meeting that it does NOT have the expertise needed to adequately provide science advice to the EPA on development of the particulate matter standard. The committee’s conclusion directly conflicts with Administrator Wheeler’s comments on the Hill this week after the CASAC meeting. Rather than listen to CASAC’s conclusion that it does not have the expertise, Wheeler doubled down on his earlier comments to Congress by insisting the committee has “a very good balance of talents.” It seems someone should give Wheeler the notes from his own agency’s committee meeting. Instead of denying this need for additional expertise, Wheeler could and should reconvene the particulate matter review panel that he disbanded last October.

The administrator also appeared confused about what expertise does exist on CASAC. When asked about epidemiologic expertise on the committee, he said, “I believe one person had to resign who I believe was an epidemiologist who we — we weren’t able or we — we haven’t yet replaced that person, if I’m remembering the right board. It was either the Science Advisory Board or the CASAC.” Since the administration appointed new CASAC members last October, there has not been an epidemiologist on the committee—a huge gap given how central epidemiologic evidence is to assessing the health outcomes of ambient air pollutant exposure. Given this shortcoming, on top of the lack of pollutant review panels, it is no wonder that CASAC itself recognized its need for more expertise at its teleconference two weeks ago.

2. Pollutant review panels don’t slow down the process

In his comments to Congress, Wheeler said that the particulate matter review panel was disbanded because pollutant review panels were slowing down the process of reviewing ambient air pollution standards. “We took a hard look at what was causing the delay because the agency had never met the five-year timeframe for ozone or PM,” he told the Senate Appropriations Committee. This is objectively false and runs counter to Wheeler’s previous statements, in which he insisted that the panels were unnecessary. The process of ensuring a robust scientific review of air pollution standards is of course not the fastest process in the world. Just as the peer-review process tends to be slow, so too is review of thousands of pages characterizing the state of the science on a pollutant and its health and welfare effects by a group of the top experts in the field. But the pollutant review panels simply augment the expertise of CASAC. The panel’s review of the documents happens in the very same meetings that CASAC already has, and must have, according to Federal Advisory Committee Act rules. Sure, the additional experts in the room might mean longer discussions, or an extra conference call, but this is far from a huge slowdown of the process. Instead, a bigger reason that ambient air quality standard updates aren’t speedy is the limited capacity on the EPA side. If the agency were given more resources to conduct and prioritize reviews, that could speed up the process—if speed were, in fact, the goal of this administration.

Wheeler claims to be concerned about whether or not the review happens within the Clean Air Act mandated five-year window. It is true that reviews are often not completed within five years, but the courts have generally recognized the need for thorough scientific reviews in standard updates. Instead, the administration is insisting that both the particulate matter and ozone reviews happen by the end of 2020.

3. Science advisors should be chosen based on diversity of expertise, not geography

In his testimony, Wheeler asserted that “CASAC members and the members of the Science Advisory Board were selected in large part for geographic diversity, geographic diversity of — of — of viewpoints and backgrounds.” This one should be intuitive. If you took a chemistry class in Cleveland, you probably learned the same thing as a chemistry student in Miami. There is, of course, no reason geography should matter when it comes to understanding of science. Universities, academic journals, and scientific conferences don’t curate activities through a geography lens, and neither should the EPA. Instead what the EPA should do, and always has done, is select members of scientific advisory committees for diversity of expertise. To get the best science advice, the agency should make sure the committee includes experts in diverse areas. For CASAC, that means including experts in atmospheric science, medicine, toxicology, epidemiology, etc. Yet, the current CASAC excludes key areas of expertise like epidemiology.

Wheeler blames the selection of CASAC members on EPA staff, saying that he, in fact, did not pick the members; EPA staff did. He told Congress last week, “I didn’t hand-select any of the people on the CASAC. They were recommended to me … by the career staff and … and the Science Advisory Board Office.” This is curious given that it is the EPA administrator who decides committee membership. EPA staff always make recommendations to the administrator about who would be good candidates for a committee, given the balance of expertise, but there has never been a committee like this, with so little membership from active researchers in the field and instead heavily weighted toward regulators. It is hard to imagine that EPA staff would select such a committee without input from political-level staff.

Sacrificing both quality and speed

Wheeler’s need for speed has not yielded results. Thus far, the PM review has not been faster than recent reviews that included the review panel. Currently, CASAC is finalizing its letter to the EPA recommending how the agency should revise its science assessment on particulate matter. This letter will confirm that the committee agrees it doesn’t have the needed expertise and make specific recommendations to EPA staff on the document. The EPA can then move forward without the expert advice its science advisors say it needs, or it can delay the process and reconvene the robust particulate matter review panel necessary for a science-informed process.

As of now, Administrator Wheeler is getting neither speed nor quality out of the particulate matter review. It is looking more and more like he won’t get what he wants out of the particulate matter standard update. But if the EPA fails to set a PM standard based on science, the public won’t either.

Photo: Eltiempo10/Wikimedia Commons

Fires in Texas Spark Interest in Chemical Safety

LadyDragonflyCC/Flickr

Watching the news last week as clouds of thick black smoke billowed over Houston, I worried about my family. They are surrounded by chemical plants. Hearing state and local officials say there was no air quality issue, and then order everyone to “shelter in place,” terrified me. In truth, the monitors either weren’t working or were under maintenance, and there didn’t seem to be an evacuation plan. Why not? The law requires one.

In the past month, there have been at least two major chemical fires or explosions at EPA Risk Management Plan (RMP) facilities. The Union of Concerned Scientists has written extensively on the RMP rule and its provisions and participated in the victorious court case that required the EPA to implement the Obama-era rule.

RMP standards aim to provide additional information to communities surrounding facilities, to require facilities to coordinate with first responders on evacuation plans in case of an emergency, and to require facilities to research safer technology alternatives that may make them less prone to catastrophic incidents.

Despite these important provisions, the Trump administration has moved forward with rolling back this rule, and in doing so it has proposed to “remove all preventative measures.” What is a risk management plan if it doesn’t lower risks? The rule is being finalized now, and given the concerns we had with the proposed rule, we expect the final version to weaken standards. The two major chemical fires and explosions this month should demonstrate to the EPA that implementing the RMP protections is the least it could do for environmental justice communities, first responders, and workers to protect their health and safety.

The KMCO plant explosion in Crosby, Texas, occurred on April 2nd at the chemical manufacturing facility when isobutylene ignited. This facility was no stranger to incidents; a previous explosion in 2010 caused worker injuries and a worker death, and the facility has been cited for lacking an appropriate emergency action plan, for benzene leaks, and for lack of monitoring. This time around, the explosion killed one person and injured two others. Communities surrounding the facility are still seeking information from TCEQ and the facility directly on potential health hazards and air quality monitoring in the wake of the explosion.

The Intercontinental Terminals Company (ITC) fire in Deer Park, Texas, on March 17th burned for days, and once the fires in the various containers were put out, a new shelter-in-place order was issued for two additional days due to excessive benzene levels. Congressional members representing sections of the Houston area came together to call on TCEQ to provide more information on air monitoring and information sharing after the fire.

Unfortunately, explosions like these add insult to injury for many communities. People living near petrochemical facilities like these already face disproportionate exposure to toxic emissions from the facilities on a regular basis, in addition to other nearby sources of air pollution, like increased truck traffic around the facility and other industrial and transportation-related pollution and stressors. Preventable chemical disasters only add to the burden faced by these communities on a daily basis.

Yvette Arellano from the Texas Environmental Justice Advocacy Services (T.E.J.A.S.), an environmental justice group working on the ground in Manchester and Greater Houston Area, stated “While regulatory agencies protect these facilities from acts of terrorism, who protects us from these facilities which terrorize us on a daily basis? The simple daily acts of life from brushing our teeth in the morning to going to sleep are made traumatic by these events, and the ITC disaster is yet to be over. We never asked to live a life in which we are scared of being at home, forced to live with plastic on the windows and doors, with no ventilation in a city where temperatures regularly skyrocket to over 100. We are suffering out of sight, made silent, and forced into the shadows-living under dark clouds, not of our making. This is not just, it is not freedom or liberty this is an act of terror on our lives.”

I wish I could be clinical and detached from this issue, but I can’t. My family lives in one of the largest concentrations of these RMP facilities outside of Houston, Texas. Every time I hear about another incident I think of my nieces and nephew and whether they were outside at school when this happened. I worried about them while they were locked in their homes in a shelter in place during the ITC fire and subsequent benzene leak. Like all young children, they deserve to be free to run outside without fear of a chemical cloud keeping them indoors.

These incidents at chemical facilities in Texas are unfortunately perfect examples of why the Risk Management Plan standards should not only be maintained by the Trump administration but also strengthened during this rulemaking process. Congress should hold the EPA accountable and call on it to issue a strengthened RMP rule that gives communities near these facilities better access to information about the chemicals on-site and better coordination with first responders to create safety plans that aren’t limited to sheltering in place.

Chemical facilities need oversight and high standards for safety in order to protect environmental justice communities, first responders, and workers. Companies should be held accountable, particularly those with multiple incidents and fines like the two facilities above. They need to be willing to share information more readily with communities so families like mine can make the best decisions in case of a fire or explosion, and they need to take strong measures to reduce the risk of these incidents happening in the future.

LadyDragonflyCC/Flickr

 Xenophobia Run Amok at the National Institutes of Health

Photo: GaryAlvis/iStock

If you’re a scientist, a researcher, a physician, a graduate student, or even a patient visiting our National Institutes of Health (NIH), be sure to bring proof of citizenship with you and be ready to answer questions about your nationality. Otherwise you may be turned away. Even if you’re an international member of our prestigious National Academy of Medicine, or maybe a patient enrolled in a clinical trial.

While a Homeland Security Presidential Directive has been in place since 2004 (in the wake of the 9/11 terrorist attacks), the Washington Post reports that something has changed at our nation’s medical research agency. In their words, “NIH—a research institution built on collaboration—is apparently following the same protocol used by federal security agencies that deal with highly sensitive or classified information and require top secret security clearance for their employees.”

This practice has unnerved researchers both inside and outside the agency. And with good reason.

Xenophobia run amok

Science at the NIH and elsewhere is a global effort. It’s a collaborative enterprise that thrives on diversity and has nothing to do with citizenship. But the NIH now finds itself caught (and caught up) in the anti-immigrant fervor of the Trump administration. As we’ve noted earlier, this is not good for science, scientists, and scientific progress.

What’s happening at NIH is not good for patients and their visitors to the campus. It is not good for ensuring a future pool of talented NIH researchers. And it is certainly not good for public health and for people who rely on the world-class science and research at NIH—be they providers or recipients of health care. It’s just another misguided effort that drags immigration policies into science. (Readers may also recall that earlier in the year, the Environmental Protection Agency (EPA) decided to end research grants to scientists who are not US citizens or permanent residents, and to stop supporting work in government labs done by foreign nationals.)

Good science does not depend on passports, nationality, or citizenship

Who knows where the next scientific breakthrough, innovation, or great idea will come from? Maybe from some foreign-born researcher toiling over a dataset or experimental results in one of our national labs. Maybe from the Iranian graduate student who was turned away from NIH when he arrived for a job interview—with an invited presentation.

What are the consequences if the US no longer welcomes scientists from other nations to our laboratories—or subjects them to undue scrutiny and interrogation? We all lose. Good science does not depend on passports, nationality, or citizenship.

Speak out

What’s happening at NIH is wrong—and should set off alarm bells in the scientific, medical, and public health communities. The health and well-being of our nation will suffer when we close our eyes and doors to working with colleagues from other nations.

Collaboration and openness are fundamental to scientific progress; the xenophobic policies of this administration run counter to our ambitions and our values. Let’s replace signs at our national institutes and laboratories that ask for proof of citizenship with signs that say “We welcome you, and your talent and ideas. We do science here, not politics.”

Let’s raise our collective voices to make this happen.

Photo: GaryAlvis/iStock

Trump Administration Proposes Sponsorship Opportunities for Federal Agency Staff Positions and “Assets”

Sports fans have long been familiar with their favorite stadiums being sponsored by companies. Under a new Trump administration proposal, positions within federal agencies—and, incredibly, “assets” such as federal lands—would be open to sponsorship by the highest bidder. We’d strongly encourage you to submit public comments on this ridiculous proposal here.

Under the plan, companies could sponsor and pay for the salaries of political appointees at a 500% premium. So with an adequate donation, the 3M Environmental Protection Agency Administrator, or the American Petroleum Institute Assistant Secretary for Fish, Wildlife, and Parks, may become a reality as early as September 2019.

The sweeping proposal would also allow for the renaming of specific government “assets,” including national parks. And naming rights would not just be for companies and trade groups, as proposals from individuals would also be considered. Imagine if Yellowstone became RogerStone, Joshua Tree became Josh Groban Tree, and Bryce Canyon became Bryce Harper Canyon! At this point, it seems like a foregone conclusion that Arches National Park would eventually become the Golden Arches National Park.

Under this foolish proposal, federal “assets” such as Joshua Tree National Park could be renamed after companies, trade groups, or individuals. Photo: National Park Service.

Some industry trade groups, of course, lauded the move with generic, meaningless press statements. “These are exactly the types of public-private partnerships that will grow the economy and enable companies to take care of business,” said Cory Upshin, press secretary for the U.S. Chamber of Commerce.

Agencies would also have the option to set up auction sites to name specific positions. Some expect that it might work similarly to an Amazon headquarters-style competition, where companies and trade groups submit applications to fund some of the more competitive positions. Such activity is not unheard of under the Trump administration: the USDA currently has numerous communities and even a private citizen in Pennsylvania positioning themselves to be the new home of two agencies that Secretary Sonny Perdue wants to move out of Washington.

Naming rights for positions at both private and public universities are not uncommon. Many buildings and faculty positions are named after wealthy donors, and sometimes those donors try to inappropriately influence both research and classroom instruction.

Each sponsorship would need to be approved by a new advisory committee called the Commission on Labeling, Ownership, and Weighted Name Sponsorship. Some science advisory panel members who were recently canned from giving the EPA advice on particulate matter pollution would reportedly consider applying. “While this isn’t quite my definition of public service, at least I might be able to prevent the most offensive names from moving forward,” said Dr. Brandy N. Ames, one of the affected scientists.

EPA Administrator Andrew Wheeler announced immediately that the agency would begin implementing the proposal before it is finalized, much as it has with other proposed rules over the past two and a half years.

In other news, polluting industries can kill as many birds as they want as long as they don’t do it on purpose, Trump advisors believe that more air pollution is good for you and particulate matter pollution doesn’t cause health problems, the Census Bureau wants to ask a question it hasn’t tested, the National Park Service wants you to shoot hibernating bear cubs in their dens, a former coal lobbyist is in charge at EPA, and the head guy at the Department of Interior used to lobby for the oil and gas industry and worked to dismantle the Endangered Species Act.

Again, you can submit public comments here. It all seems like some kind of joke, doesn’t it?

Five Takeaways from the EPA Meeting on Particulate Pollution

Photo: Steven Buss/Flickr

Yesterday, the EPA Clean Air Scientific Advisory Committee (CASAC) had a teleconference to discuss their recommendations to the administration on the agency’s assessment of the science on particulate matter (PM) and health. The meeting continued the ongoing push and pull between the EPA, its science advisors, and the committee chair Dr. Tony Cox.

The committee was meeting to finalize a letter they will jointly send to the EPA recommending changes to its draft science assessment on the state of particulates and health, a key document that informs the EPA’s (statutorily required to be science-based) decision on the level of particulate matter that protects public health. (More background and information on the state of play in my Scientific American piece here.) Here are five top takeaways from the meeting:

1. CASAC admits it doesn’t have the needed expertise

The committee finally agreed on what was obvious to everyone else: they need more expertise. Since EPA leaders dismissed the particulate matter review panel last October and selected a new set of CASAC members that doesn’t include key expertise like epidemiology, it has been abundantly clear that the current seven-member CASAC is insufficient to review the hefty and wide-ranging scope of the EPA’s science assessment. I, and many other scientists, made this point in public comments at the December meeting. And a letter signed by 206 air quality and public health experts asked the administration to reinstate the panel. Members of CASAC themselves echoed these concerns, but in December the chair pressed forward, ignoring them. This time, he conceded they needed more expertise, and the group agreed to put language in their final letter asking for a reconvening of the particulate matter review panel, or one with comparable expertise, plus a few additional areas of expertise they indicated were needed.

This new consensus is important. The committee has now disagreed with EPA Administrator Andrew Wheeler, who told Congress earlier this year that the committee had the expertise needed to conduct the review. In response to a question from Senator Carper about the dismissal of the PM panel, Administrator Wheeler said, “I believe the current CASAC has the experience and expertise needed to serve in this capacity as well as to complete the reviews for the particulate matter and ozone NAAQS.” The committee now admits it does not, raising questions about whether the EPA will be able to obtain the best available science advice necessary to set a science-based PM standard that protects public health, as the Clean Air Act requires.

2. CASAC members pushed back on the chair

While at the December meeting the chair was able to override most disagreements raised by others, in this meeting CASAC members were more willing to speak up and disagree with the chair on the letter’s contents. As a result, many of the most damaging elements of the draft letter that the chair released on March 7th were removed. The draft letter had uncharacteristically strong critiques of how the EPA conducted its science assessment, calling the lengthy and exhaustively referenced document “unverifiable opinion” and accusing the EPA of not following the scientific method. The committee has thankfully agreed to strike this language. It is less clear whether committee members were able to push back on all of the problematic language in the letter, but the final draft should be substantially less hostile to EPA’s science assessment than the version Cox drafted.

3. The scientific community stood up

The broader scientific community is not sitting this one out. Last week I released a paper in Science with Harvard data scientist and air pollution and health effects expert Francesca Dominici. The paper took on Cox directly for his fringe ideas about how the EPA should approach assessing links between air pollution and health outcomes like early death and respiratory disease. Many other top experts in the field gave in-person comments or submitted written comments criticizing the process and scientific approach being taken by CASAC chair. There were also organizational comments from the Health Effects Institute and the International Society for Environmental Epidemiology, as well as a letter signed by 17 members of the dismissed PM review panel. These critical comments build on public comments submitted and delivered at CASAC’s December meeting, including comments from former CASAC members and former PM review panel members, and former ozone review panel members. In short, the top experts in air pollution and health are in strong unified opposition to the approach being taken by Dr. Cox and this meeting made that abundantly clear.

4. The chair has not moderated his fringe views

Dr. Cox was criticized in my Science piece and elsewhere for views on air pollution and health that fall far outside the mainstream scientific community. While Dr. Cox expressed surprise at this characterization, mentioning my Science piece explicitly, he did not moderate his views throughout the meeting, noting that he is “appalled” by the lack of evidence for the connection between particulate matter and early death, a relationship that scientists have studied and confirmed in many studies, over many years, in many locations around the world, using different study designs.

5. The process is broken

The (well-designed, in my opinion) process for developing air pollution standards is now broken for this PM standard update. This started long before the current CASAC was appointed. Last spring in his “Back to Basics” memo, former EPA Administrator Pruitt made clear he intended to expedite the process for updating the particulate matter and ozone standards and create conditions that made it harder for robust science advice to inform National Ambient Air Quality Standards. Pruitt and now Administrator Wheeler made good on that promise by tearing down the scientific supports that ensure a robust scientific process with ample opportunities for public input.

EPA is now in a tough spot. It would be difficult in any event to complete a PM review by 2020 as the administration intends. Doing so was made more difficult by the administration nixing the PM review panel. It is now made even more difficult by CASAC’s intention to ask for the panel to be reinstated, a move that would surely mean more public meetings, document drafts, and a general delay in the process.

Alternatively, the administration could move forward, ignoring CASAC’s request for more expertise, but in doing so it would almost certainly be setting itself up for legal challenges. If CASAC itself acknowledges it doesn’t have the scientific expertise to conduct a science-based review, how can the administration claim to have set a science-based standard? Based on yesterday’s discussion, there are likely to be some elements of the final letter that still conflict with the broader scientific community’s opinion on EPA’s approach, despite pushback from several committee members and nearly all public comments. We will see what the final letter to the administration from CASAC looks like, but one thing is certain: this process is broken.

 

My written comments from the meeting are here and below are my oral comments and clarifying comments made at the meeting yesterday.

 

Oral Comments Delivered at the March 28, 2019 CASAC Teleconference:

Thank you for the opportunity to comment. I am the research director at the Center for Science and Democracy at the Union of Concerned Scientists. On behalf of more than half a million citizens and scientists, we advocate for the use of science for a healthy planet and a safer world. The Center for Science and Democracy works to advance the roles of science and public participation in policy decision-making. We have never advocated for an ambient air quality standard different from the CASAC recommendation, only to ensure the proper process is followed and scientific advice is heeded.

The Clean Air Act requires that the EPA set particulate matter (PM) standards at levels that protect public health and welfare with an adequate margin of safety. CASAC is charged with considering all available evidence and providing science advice on the standards. At this stage in the PM standard update, there are significant challenges to both the science and process that CASAC is following. 

Scientific Issues

The ISA (Integrated Science Assessment) deserves to be scrutinized and improved by experts on all facets of the assessment. And CASAC’s review of the ISA should be helping EPA to identify new research questions and to refine its characterization of the state of the science. However, this has not been the case.

It is crucial that CASAC rely on the wealth of knowledge in the published literature, as reflected in the ISA draft. CASAC should rely on the established approach for assessing the causal links between particulate pollution and health impacts, as detailed in the preamble to the ISAs.  The causal framework employed by the EPA has evolved over the past decade, has been endorsed by 11 prior CASACs and 138 experts, and has been deemed adequate in the courts.

Yet, the March 7 draft letter by the CASAC chair proposes upending this scientifically backed and time-tested approach. The chair’s proposal would create an unattainable burden of proof on the scientific community to demonstrate causal links between PM reductions and changes in health outcomes, as it is not feasible or ethical to design and carry out population-level manipulative causation studies.

Importantly, following the chair’s proposal is incompatible with CASAC’s charge to recommend PM standards that protect public health with an adequate margin of safety, including for sensitive subpopulations. Protecting groups such as the elderly, children, and those with lung diseases with an adequate margin of safety requires the EPA to consider all evidence and use expert judgement. Relying on a framework that discounts epidemiologic evidence and requires manipulative causation for all causal determinations made by the agency is unlikely to meet this Clean Air Act mandate.

Process Issues

A flawed process produces a flawed result. Thus far, CASAC has not followed a process that is likely to lead to a science-based recommendation to the EPA Administrator. Significant gaps in expertise remain, given EPA leadership’s choice of CASAC members and the dismissal of the PM review panel.

Despite persistent calls for additional expertise by CASAC members, echoed by public comments, the CASAC chair has continued to press forward without addressing these concerns. As a result, the PM NAAQS review is proceeding without the science advice needed to ensure a health-protective standard.

This lack of expertise has been abundantly clear in CASAC meeting discussion and written comments from CASAC members. Rather than discussing key areas of uncertainty and the implications of new important research on particulate matter and health, as would be most helpful for the EPA to hear in deliberations from its top science advisors, CASAC instead has spent its valuable time within an expedited review process questioning and renegotiating well-established concepts, such as the value of the field of epidemiology, the importance of studying effects on at risk populations, and the connection between particulate exposure and premature death.

The proposed changes to EPA’s causal framework, the expedited time frame, and the planned merging of documents, combined with gaps in expertise and limited opportunities for public input, are together likely to undermine the ability of the EPA to set a science-based standard for particulate matter that is protective of public health.

Following the chair’s proposal and agreeing to these other changes prevents the EPA from relying on the best available science. I urge the members of CASAC and the EPA to listen to the recommendations of top experts in the scientific community and reject this proposal.

 

Clarifying comments made in response to CASAC Discussion:

On the discussion of two views of science, it seems there is conflation of individual studies and a review of existing studies. CASAC’s charge with respect to the ISA is to look at the body of evidence and the strength of the evidence. It is not CASAC’s job to decide how the broader scientific community should approach individual study designs. There is a time and place for that. It is not in the midst of an established EPA regulatory process. I’d like the committee to ask EPA staff for their perspective on this point and how EPA does approach a systematic review of the literature.

With respect to the discussion of the alternative framework proposed by the chair, I’d like to clarify that in the Science piece, we referenced and reacted to the Chair’s own words as presented in the draft letter under discussion as well as in the December meeting. The proposal would indeed reject most of the key studies EPA relies on for a causal assessment on long-term exposure to PM and mortality, a link that the chair has stated he is questioning. This is what Francesca Dominici and I, along with several other commenters are expressing concern about.

I would very much welcome the featuring of accountability studies and discussing them to the extent that they are informative to the ISA. I agree it is useful to have testable rules. But we cannot over-rely on them in a context where we are using observational data to study an environmental risk. Instead of allowing these ideas to be introduced, debated, peer reviewed, and advanced in the scientific literature, the proposal suggests that this process be largely skipped and the ideas force-fit into the EPA process when, as we’ve heard several times today, they are not ready for prime time.

Lastly, to address the true versus estimated exposure conversation: true exposure is not knowable. Much work in the scientific literature, including my own, has characterized this issue in depth. The premise of what we are looking at is what is knowable. The chair is calling this estimated, but it is in fact the relevant value, because the measure that is germane to improving an ambient standard is the “regulatory ambient,” i.e., the relationship between ambient concentrations and health outcomes. There is no reason to complicate this with an explanation of the obvious fact that studies are using estimates.

Given the substantive changes to the letter being discussed, it seems another public call is warranted to ensure CASAC members have time to review and approve of the revised letter, also keeping in mind FACA rules on transparency of committee deliberations.

Photo: Steven Buss/Flickr

Equality, More or Less: How the Supreme Court Might Fix Gerrymandering

This week the Supreme Court prepared to make voting rights history ahead of the 2020 Census redistricting cycle. Justices heard oral arguments in two partisan gerrymandering cases: a Republican gerrymander in North Carolina (Rucho v Common Cause) and a Democratic gerrymander in Maryland (Lamone v Benisek). Plaintiffs in these cases are seeking relief and a standard to rein in state legislative attempts to maximize partisan advantage through the manipulation of district boundaries.

Oral arguments laid out in unusually clear fashion why these difficult questions have left the Justices grappling for answers. Taken together, the arguments sketch a picture that reveals fundamental limitations on the fit between constitutional protections and our electoral institutions.

Why the obsession with proportional representation?

During both cases a lot of time was spent discussing whether the standard for a fair partisan map is proportional representation, or a close correspondence between the share of votes a party earns statewide and the share of legislative seats it wins. At one point in an exchange with Mr. Kimberly (representing Maryland plaintiffs) over the fairness of a 5/3 Democratic/Republican split, Justice Kavanaugh claimed “that shows…the overwhelming driver is proportional representation…Do you think the Constitution requires proportional representation or something close to proportional representation?”

After Mr. Kimberly responded that he did not see a textual indication in the Constitution, Justice Kavanaugh got more specific:

Kavanaugh: “Equal Protection Clause does not suggest to you something where political groups are treated roughly equally?”

Kimberly: “Your Honor, if that’s the way that you’re inclined to think about it, I’m certainly…

Kavanaugh: “No I’m just asking why…”

Kimberly: “Happy to have you rule that way.” (Court breaks out in laughter.)

Kavanaugh: “…challenging the maps but running away from proportional representation, even though…there’s a suggestion that really it all comes back to proportional representation in some respects.”

There is one respect in which equal protection, or political equality, does require proportional representation. Not as a matter of what the Constitution says, but as a matter of fact. That is, the only electoral system that gives exactly equal value to every voter’s party preference is pure proportional representation, where those preferences are reflected perfectly in the percentage of seats that each party wins.

So on the one hand, it is a fact that equal treatment under the law is truly maximized only under proportional electoral systems, where every vote contributes to seat shares. On the other hand, Congressional elections are currently prohibited by law from using proportional electoral systems, and as Justice Kavanaugh later reminds Mr. Kimberly: “Justice O’Connor and Justice Kennedy have made very clear in various opinions that the Constitution contains no such guarantee.” As currently interpreted, whatever the Constitution requires falls short of proportional representation.

A standard fit for a district?

The lower equality standard reflected in current constitutional interpretations is well understood among both election law experts and political scientists. Justice Breyer has noted the tension between single-seat districts and proportional representation. The most prominent metric of partisan advantage in political science, Gelman and King’s partisan asymmetry statistic, is a measure of vote dilution that accounts for the disproportionalities inherent in our single-seat, winner-take-all elections. Asymmetry measures the difference in seat shares that each party’s voters receive in a plan for the same statewide vote share, say 50%. A second component of the Gelman and King model, responsiveness, captures the inherent disproportionality that emerges when voters shift support from one party to another, typically resulting in a “winner’s bonus.” A system that is less responsive to shifts in support is evidence of a more durable gerrymander.

As responsiveness decreases (the slope of the seats-votes curve flattens), a party’s share of seats can withstand a greater loss of voter support. Lack of responsiveness is thus evidence of a durable partisan gerrymander.
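To make these two quantities concrete, here is a minimal sketch of how asymmetry and responsiveness can be estimated from district-level results. It is not the Gelman and King model itself: the district vote shares are invented, the helper names (seat_share, asymmetry, responsiveness) are mine, and a simple uniform swing stands in for their statistical machinery.

# A minimal sketch (hypothetical numbers, uniform swing in place of the full
# Gelman-King model) of the two quantities described above.

def seat_share(district_shares, statewide_share):
    """Party A's seat share at a given statewide vote share, estimated by
    shifting every district by the same uniform swing."""
    observed = sum(district_shares) / len(district_shares)
    swing = statewide_share - observed
    return sum(1 for v in district_shares if v + swing > 0.5) / len(district_shares)

def asymmetry(district_shares, at=0.5):
    """Difference between the seat share party A wins at vote share `at`
    and the seat share party B would win at that same vote share."""
    party_b = [1 - v for v in district_shares]
    return seat_share(district_shares, at) - seat_share(party_b, at)

def responsiveness(district_shares, at=0.5, eps=0.01):
    """Approximate slope of the seats-votes curve around vote share `at`;
    a flat slope means seats barely move when votes shift."""
    return (seat_share(district_shares, at + eps) -
            seat_share(district_shares, at - eps)) / (2 * eps)

# Hypothetical eight-district plan that packs party B into a single district.
plan = [0.55, 0.56, 0.54, 0.57, 0.55, 0.56, 0.53, 0.20]
print(asymmetry(plan))       # positive: party A wins more seats at a 50/50 vote
print(responsiveness(plan))  # near zero: seats barely respond to vote shifts

In this invented example, asymmetry is large and positive while responsiveness is essentially flat, the combination described above as evidence of a durable partisan advantage.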

And it is within this asymmetry framework that we find a test that parallels the three-prong approach established for deciding racial gerrymandering cases in Thornburg v Gingles. It is described in the Amicus Brief submitted in the North Carolina case (in favor of neither party) by professors Bernard Grofman and Keith Gaddie. Their proposed test requires that plaintiffs first demonstrate that the opposition party has been deprived of partisan advantage in at least one district in the enacted plan in the same manner as Gingles: targeted voters must be a large and compact enough group to create a majority district without diluting their advantage in other districts. Second, the opposition voters must exhibit polarized partisan voting. Voters who regularly shift support between parties would not provide an advantage to either.

Third, plaintiffs must demonstrate vote dilution at the district level, through either a district where the opposition party regularly loses, or in a majority district where voters could be allocated more efficiently across districts. Finally, because changing parties is an option whereas changing race is not, the responsiveness statistic can be used to demonstrate the durability of a partisan gerrymander.

Justice Kagan expressed concern that the Maryland plan “flips the composition of the district from 47 percent Republicans and 36 percent Democrats to, instead, 45 percent Democrats and 34 percent Republicans, effectively ensuring that Republicans will never win this seat again…” and is excessive. That concern can be answered by including durability in the test.

Using responsiveness as a means of estimating durability is especially appealing because it is distinct from proportional outcome expectations. In fact, this final element of the test could require that very disproportional plans be upheld. Consider an extreme hypothetical in which the difference in party affiliation in Maryland’s eight districts is only one person per district. That is, each district is nearly perfectly split 50/50. In such a case, one person changing their vote changes control of the district, and if each pivotal voter votes Democratic, Democrats win all eight seats, the least proportional outcome possible. Yet responsiveness is maximized: it takes only eight voters, one per district, to flip every seat, demonstrating that the plan is not a durable gerrymander.
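A toy calculation makes the hypothetical concrete. As in the sketch above, the numbers are invented, the helper (seats_at) is illustrative, and uniform swing stands in for a full model.

# Eight hypothetical knife-edge districts, each decided by a single voter.
def seats_at(district_shares, statewide):
    """Seats won by party A at a given statewide vote share (uniform swing)."""
    swing = statewide - sum(district_shares) / len(district_shares)
    return sum(1 for v in district_shares if v + swing > 0.5)

knife_edge = [0.5001] * 8
print(seats_at(knife_edge, 0.51))  # 8: a one-point swing flips every seat
print(seats_at(knife_edge, 0.49))  # 0: maximal responsiveness, so the
                                   # all-seats sweep is not a durable gerrymander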

Finding asymmetry demonstrates that one party’s voters are experiencing vote dilution at the state level. This partisan gerrymandering test goes further to identify where the dilution is occurring, how durable it is, and how a remedy would change an existing map. The constitutional logic is parallel to what has been long established and upheld in racial vote dilution cases.

Chief Justice Roberts pointed out that the extent of gerrymandering has changed. Justice Kavanaugh acknowledged that partisan gerrymandering is a problem for democracy. Could a majority of justices recognize that this court’s legitimacy depends, at least in part, on its ability to prevent partisan interests from dictating electoral outcomes at the expense of voters? They have a remedy. It won’t solve the problem of political inequality in all elections, but it is probably the best we can do with what we have. If what we have is worth saving, it is imperative that the Court act.


How the Chemical Industry Deployed the Disinformation Playbook on PFAS

The Senate Environment and Public Works Committee will convene tomorrow for a hearing on the federal responses (or lack thereof) to the risks associated with the class of toxic chemicals known as PFAS, inviting representatives from the Environmental Protection Agency (EPA), Department of Defense, Agency for Toxic Substances and Disease Registry (ATSDR) and National Institute of Environmental Health Sciences to testify. It has been encouraging to see Congress conducting oversight on the government’s failures to protect us from PFAS. While the federal government is responsible for its regulatory inaction once it learned of PFAS’ dangers, the companies who created, manufactured, and processed these chemicals and then dumped them unrestricted into our environment—fully understanding their persistence and even toxicity—should be required to answer and pay for their behavior.

Today, we added a case study to our Disinformation Playbook that explores how DuPont and 3M chose to bury unfavorable research linking PFAS to health issues, a decision that staved off regulatory scrutiny and allowed the companies to continue to profit while workers in their facilities, and the rest of us downstream, faced the consequences. This is only one of the strategies from the disinformation playbook that the makers of PFAS, and the chemical industry trade associations that they are part of, have employed to undermine public health.

The Fake

In our new case study, we illustrate how DuPont and 3M used “The Fake” by concealing health studies linking exposure to increased rates of tumors, liver damage, and birth defects:

In the 1960s, for example, DuPont researchers found PFOA could increase liver size in animals. According to the New York Times, other documents revealed that by the 1990s, the company knew that PFOA caused multiple types of cancerous tumors. The company did not share its knowledge with the public, regulators, or, for the most part, even its own workers, who faced elevated rates of cancer and the possibility of giving birth to children with birth defects, among other health effects.

DuPont was not the only company to engage in such corporate disinformation. In early 2018, the Minnesota Attorney General’s Office released documents showing that the chemical company 3M had also concealed and downplayed the dangers of PFAS for decades. 3M, which invented PFOA and used another variety of PFAS called PFOS in its popular product Scotchgard, had conducted scientific studies in the 1970s that showed the toxicity of the chemicals, but did not turn over any of its science to the Environmental Protection Agency for more than 20 years.

The Diversion

PFAS makers and users have also used “The Diversion,” a strategy to manufacture uncertainty about the science and deceive the public. As the body of evidence linking PFAS exposure to assorted health effects has grown, 3M, DuPont, and the chemical industry trade associations that they are affiliated with have continued to use disinformation to fight off regulations. The Centers for Disease Control and Prevention’s (CDC) Agency for Toxic Substances and Disease Registry (ATSDR) issued a toxicological profile on PFOA, PFOS, and a handful of other PFAS variants, concluding that safe exposure levels for PFAS are 7 to 10 times lower than the EPA’s current health advisory. In 2018, 3M helped form the Responsible Science Policy Coalition, an advocacy group that has cast doubt on the findings of the ATSDR report and other science showing the health effects of PFAS. This organization joins the ranks of scores of other front groups with innocuous-sounding names that promote disinformation and fight tooth and nail against regulation that would protect public health. My colleague, Michael Halpern, described some of their methods in a blog post last year:

In July, the Responsible Science Policy Coalition surfaced at a meeting of the Council of Western Attorneys General where they expressed being “eager to help your state with your issues.” In their presentation to the attorneys general, the RSPC argued that there are “lots of problems with existing PFAS studies” and that these studies “don’t show the strength of association needed to support causation.”

The RSPC also submitted a comment on the ATSDR draft toxicology assessment that extensively detailed why, in their view, ATSDR’s scientific approach was sub-par.

The Fix

The infiltration of state and federal governments by individuals pushing the chemical industry’s agenda has hamstrung further regulation of PFAS to protect public health, in a textbook use of “The Fix.” In West Virginia, several employees at the state’s Department of Environmental Protection ended up working for firms hired by DuPont. At the national level, starting in 2003, DuPont’s PFOA strategy for the EPA was led by former EPA deputy administrator Michael McCabe, and his successor at EPA also joined DuPont’s efforts after leaving the agency in 2003. DuPont had access to inside information at the agency and drafted quotes for EPA officials, a practice McCabe later said was “customary.”

In 2006, a draft report by the EPA’s Science Advisory Board found PFOA to be a likely human carcinogen. In response to the report, an internal DuPont email noted that “In our opinion, the only voice that can cut through the negative stories, is the voice of EPA. We need EPA…to quickly (like first thing tomorrow) say the following: Consumer products sold under the Teflon brand are safe.” A few weeks later, EPA issued such a statement. McCabe denied that EPA made the statement in exchange for DuPont’s phase-out of PFOA, while EPA has declined to comment.

Last year, the revolving door between industry and EPA’s Office of Chemical Safety and Pollution Prevention delayed the release of the aforementioned ATSDR report. In May 2018, documents we obtained revealed that the White House and EPA blocked a draft government study on PFAS after a Trump administration official warned the study’s release would lead to a “public relations nightmare.” The Trump administration has close ties with the chemical industry, and EPA employee Nancy Beck—one of the employees involved in the effort to bury the study—worked at the American Chemistry Council, a chemical industry trade association, before joining Trump’s EPA. The documents were released the week before an EPA conference on PFAS from which community members and journalists—but not industry employees—were shut out. In mid-June, after significant bipartisan Congressional pressure, ATSDR finally released its report.

As the disinformation drags on, local action is powerful

As the same players use the same tired old plays to divert attention away from dangerous chemicals and real solutions, the agencies that have promised to keep us safe from PFAS by figuring out how to regulate them and clean them up have been failing to do so. Just last month, EPA released its long-awaited PFAS “action” plan that was seriously lacking in any real action. In lieu of meaningful federal action, states like Vermont, New Jersey, and Minnesota have set enforceable drinking water standards stricter than EPA’s health advisory. Michigan’s Governor Gretchen Whitmer yesterday directed the Michigan Department of Environmental Quality to begin the process of establishing standards for PFAS in the state. States have taken the lead not only in setting enforceable drinking water and groundwater standards, and passing legislation that further regulates these chemicals, but also in holding companies accountable for poisoning our waterways and bloodstreams.

Take my home state of New Jersey, where the governor just this week ordered 3M, DuPont, DowDuPont, Chemours, and Solvay to assess and eventually clean up the extensive, and costly, PFAS-related pollution in the state. This comes after class action lawsuits compelled DuPont and 3M to pay West Virginia, Ohio, and Michigan residents for PFAS-related pollution. And as states work to hold companies accountable, grassroots organizations and community members are standing up to fight corporate power across the country.

As Congress continues to consider legislation related to this class of chemicals, it is essential that there are provisions holding companies responsible for polluting, and for spreading disinformation in order to keep on polluting. Companies should be held liable for all of the damage they have done in lives lost and harmed and natural resources destroyed. You can help by contacting your members of Congress to urge them to join the recently created PFAS task force (and scientists: you can use your expertise to encourage Congressional oversight here).

EPA Needs to Trust Its Own Scientists and Protect Us from Ethylene Oxide

Photo: Roy Luck/Flickr

Later this afternoon I will be providing comment to the EPA at a public hearing related to its proposed rule on facilities producing hydrochloric acid (HCl). In addition to HCl, many of these sites emit ethylene oxide, a flammable colorless gas that EPA’s Integrated Risk Information System (IRIS) determined was carcinogenic to humans back in 2016. According to the proposed rule, communities near these facilities experience a lifetime cancer risk of 600-in-1-million, six times the 100-in-1-million level the EPA considers acceptable. What is the agency doing to protect people from this risk? There is no regulatory action proposed in the rulemaking—instead, the agency is asking for comment on the use of the IRIS ethylene oxide risk value for “regulatory purposes,” calling into question the work of its own scientists in the IRIS program.

The IRIS program conducted a systematic review of toxicological and epidemiological evidence that took ten years to complete and included interagency review, input from the EPA Science Advisory Board, and public comment. It concluded that ethylene oxide is carcinogenic to humans, causing an increased risk of leukemia, lymphoma, and breast cancer in women. Recently released National Air Toxics Assessment (NATA) data incorporating the new IRIS risk value revealed that the probability of developing cancer from air pollutants exceeds the EPA’s acceptable level of risk in many communities, and 91 percent of that risk can be attributed to ethylene oxide, formaldehyde, or chloroprene. The threat of cancer from ethylene oxide is real and present in so many communities across the country. In places like St. Charles, Louisiana, right in the backyard of the largest ethylene oxide emitter in the United States, exposure to ethylene oxide is just one toxin in a chemical cocktail of industrial exposures that the community faces. Areas like St. Charles rely on the EPA to use its own rigorous assessments of the science to set health-protective limits. EPA has no time to waste complying with industry requests to question its own science. It must act with urgency to use its own science to protect all of the people whose lives are at risk due to ethylene oxide exposure.

Here’s my full comment:

Good afternoon, I would like to thank the EPA for the opportunity to provide this comment today. My name is Genna Reed. I am the lead science and policy analyst at the Center for Science and Democracy at the Union of Concerned Scientists. The Center for Science and Democracy at UCS advocates for improved transparency and integrity in our democratic institutions, especially those making science-based public policy decisions.

I am here today to urge the agency to cease consideration of the IRIS ethylene oxide cancer risk value in the proposed rulemaking for the hydrochloric acid (HCl) production source category under the National Emissions Standards for Hazardous Air Pollutants (NESHAP). Ethylene oxide is included in the rulemaking because HCl production facilities are often collocated with those that use and emit this chemical. The rule asks for comments on the use of the updated IRIS value for “regulatory purposes.” This is ill-advised. The agency itself knows best the history of the EPA IRIS assessment on the carcinogenicity of ethylene oxide, issued in 2016, which incorporated public comment opportunities, interagency review, and scientific peer review by EPA’s Science Advisory Board. The IRIS risk value is based on the best available science regarding health effects from this chemical. Agency policymakers evaluating regulations for the HCl production source category should not seek to disregard this established science, especially when the facilities addressed in this rulemaking are only one part of a serious problem. Questioning the use of the IRIS cancer risk value is outside the authority of the Office of Air Quality Planning and Standards program, and doing so within a source-focused rulemaking for an entirely separate chemical would set a dangerous precedent. Burying this request for comments in a rulemaking on HCl appears to be an attempt by the agency to limit community and expert input, while dismissing its own scientific experts within the agency.

The EPA IRIS program provides a critical scientific service to EPA and to the public, producing assessments that inform the decisions that protect us from hundreds of environmental contaminants. The IRIS program is housed in the National Center for Environmental Assessment within the Office of Research and Development and does important scientific work that is completely separate from the policymaking programs at EPA. Its placement is by design, in order to ensure independent and objective assessments of hazardous chemicals that pose serious risks to Americans. The output of this office is not just important for federal policymaking; IRIS assessments and associated toxicity values are used by state environmental and public health agencies, as well as community groups, to assess and address local risks to public health. This scientific expertise guides action that is essential to protect public health nationwide. It should be incorporated into and relied upon to set health-protective standards as EPA has done for years, rather than suddenly questioned in a rulemaking, for this or any other individual source of toxic air pollution.

Data on ethylene oxide released in the 2018 National Air Toxics Assessment (NATA) revealed that the chemical is significantly contributing to higher cancer risk in areas surrounding chemical manufacturers and sterilizers that use it across the country. Just last week, the EPA issued its findings from air monitoring outside of the Sterigenics facility in Willowbrook, Illinois that was shut down by the state, comparing emissions before and after the shutdown. The monitors showed levels 90 percent lower at the sites closest to Sterigenics, revealing the direct relationship between the facility’s operations and ethylene oxide levels. The systematic review conducted by IRIS evaluated the toxicological and epidemiological evidence available on the chemical and determined that it is carcinogenic to humans, leading to an increased risk of leukemia, lymphoma, and breast cancer in women. The EPA should be taking swift action to issue ethylene oxide emissions standards to protect the over 100 communities across the country found to have cancer risk levels above the acceptable level of 100 in 1 million, as the Clean Air Act directs. The last thing communities exposed to ethylene oxide need is for EPA to try to ignore the science that has identified the problem.

The chemical industry has attempted to undermine the work of the IRIS program time and time again, and there is now concern that the EPA itself is working to delay or halt IRIS assessments already underway, according to a recent GAO report. There is absolutely no good reason or time to question the agency’s own peer-reviewed science on ethylene oxide, which is robust and well-supported with substantial, independent evidence. In order for the EPA to meet its mission to protect human health and the environment, the EPA must rely on IRIS for its evaluations of the best available science and issue standards that best protect communities exposed to the highest emissions and associated health risks.

Thank you.


1,399 Endangered Species Latest Casualty as David Bernhardt’s Siege on Science Continues at Interior Department

The San Joaquin kit fox (Vulpes macrotis) is but one of the 1,399 species whose populations are at risk of being lost due to pesticide exposure. Photo: USFWS/Flickr

On Thursday, March 28, the Senate will hold a hearing to advance David Bernhardt’s nomination for Secretary of the Interior. This is not good news for the Department of the Interior, its federal scientists and their work, or the people, public lands, and endangered species that are directly affected by the agency’s decisions.

Over the past two years, Bernhardt has played a prominent role in sidelining science in policy decisions at the Department of the Interior (DOI), first as deputy Interior secretary and then as acting secretary following Ryan Zinke’s resignation in December. We documented these attacks on science in our December report, Science Under Siege at the Department of the Interior, and continue to monitor other anti-science activities that have taken place under Bernhardt’s watch.

The Senate should do its due diligence to ensure that Bernhardt is held accountable for these attacks on science, including at Thursday’s hearing. The American people deserve better than what Bernhardt has demonstrated so far as a DOI leader: a failure to address climate change, diminishment of public lands, silenced federal scientists, and further losses of threatened and endangered species across the country.

How egregious are Bernhardt’s attacks on science? Here are three that stand out, including one that just came to light.

Bernhardt puts more than 1,200 endangered species at risk

On March 26, the New York Times reported that Bernhardt suppressed a scientific report on risks to endangered species. Documents obtained via a Freedom of Information Act (FOIA) request provided evidence that Bernhardt’s decision to block the release of a report documenting the threats of three pesticides to endangered species was heavily influenced by the industries that produce those pesticides.

The Cape Sable seaside sparrow (Ammodramus maritimus mirabilis) sits peacefully among prairie grasses in this photo. The bird species is highly sensitive to indirect spread of pesticides such as chlorpyrifos – scientists say that the species population is in jeopardy. Photo: Brandon Trentler/Flickr

For the report, scientists at the DOI’s Fish and Wildlife Service (FWS) investigated the risks from three pesticides: chlorpyrifos, malathion, and diazinon. The scientists found that two of the pesticides, malathion and chlorpyrifos, were so toxic that they jeopardized the existence of more than 1,200 endangered species. Currently, 1,663 species are listed as threatened or endangered, meaning that these two pesticides alone could be a major culprit behind the decline of these species’ populations.

These results came to the attention of Bernhardt, who arranged several meetings with top officials at FWS. The report was never released; instead, a process was put in motion to change the method by which FWS, the Environmental Protection Agency (EPA), and the National Marine Fisheries Service (NMFS) assess chemical threats to endangered species. Specifically, scientists would no longer be able to consider the indirect—but very real—effects of pesticide exposure on endangered species (e.g., pesticide drift in air or water, contamination of food sources).

The chemical industry has long advocated that federal scientists not consider indirect impacts of pesticide exposure when determining threats to public and environmental health. However, FWS staff noted in their report that there are “Few limits on labels for when and where these pesticides can be used so exposure can be widespread,” and that “These pesticides have been found far from sites of application.”

Staff point to the kit fox as an example of an endangered species population that was nearly wiped out due to indirect impacts, because the fox’s food sources were contaminated by pesticides from farming practices in the San Joaquin Valley. If industry succeeds in changing the process by which risks are assessed and biological opinions are formed, farms will no longer be considered an indirect source of pesticide exposure.

If indirect impacts do cause widespread harm to endangered species, that harm would not be accounted for, and the US would be at risk of losing hundreds of endangered species.

The bald eagle was chosen June 20, 1782 as the emblem of the United States. Bipartisan support of the Endangered Species Act and the listing of the bald eagle as endangered under this legislation helped increase the emblematic species population to a healthy number. Source: Ross Eliott/Flickr

Bernhardt restricts use of science at Interior Department

In one of the most egregious attacks on science that UCS has documented to date, Bernhardt signed an order in September 2018 that immediately restricted the use of science at DOI. The order, known as the “Promoting Open Science” order, requires scientific data to be made publicly accessible. However, some data cannot be made accessible to the public for legal reasons because the release of such data can endanger individuals, rare and threatened species, and culturally or religiously important sites.

The order also requires data used in policy decisions be reproducible. This in effect can exclude important contributions from older studies where raw data is inaccessible. Therefore, the number of scientific studies that can be used to inform policy decisions at the DOI will inevitably go down. Future scientific studies by DOI agencies are likely to be restricted in their scope and methodology, and the order may deter outside scientists from working with DOI agencies, out of fear that confidential information could be released.

This order will push science to the back seat in DOI policies, increase the opportunities for outside influence on agency decisions, and result in policies that are not informed by the best available science, threatening the health and safety of the public and our environment.

This beautiful vernal pool is filled with the yellow wildflower commonly known as Contra Costa Goldfields (Lasthenia conjugens). The endangered wildflower is at risk of being lost, likely because the pollinating insects it depends on are being killed by toxic pesticides. Photo: Kevin Bertolero/Flickr

Bernhardt silences climate change science

On December 22, 2017, Bernhardt quietly issued Secretarial Order 3360, which rescinded multiple policies on climate change and conservation. These changes undercut the Department of the Interior’s ability to fulfill its mission of conserving and managing the nation’s natural resources.

The order revokes a Departmental Manual chapter on climate change, a Departmental Manual chapter on landscape-scale mitigation policy, a Bureau of Land Management (BLM) manual section on mitigation, and a 2016 BLM handbook on mitigation. The order additionally directs the head of BLM to reassess the BLM Draft Regional Mitigation Strategy for the National Petroleum Reserve in Alaska and directs BLM to reissue a separate Bush-era guidance on offsite mitigation.

The now-rescinded policies had directed the DOI to take the threat of climate change and its foreseeable effects into account when making decisions. For example, the Departmental Manual chapter on climate change directed that the DOI “will use the best available science to increase understanding of climate change impacts, inform decisionmaking, and coordinate an appropriate response to impacts on land, water, wildlife, cultural and tribal resources, and other assets.”

Climate change is already having impacts across the US. Ignoring and failing to plan for its current and future effects puts America’s public lands, wildlife, and people at unnecessary risk.

Bernhardt is not qualified to run a science-based agency

The Department of the Interior should rely on science to make the best decisions for America’s parks, wildlife, and people. Time and time again, we have seen Bernhardt sideline science in critical DOI decisions. (He is also riddled with extensive conflicts of interest). There is no doubt that he would continue his siege on science if he is confirmed as Interior secretary.

Bernhardt is clearly unqualified to lead a science-based agency that makes decisions that affect so much of the country. Let’s hope the Senate agrees.

Taking Science Out of Air Pollution Protections

This post originally appeared on Scientific American

You and I enjoy cleaner air thanks to air pollution standards based on science. But now that could change. Last week, science advisors to the US Environmental Protection Agency (EPA) drafted a letter criticizing the agency’s use of science to set ambient air pollutant standards. This is the latest development in the EPA’s process to update the health protective standards for particulate matter and ozone—the two air pollutants most responsible for early death and sickness in this country.

The letter, along with previous actions taken by EPA leadership and the agency’s clean air science advisors, raises alarm bells for anyone who understands the EPA’s careful consideration of science in air pollution policy. These developments risk unraveling the methodical process that, for decades, has effectively ensured we have science-based air pollution standards and steady reductions in air pollution—and many of these actions aren’t making headlines. Here’s a rundown of recent changes to ambient air pollution standard updates and why all of this matters.

EPA leaders and CASAC are breaking with long-standing policy and practice that ensures science informs air pollution decision making.

We must first look at how the process is supposed to work. National ambient air pollution standards set by EPA must be based on science and at a level that “protects public health with an adequate margin of safety.” This means the EPA must consider only the science that answers the question of what protects all people, including sensitive populations, such as the elderly, children, and those with lung and heart diseases.

In assessing the science and the standards, the EPA relies on expert advice from the seven-member Clean Air Scientific Advisory Committee, which has always been supplemented by a panel of additional experts on the particular pollutants under review. These experts publicly debate and review the agency’s science assessment, that is, the document that summarizes all the relevant science we have on the relationship between an air pollutant and its health and welfare impacts. This crucial document informs how the EPA thinks about the risks posed by setting the standard at different levels and what the impact of such a policy will be more broadly. (In EPA speak, the Integrated Science Assessment informs the Risk and Exposure Assessment and the Policy Assessment.) CASAC, with the help of the review panel, will then make an official recommendation for what level of air pollution will protect public health with an adequate margin of safety.

Together these three documents, and CASAC’s recommendation, go to the EPA administrator who will ultimately set the standard. Thus, CASAC and the scientific assessment are essential because they inform this entire process and ensure that the EPA is basing policies on the best available science. This is why changes to the process matter.

The Trump Administration is cutting science out of air pollution policy.

These challenges didn’t start with this month’s letter. Back in October, the EPA nixed the particulate matter review panel and failed to convene an ozone panel. As explained above, these panels of experts are meant to supplement the expertise and perspectives of CASAC and have played a crucial role in informing air pollution standards since the first CASAC reviews in 1978. Also last October, EPA leadership announced an aggressive timeline that would complete reviews of both particulate matter and ozone by 2020. This is the equivalent of lightning speed in the science policy world, and meeting such a timeline will mean fewer public meetings and fewer opportunities for the public and experts to provide input. A faster timeline, combined with the lack of a pollutant review panel to inform the standard, means that far less science can inform the air pollution standard than has in the past.

Before all this, a memo from then-administrator Scott Pruitt laid the groundwork for cutting science from air pollution standard updates. We are now seeing those recommendations come to fruition.

EPA doesn’t have the science advice it needs.

Without a review panel, science advice is left to the seven-member CASAC. To be clear, no seven people, even if top experts, would have the breadth and depth of expertise needed to fully review the EPA’s science assessment. The beast of a document draws from diverse fields, from epidemiology to toxicology to clinical medicine to ecology. And this isn’t just my opinion—members of the current CASAC themselves said in the draft letter that they don’t have the expertise needed to conduct the review for particulate pollution.

On top of this hamstringing of the agency’s science advice, EPA has already shaken up its advisory committees. The agency kicked anyone who had a current EPA grant off its advisory committees, paradoxically claiming that this represented a conflict of interest while working directly for regulated industries did not. Following this new policy, the agency removed several independent experts from CASAC, replaced them with people from state and local regulatory agencies, and chose a chairperson with fringe scientific views who consults for the American Petroleum Institute. The resulting committee is missing representatives from key scientific disciplines such as epidemiology. It is difficult to argue that this new CASAC represents the top independent experts the scientific community has to offer.

Members of a flawed committee recommend upending EPA’s approach to science.

The letter from CASAC chair Louis Anthony (Tony) Cox Jr. released this month essentially trashes the EPA science assessment, inexplicably calling the lengthy, exhaustively referenced document “unverifiable opinion” and claiming that it fails to follow the scientific method. Dr. Cox calls for a brand new approach, asking the agency to throw away the long-used and scientifically backed weight-of-the-evidence framework for determining the health effects of air pollution.

It is hard to overstate just how jarring this is. With little explanation or scientific support, the CASAC chair is suggesting overturning a framework that has been supported by 11 past CASAC committees and 138 top scientists. Alarmingly, the committee could not come to consensus on a long-held scientific understanding: that fine particulate matter is linked to early death. Many thousands of studies in the past several decades have built evidence demonstrating this link. The fact that CASAC now appears to be renegotiating this long-held mainstream scientific understanding is breathtakingly backwards.

On March 28, CASAC will hold its next meeting to discuss the draft letter and the committee’s final recommendations on how EPA should finalize its science assessment. At this point, it is unclear what the committee will collectively decide, and whether they will have any consensus comments for the EPA. What is clear is that the EPA’s process for updating air pollution standards is changing in ways that threaten the agency’s very use of science to protect the public from air pollution. Regardless of what happens on March 28, and in this review process for particulate matter, we are witnessing this administration chip away at the foundation for air pollution protections in this country. And that puts us all at risk.
