UCS Blog - The Equation

A Tribute to Kenneth J. Arrow, Nobel Prize Winner and a Giant Among Economic Thinkers

Earlier this week, Kenneth Arrow, a Nobel Prize-winning economist, passed away at the age of 95. Dr. Arrow was a prolific thinker, truly a giant among economists. His research spanned areas as diverse as welfare theory, innovation, labor market networks, public health, and risk. Many extensive obituaries of Dr. Arrow have been written in major newspapers. He was also a great champion of ideas and values we hold dear at the Union of Concerned Scientists.

Credit: Linda A. Cicero / Stanford News Service

The incredible breadth of Ken Arrow’s work

As students of economics will attest, it is difficult to find a field of economics that hasn’t been influenced in some way by Dr. Arrow’s thinking. As a graduate student, I recall being introduced to ‘Arrow’s Impossibility Theorem’ and discussing its real-world implications for voting and social choice. That’s also when I first read about his work on learning curves and its bearing on technological progress. In a seminal paper titled The Economic Implications of Learning by Doing, he wrote that:

“Learning is the product of experience.” And “The role of experience in increasing productivity has not gone unobserved, though the relation has yet to be absorbed into the main corpus of economic theory.”

Today these insights can help explain some of the extraordinary decline we’ve seen in the costs of renewable energy.
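The intuition is often captured empirically by the so-called learning curve (sometimes called Wright’s law). The formulation below is a standard textbook illustration, not an equation from Arrow’s paper:

```latex
% Unit cost C as a function of cumulative production x:
%   C_0 is the cost of the first unit, b the learning exponent.
\[
C(x) = C_0 \, x^{-b},
\qquad
\frac{C(2x)}{C(x)} = 2^{-b},
\qquad
\text{learning rate} \; LR = 1 - 2^{-b}.
\]
```

Each doubling of cumulative production cuts unit cost by the same fraction. For example, if costs fall roughly 20 percent with each doubling of cumulative installed capacity (a figure often cited for solar photovoltaic modules), then b ≈ 0.32.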

Right until the end, he was engaged in cutting-edge work, including coauthored research on the risks of climate change, the social cost of carbon, appropriate discount rates for decisions with long time horizons, and public health.

In a recent paper on the social cost of carbon, he and his coauthors argued that, “Costs of carbon emissions are being underestimated, but current estimates are still valuable for setting mitigation policy.” (An insight which is highly relevant for an upcoming hearing on the social cost of carbon in the House Committee on Science, Space and Technology!)

To get a fuller sense of the amazing breadth of Ken Arrow’s work, take a look at these resources:

Engagement with the Union of Concerned Scientists

Dr. Arrow shared common interests with UCS, especially in recognizing the threat of climate change and the need for swift, cost-effective solutions. He also had a lifelong commitment to issues related to peace and security.

As far back as 1997, he was a signatory to the World Scientists’ Call for Action at the Kyoto Climate Summit, a statement initiated by UCS. The statement included this exhortation to world leaders, one that resonates poignantly even today:

We, the signers of this declaration, urge all government leaders to demonstrate a new commitment to protecting the global environment for future generations. The important first step is to join in completing a strong and meaningful Climate Treaty at Kyoto. We encourage scientists and citizens around the world to hold their leaders accountable for addressing the global warming threat. Leaders must take this first step to protect future generations from dire prospects that would result from failure to meet our responsibilities toward them.

Closer to home, he signed on to a 2015 UCS-sponsored letter to California legislators urging the adoption of strong climate and clean energy policies to help ensure a reduction in the state’s global warming emissions of 80 percent below 1990 levels.

In December 2016, Dr. Arrow was one of over 5,500 scientists who signed An Open Letter to President-Elect Trump and the 115th Congress, calling on them to ensure that science continues to play a strong role in protecting public health and well-being.

I had the honor of meeting him briefly when he attended a UCS reception at the American Economic Association meetings in January 2009. He was gracious and encouraging of our work to engage more economists in designing and advocating for solutions to climate change.

A social scientist of the highest order

Dr. Arrow took the charge of a social scientist with great seriousness. He received the highest honors in the economics profession, including the Nobel Prize and the National Medal of Science, and was a widely published academic researcher. But he was no ivory tower academic.

Ken Arrow engaged widely and deeply with the real and urgent problems of the day. What’s also striking is the strong global perspective he brought to his work. He was a lead author for the Intergovernmental Panel on Climate Change (IPCC) Second Assessment Report in 1995. He was also a founding trustee of Economists for Peace and Security, an organization of economists, other social scientists, and citizens concerned about issues of peace, conflict, war, and the world economy.

In an email tribute, Geoffrey Heal, UCS Board member and the Donald C. Waite Professor of Social Enterprise at Columbia Business School, wrote this:

Ken Arrow dominated the social sciences for half a century, just as in their eras Newton and Einstein dominated the physical sciences. In addition to being the giant on whose shoulders we all stand, Ken was a charming, modest, friendly and unassuming person. I knew him for over half a century, and will miss his friendship, his encouragement to think differently and his unparalleled intellectual insights.

Congress Does Industry’s Bidding by Cutting Public Safeguards

The past month has not been kind to environmental and public health protections. A bevy of science-based rules are now on the chopping block thanks to the congressional sleight-of-hand called the Congressional Review Act (CRA), which allows a simple majority in Congress to undo provisions issued within the final six months of the previous administration.

Right now, industries clearly feel empowered to try to roll back public safeguards in the name of profits. But the American public will pay a steep price from the erosion of these protections and science is being sidelined in the process.

Promoting water pollution

For a case in point, look no further than the recent overturning of the Stream Protection Rule issued by the Department of the Interior’s Office of Surface Mining Reclamation and Enforcement (OSMRE). This science-based rule was designed to protect streams in the United States, including headwater streams, from the often devastating impacts of pollution from mining waste and debris.

All told, this rule would have improved the quality of some 263 miles of streams downstream of mines each year, the benefits of which would have been felt by nearby communities. But a CRA measure to overturn it recently passed in both the House and Senate and was signed on February 16 by President Trump.

The Stream Protection Rule was a commonsense safeguard based upon clear evidence that, once a waterway’s flow has been disturbed, it is very difficult to restore it to its original condition. This is bad news for all of the native plant and animal species in the ecosystem and for communities and farmers downstream who will reap fewer of the natural benefits that riparian buffers provide, like reduced flooding, filtration, and increased groundwater recharge.

Additionally, one of the mining techniques—known as longwall mining—extracts coal underground, which often causes sinkholes that can damage structures aboveground. This activity has led to all sorts of calamities, including a cracked dam, homes and businesses with damaged foundations, damage to the groundwater sources property owners rely upon, and even the disappearance of an entire lake once used for boating and fishing in a Pennsylvania state park.

Representatives Bill Johnson of Ohio and Evan Jenkins and David McKinley of West Virginia were among the sponsors of the legislation to revoke the Stream Protection Rule. Between them, they have taken more than $1 million in political contributions from the mining industry, which will no doubt benefit mightily from the removal of this commonsense check on its operations. The talking points of the National Mining Association and Murray Energy Corporation, America’s largest underground coal mining company, are also echoed in the representatives’ misinformed statements about the rule.

By overturning this protection, the bill’s sponsors are ensuring that residents across America will continue to see their water sources, their homes, and their environment degraded. And the message that their elected officials are sending is loud and clear: profits over people.

Sadly, several other efforts to rescind commonsense protections are also now underway. Here are some we’re watching closely:

Communities like Galena Park in east Houston need stronger health and environmental policies to protect residents from toxic air pollution and potential chemical releases from nearby chemical facilities. Instead, Oklahoma Rep. Markwayne Mullin is using the CRA to weaken those protections for the benefit of industry.

Undermining chemical safety

Oklahoma Representative Markwayne Mullin, who has received more than $410,000 from the oil and gas industry during just two terms in office, has introduced legislation to remove a rule issued by the EPA last year designed to improve safety at facilities that use or store large amounts of dangerous chemicals and to further protect first responders and fenceline communities. Major industrial facilities, including oil and gas companies, have been vocal in their opposition to this rule, and Mullin has become the elected mouthpiece of those entities.

The updated EPA Risk Management Plan (RMP) rule is a commonsense, science-based provision designed to regulate industrial facilities all across America that release toxic chemicals. On average in recent years, approximately 150 catastrophic accidents have occurred annually at these facilities, posing often-grave risks to the workers and to the neighboring communities.

Significantly greater percentages of African Americans, Latinos, and people in poverty live near these facilities, putting them at higher risk of exposure to chemical releases. As noted in the 2016 report written by UCS and Texas Environmental Justice Advocacy Services (t.e.j.a.s), Double Jeopardy in Houston: Acute and Chronic Chemical Exposures Pose Disproportionate Risks for Marginalized Communities, residents in Houston communities with RMP facilities have a higher risk of developing or worsening lung diseases such as asthma and chronic bronchitis due to exposure to high concentrations of toxic air pollutants, including harmful chromium compounds.

Improvements to the RMP rule would have helped to make facilities safer for surrounding communities, reduced the number of catastrophes, and ensured that first responders were fully informed and protected. While the original rule will remain in place even if the amendments are rolled back using the CRA, the status quo has not been enough to fully protect Americans from toxic chemical exposures, and people of color and people in poverty will continue to bear the brunt of health impacts from future accidents and spills at these facilities.

The CRA bill that would nullify the EPA’s rule is still pending a vote in the House, and industry representatives are lobbying hard for its passage.

Fostering air pollution and hastening climate change

The Bureau of Land Management’s (BLM’s) Methane and Waste Prevention Rule was issued to update the Carter-era regulation that governs how oil and gas are extracted on federal lands. The update would have reduced some of the most dangerous impacts of natural gas extraction by fracking, including leaks, venting, and flaring, which was especially timely given that reductions in methane pollution are needed to get us on track to meet emissions standards.

Two of the sponsors of the legislation that would eliminate this rule are Utah’s Rob Bishop and Wyoming’s John Barrasso, who have received over $1 million in campaign contributions from the oil and gas industry over the course of their political careers. This effort is only the newest in a succession of attacks on this rule since it was first proposed. Recently released emails from the Oklahoma Office of the Attorney General (yes, that of our newly confirmed EPA administrator, Scott Pruitt) reveal 2013 correspondence with a fracking company, Devon Energy Corporation, discussing how they both planned to meet with the Office of Management and Budget (OMB) to convince them to “completely do away with the present thrust” of the BLM’s methane waste rule.

Pulling the methane rule will result in the continued release of methane pollution, which perhaps not surprisingly occurs at the highest levels on tribal lands in Rob Bishop’s state of Utah—and John Barrasso’s state of Wyoming has one of the highest methane emission levels on federal lands, according to this 2015 report. Remember when a 2,500-square-mile cloud of methane floated over the borders of Utah, Colorado, New Mexico, and Arizona in 2014? We can expect more of that phenomenon to occur without these rules.

And what does that mean for the livelihoods of Wyoming and Utah residents and all Americans? Well, increases in methane pollution can lead to increased ground-level ozone levels as well as other hazardous air pollutants like benzene, formaldehyde, and hydrogen sulfide, which can trigger asthma and even cancer. Not only will air quality continue to get worse in areas of highest methane emissions, but we will all experience the impacts associated with a changing climate thanks to the excessive and irresponsible release of this extremely potent greenhouse gas.

This CRA bill still requires passage in the Senate to move onto the President’s desk.

Vitally needed: checks and balances

Along with rollbacks to regulations allowing polluters to freely pollute, industry is referencing another page from its playbook to chip away at transparency measures designed to keep companies accountable for their business dealings. The President signed a CRA resolution last week to roll back a Securities and Exchange Commission rule requiring that oil and gas companies disclose payments to foreign governments. Ohio Senator Sherrod Brown remarked, “This kind of transparency is essential to combating waste, fraud, corruption and mismanagement.” Secretary of State and former ExxonMobil CEO Rex Tillerson was a vocal critic of this rule before his confirmation and can check that off of his to-do list now that he’s a cabinet member.

The first few weeks of Trump’s presidency have affirmed that money talks, and that power can be bought and used to further maximize profits. We have seen a corporate takeover of our government, as several individuals with strong industry ties were nominated and confirmed for key agency leadership positions; the President has issued legally questionable executive orders, one of which requires that agencies repeal two regulations for every one they issue (with the intent of freezing regulation and allowing industry to get away with business as usual); and Congress is hastily working to nullify a slew of Obama-era regulations that would prevent industry misconduct, using the Congressional Review Act in an unprecedented fashion.

While industry is ready to profit from CRA rollbacks both here and abroad, a regulatory freeze, and industry-friendly cabinet appointments, everyday Americans will be missing out on unrealized health and safety benefits. And members of Congress who are using techniques like the CRA to undermine public health and safety are making a grave mistake that will surely catch up with them as Americans come to see the effects of these misguided rollbacks.

Our democracy is built on a foundation of checks and balances. Among these are the need for governmental protections to place a vital check on industry in order to keep our water, food, and environment healthy and our workplaces and our children safe. Unfortunately, in the current political environment, industry is influencing decision makers with political contributions, and effectively making many members of Congress beholden to them. The result: elected officials spouting off specious industry talking points and designing policies that leave industry excesses unchecked. When industry interference prevents government from making decisions based on science, it undermines our democracy and the public suffers.

We will continue to work to ensure that agencies have the freedom to fulfill their critical missions, and we will be ready to hold industry accountable for noncompliance or misdeeds.

 

Photo: Jack Pearce/CC BY-SA 2.0, Flickr Photo: Yvette Arellano/TEJAS Photo: Tim Evanson/CC BY-SA 2.0, Flickr

Overpopulation, and a Movie that Definitely Won’t Get the Oscar

As Oscar Night approaches, I’ve gotten to thinking about the movies I saw last year—not just the good ones, but a bad one too. It’s Inferno, which seemed to have everything going for it, but has sunk into cinematic oblivion with scarcely a trace. Why?

Before I saw it last fall, I thought it had all the elements that would make lots of Americans like it—including me. It stars Tom Hanks, the actor who would definitely be America’s Sweetheart if he weren’t so old and so male. It’s directed by Ron Howard, one of Hollywood’s most respected directors. Its title and underlying theme come from Dante’s description of Hell—seven centuries old but still unsurpassed.

Dante—a portrait by Andrea del Castagno, in the Uffizi Gallery, Florence. Source: Web_Gallery_of_Art, Wikimedia.org

And it’s based on the best-selling novel by best-selling author Dan Brown of The Da Vinci Code fame. Put those four together, and how could we fail to like it? For that matter, how could I fail to like it? (OK, I’m not a Dan Brown fan, but Hanks, Howard, and Dante are all favorites of mine, so three out of four…)

Well, even with all that going for it, there’s no way it’ll be mentioned Sunday night. In fact I suspect that Tom Hanks and Ron Howard would just as soon we forget they ever were associated with it. (Not sure how Dan Brown or Dante are feeling.) The critics’ consensus, as summarized by the Rotten Tomatoes website, is “Senselessly frantic and altogether shallow, Inferno sends the Robert Langdon trilogy spiraling to a convoluted new low.” Ouch! And even more painful, Hollywood-wise, it made only $34 million at the box office. In other words, a total flop.

How could it fail so badly? I thought briefly that it might have to do with the plot and the villain. (Spoiler ahead, although frankly it’s so far past its sell-by date that this can’t make it worse.) Inferno’s evil genius turns out to be a millionaire who thinks the world’s fundamental problem is … overpopulation. Through TED-like talks he builds up a cult of Malthusian followers who conspire with him to kill off half the world’s people for the sake of preserving nature.

So, was that the problem? Was seeing a twisted kind of environmentalist as the epitome of Evil just too much for American audiences to take? Is our fear of population growth so strong that we refuse to accept any negative portrayal of that fear? Just too much cognitive dissonance?

Nahhh…..I don’t think so. It’s easy for us intellectuals to overthink pop culture, and in this case I think there’s a simpler explanation. It’s just a bad movie. And as Dante’s contemporary William of Ockham taught us, there’s no need to come up with a complicated explanation when a simple one will do just fine.

So, on Sunday night I won’t be regretting the fact that Inferno’s not in the running for Best Picture. Personally I’m rooting for Hidden Figures. It had me right from the opening scene in which a young African-American girl is walking down a lane counting “….eight, nine, ten, prime, twelve, prime, fourteen, fifteen, sixteen, prime…”

Just the nerd version of sentimentality? Sure, I admit it. But it’s also a great movie. And nowadays science can use all the help it can get from pop culture, so I’m really hoping it wins.

Marginalizing Transgender Students Weakens Science and Diminishes America

Yesterday, the Trump administration turned back the clock on civil rights by giving schools more rights to discriminate against and bully transgender kids, some of the most vulnerable people in our society. The New York Times reports that the withdrawal of protections for transgender students comes at the behest of Attorney General Jeff Sessions over the objections of Education Secretary Betsy DeVos.

A student celebrates after the Supreme Court struck down the so-called Defense of Marriage Act in 2013. Photo: Michael Halpern

The move comes amid recent research demonstrating that suicide among lesbian, gay, bisexual, and transgender youth decreased in the wake of state court decisions that formalized marriage rights for all Americans. It makes intuitive sense: actions that give an individual the opportunity to live a full life make it more likely that the individual will stay invested in that life. Legitimacy matters.

Science, like any creative endeavor, works best when people of different backgrounds are at the table. But LGBT people still face significant barriers to participation in the scientific enterprise. A recent American Physical Society report found that thirty percent of transgender scientists “characterized the overall climate of their department or division as ‘uncomfortable’ or ‘very uncomfortable.'”

The action hurts and marginalizes transgender kids. It also undermines the promise of our public education system, which should welcome, not exclude, and give everyone an opportunity to learn and thrive. We all suffer when kids are prevented from reaching their full potential because they feel unsafe.

The most memorable part of the attorney general’s confirmation process involved the silencing of Elizabeth Warren, who was attempting to read a letter Coretta Scott King wrote in 1984 when Sessions was up for a federal judgeship. “It is only when the poor and disadvantaged are empowered that they are able to participate actively in solutions to their own problems,” King wrote, in reference to her concerns about Sessions’ willingness to defend the voting rights of black Americans.

I fear that this action is the first of many at the Department of Justice with the potential to weaken science and diminish America. The guidance makes some of the most bullied kids in America less safe.

The United States does not have a good history of leaving the protection of civil rights to the states. But for now, it is up to state and local governments and school boards to guarantee the ability of all students to pursue an education so that our country can continue to benefit from the contributions of all. Please take a moment today to weigh in with your local education officials and let them know that you want them to secure basic protections for all students so that every kid has the chance to thrive.

Governor Dayton Must Step Up to Protect Energy Consumers

Two bills are making their way through the Minnesota legislature that would hack away at the Minnesota Public Utilities Commission’s (MPUC) authority to protect consumers. Given the unnerving level of bipartisan support these bills are receiving in the legislature, it’s time for Governor Dayton to step up and protect consumers, as well as Minnesota’s long legacy of clean energy achievements.

Setting a dangerous precedent in Minnesota’s oversight of utility monopolies

Perhaps the most dangerous bill in the bunch (HF113/SF85) would legislatively approve Xcel’s proposed natural gas plant to replace two retiring units at the Sherco coal-fired power plant in Becker, Minnesota. The bill would strip the MPUC of its traditional role of reviewing such plans to ensure investments are in the best interest of consumers.

While Xcel included the proposed natural gas plant in its latest integrated resource plan, the MPUC declined to approve it, expressing concern that other alternatives might be more beneficial to ratepayers over the long term. Significant doubt remains whether the investment makes sense, but that debate will be silenced if this bill becomes law. The House voted to pass the bill on February 9, and last week the Senate voted to pass a similar bill as well.

The House will now take up the bill to reach a compromise. Governor Dayton has signaled his support for the bill despite the risks to ratepayers, but he should reconsider.

Despite the Governor’s good intentions to help protect the local Becker economy, this bill sets a dangerous precedent for future utility investment decisions. What happens next time the MPUC declines to approve a proposed billion-dollar (or more) investment by Xcel? Does Xcel come back to the legislature for another blank check? Protecting ratepayers from paying for bad investments is a core function of the MPUC. If this bill becomes law, Minnesota ratepayers face an uncertain and potentially costly energy future.

Closing the door on rural ratepayers

Another legislative proposal, HF234/SF141, would remove the MPUC’s authority to resolve disputes between Minnesota’s rural electricity cooperative utilities and their members. The weak rationale for this proposal suggests that somehow cooperative members don’t need this dispute resolution venue because they have local control over their utilities.

In reality, disputes do occur between electric cooperatives and their members, and without an objective arbitrator to resolve them, the co-op holds all the cards. This bill is particularly directed at disputes that have arisen over the exorbitant fees that cooperatives are charging members to connect solar PV systems to the grid.

These fees are an attempt by cooperative managers to maintain the status quo and only serve to slow Minnesota’s transition to cleaner, lower-risk energy sources. Maintaining the MPUC’s role as arbitrator of these disputes provides protections for cooperative ratepayers as well as Minnesota as a whole.

Commission’s role

The MPUC’s role is to protect and promote the public’s best interest in safe, adequate, and reliable utility service at fair and reasonable rates. It does this by providing much-needed independent and comprehensive oversight and regulation of utilities. Unfortunately, these bills seek to erode the MPUC’s mission and authority.

Governor Dayton can’t have it both ways. He must stand by his word not to accept any bill that limits or weakens the Commission’s authority to protect the interests of Minnesota’s energy consumers.

Creative Commons/Mulad (Flickr)

Why You Can’t Buy a Tesla in Connecticut (and 5 Other States)

The state of Connecticut is a progressive state, with a strong track record of support for laws and policies that will reduce global warming emissions and a goal of putting over 150,000 electric vehicles on the road by 2025.

Given the policy commitments of the state of Connecticut, one might assume that Connecticut would be a place that would welcome an innovative, important business like Tesla, the largest manufacturer of electric vehicles in the United States. And given the significant fiscal challenges that Connecticut faces, one might think that Connecticut would be excited to see Tesla operate new stores within the state, bringing jobs and tax revenue.

But in fact, Tesla is legally prohibited from operating its Tesla stores in Connecticut.

Under Connecticut’s dealer franchise law, as under the laws of many states throughout the country, automobiles may only be purchased through independent car dealerships. Tesla’s cars are sold directly by the manufacturer, which means that Tesla stores are not welcome in Connecticut.

The problems that Tesla has faced with automotive dealers and state dealer franchise laws represent a combination of unintended consequences, special interest influence, and the challenges of developing new technologies in marketplaces dominated by entrenched interests and outdated laws. The Tesla wars are also a part of a broader story of how changes in technology are impacting laws and regulations governing transportation in the United States.

In this blog post, I want to explore some of the key questions raised by the battle over Tesla. In particular:

  • Why do we have dealer franchise laws?
  • Why doesn’t Tesla sell their cars through franchised dealers?
  • Why do some states allow Tesla stores and others do not? (Hint: it depends on the meaning of ‘its’).

In part 2 of this post, I will look at some of the policy arguments that have been made by auto dealers, by Tesla and by economists on dealer franchise laws.

  • What is the justification for laws banning Tesla stores?
  • What does the evidence suggest about dealer franchise laws?
  • What are the consequences of Connecticut’s ban on Tesla stores?

Why do we have dealer franchise laws?

The car dealership model as we know it today arose in the 1920s and 1930s, as first General Motors, and then eventually all of the “Big Three” American automakers chose to license the rights to sell their cars to independent dealers, rather than selling the cars directly to consumers.

The independent dealership model worked because it allowed both parties to focus on core competencies: the manufacturers could focus on making the best cars possible, while independent dealers made the inroads into local communities that allowed them to most efficiently sell the cars directly to consumers.

From the beginning, one challenge in the independent dealership model was the obvious power imbalance between the “Big Three” automakers, who dominated automobile manufacturing, and the thousands of independent dealerships that were licensed to sell their vehicles. Stories abounded of auto manufacturers exploiting their superior market position to gain unfair advantages over independent dealers. For example, manufacturers could force independent dealers to purchase cars that they didn’t want as a condition of maintaining their relationship, terminate the franchise relationship at will without cause, or coerce profitable dealerships into selling their business at below-market rates.

Beginning in the 1930s and accelerating greatly in the 1950s, legislatures in all 50 states passed a series of laws, known collectively as dealer franchise laws, which were intended to protect independent dealers from abusive practices at the hands of vehicle manufacturers. Among other things, these laws prohibited the Big Three from owning licensed dealerships themselves, or selling cars directly to consumers.

The prohibition on direct manufacturer sales was intended to protect independent auto dealers from unfair competition from their own manufacturers. The classic concern addressed by the ban on direct sales from manufacturers is the independent car dealer who spends money, time and effort building a market for, say, Ford vehicles in a certain town, only to have Ford Motor company jump in and open up a rival direct from manufacturer store that undercuts the independent dealer on price and takes his market share.

By the 1950s when most of these laws were passed, the independent dealer model was so entrenched in the American car market that it was simply presumed that all auto manufacturers would have independent dealerships selling their cars, and that any direct manufacturer sales would necessarily be in competition with an independent dealership. Dealer franchise laws therefore did not contemplate the challenge posed by a company like Tesla, a company that refuses to sell its cars to independent dealerships at all and instead insists that all sales must be direct from the manufacturer itself.

Why doesn’t Tesla distribute through franchised dealers?

Tesla has adopted this policy because they believe that the traditional independent dealership model does not work for electric vehicles. According to Tesla CEO Elon Musk:

Existing franchise dealers have a fundamental conflict of interest between selling gasoline cars, which constitute the vast majority of their business, and selling the new technology of electric cars. It is impossible for them to explain the advantages of going electric without simultaneously undermining their traditional business. This would leave the electric car without a fair opportunity to make its case to an unfamiliar public.

Tesla points to the failure of Fisker and Coda as examples of electric vehicle start-up companies that failed because of their reliance on independent dealerships to sell a new technology. In addition, Tesla argues that because electric vehicles have lower maintenance costs than traditional cars, independent dealerships that make money from service will always have an incentive to steer consumers away from electric vehicles. Tesla services all of its vehicles for free.

Recent studies confirm that, with a few exceptions, most auto dealers in the Northeast are not making enough of an effort to sell electric vehicles. Between January and June of 2016, dealers in the Bridgeport to New York City metro area had 90 percent fewer EVs listed for sale than Oakland, when adjusted for relative car ownership. A recent report by the Sierra Club found that Tesla stores provide EV customers with far superior service, as Tesla was more likely to have EVs available to test drive, more likely to be knowledgeable about state and local incentives, and more likely to be able to correctly answer technical questions about charging EVs, than traditional car dealerships.

A Tesla store looks and feels more like an Apple store than a car dealership. They are placed in high volume, high traffic areas such as shopping malls. They have almost no inventory, as Tesla cars must be ordered individually from the manufacturer rather than sold on site. There is no haggling over price. And Tesla stores sell only Tesla products, including cars and batteries; with the recent merger with SolarCity, Tesla stores will soon sell solar panels as well.

Why do some states allow Tesla stores and others do not?

Over the past few years, courts and legislatures across the country have struggled with the question of whether and how to apply dealer franchise laws to Tesla stores. Some state courts, including Massachusetts and New York, have found that dealer franchise laws are only intended to apply to manufacturers that have licensed independent dealers, and do not provide a cause of action against Tesla stores. Other states, including New Hampshire and Maryland, have recently changed their laws through legislation to permit Tesla stores.

States that currently ban Tesla stores include Texas, West Virginia, Utah and Arizona, in addition to Connecticut. Some states, including Virginia and Indiana, allow a limited number of Tesla stores. New Jersey proposed a regulation that would have banned Tesla stores in 2015, but then relented last year, amending the regulation to allow 4 stores in New Jersey.

Often the difference between a jurisdiction that permits Tesla stores and a jurisdiction that bans Tesla stores comes down to minute differences in statutory language. For example, until 2014 Michigan’s dealer franchise law prohibited auto manufacturers from “[selling] any new motor vehicle directly to a retail customer other than through its franchised dealer.”

The word “its” in the statute arguably suggests that the law only applies to manufacturers that have franchised dealers, and thus does not prohibit Tesla stores. But then a legislator allied to the auto industry slipped a provision into an unrelated piece of legislation removing the word “its” from the statute, and just like that, Tesla stores were banned in Michigan.

Beyond narrow questions of statutory interpretation, judges and legislators wrestling with these questions need to consider the purpose of dealer franchise laws. Are these laws meant to regulate a relationship that arose within the context of the independent dealer system? Or are these laws intended to mandate that the independent dealer system must be the only way automobiles are sold in the United States forever? If it is the latter, then the dealer franchise laws represent not only a ban on Tesla, but a ban on all innovation in distribution methods.

Can such a ban be justified? In part 2 of this post, I’ll explore some of the policy consequences of dealer franchise laws and the Tesla ban, for consumers, and for Connecticut.

Will New Mexico Join the Next Generation of Clean Energy Leaders?

More and more states across the country are redefining what it means to be a clean energy leader by doubling down on their commitments to deploy solar, wind, and other renewable energy sources. Now the New Mexico legislature wants to add their state to the growing list. Recently introduced legislation would increase New Mexico’s successful renewable electricity standard (RES) from its current level of 20 percent by 2020 to 80 percent by 2040. Adopting the measure would capitalize on the state’s tremendous renewable energy resources and deliver substantial economic, health, and environmental benefits to all New Mexicans.

A renewable energy economy is achievable and affordable for New Mexico

The New Mexico Wind Energy Center, located in the southeast part of the state, generates clean, renewable power for energy consumers. Photo Source: Oak Ridge National Laboratory

Introduced as SB312, the legislation builds on New Mexico’s current RES (also referred to as a renewables portfolio standard or RPS) and would require investor-owned utilities like PNM, Southwestern Public Service, and El Paso Electric to increase their supply of electricity from renewable energy sources to 80 percent by 2040.

Rural co-ops would have to achieve a slightly lower target (70 percent renewables by 2040).

While this newly proposed commitment is substantial, transitioning New Mexico’s economy to one powered primarily by renewable energy is certainly achievable. That’s because New Mexico is home to some of the best and most diverse renewable energy potential in the country, including vast untapped wind, solar, and geothermal resources.

A 2016 National Renewable Energy Laboratory analysis found that New Mexico’s economic renewable energy resource potential, which accounts for renewables’ cost as compared with the typical regional cost of electricity, is more than 2.6 times total state electricity sales in 2015 (see figure). That means more than enough cost-competitive renewable energy resources are available today for utilities to comply with the proposed targets, which they have more than two decades to achieve.

Of course, New Mexico’s technical renewable energy resource potential far exceeds these economic potential estimates. As technology costs continue to decline, more and more of the untapped technical resource potential will also become cost-effective over time.
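To make that headroom concrete, here is a minimal back-of-the-envelope sketch using only the figures above (economic potential at 2.6 times 2015 sales, an 80 percent target); it assumes sales stay near 2015 levels and ignores load growth:

```python
economic_potential_ratio = 2.6   # NREL economic potential / 2015 state electricity sales
res_target = 0.80                # SB 312 target share of sales by 2040

# Fraction of today's cost-competitive potential the 2040 target would use.
fraction_needed = res_target / economic_potential_ratio
print(f"{fraction_needed:.0%} of today's economic potential")
```

In other words, even the full 80 percent target would tap less than a third of the resource that is cost-competitive today, before counting any future technology cost declines.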

New Mexico’s Renewable Energy Economic Potential vs. Electricity Sales.
New Mexico has more than enough cost-effective renewable energy potential to achieve an 80 percent RES. The National Renewable Energy Laboratory estimates the state’s economic potential at more than 260 percent of total electricity sales in 2015.
Sources: Economic Potential from Primary Case 3a in NREL’s Estimating Renewable Energy Economic Potential in the United States; Electricity Sales from the U.S. Energy Information Administration’s New Mexico State Electricity Profile 2015.


Wind and solar costs, in particular, are falling rapidly. The most recent comparison of costs by the energy consulting firm Lazard shows new wind and solar to be cheaper than new fossil fuel generation, even without subsidies.

This trend is reflected in recent power purchase contracts for wind and solar projects in the region. For example, Southwestern Public Service signed a contract for a 140 megawatt (MW) solar project near Roswell for about 4 cents per kilowatt-hour (c/kWh). Similarly, reported costs for recent wind projects in the Southwest have been as low as 2.3 to 3.8 c/kWh.

For context, Lazard estimates the cost of power from a typical new natural gas combined cycle plant ranges from 4.8 to 7.8 c/kWh.
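Putting the prices quoted above side by side makes the gap plain. The sketch below simply tabulates the contract and Lazard figures from the text (the solar PPA is treated as a single point at about 4 c/kWh):

```python
# Price ranges quoted above, in cents per kWh (gas range from Lazard, unsubsidized).
costs_c_per_kwh = {
    "new solar (Roswell PPA)": (4.0, 4.0),
    "recent Southwest wind":   (2.3, 3.8),
    "new gas combined cycle":  (4.8, 7.8),
}

gas_low = costs_c_per_kwh["new gas combined cycle"][0]
for name, (low, high) in costs_c_per_kwh.items():
    note = "  <- below even low-end gas" if high < gas_low else ""
    print(f"{name}: {low}-{high} c/kWh{note}")
```

By these numbers, even the high end of recent regional wind and solar contracts comes in below the low end of new gas generation.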

The proposed 80 percent by 2040 RES expansion ramps up gradually, with interim targets for public utilities of 35 percent in 2025, 50 percent in 2030, and 65 percent in 2035. This gradual ramp affords utilities plenty of time to plan for new renewable energy development as older fossil fuel generators retire. What’s more, the legislation builds in consumer protections should compliance costs prove higher than anticipated.
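The interim targets work out to a steady, straight-line ramp of three percentage points per year from the existing 20-percent-by-2020 standard. A short sketch, using only the target years and levels stated above, confirms the arithmetic:

```python
# SB 312 RES targets for investor-owned utilities, as described above.
targets = {2020: 20, 2025: 35, 2030: 50, 2035: 65, 2040: 80}

years = sorted(targets)
# Each five-year step raises the target by the same amount.
steps = [targets[b] - targets[a] for a, b in zip(years, years[1:])]
print(steps)             # percentage-point increase per five-year step
print(steps[0] / 5)      # implied ramp per year
```

A uniform 15-point step every five years, i.e. three percentage points per year, is the kind of predictable trajectory utilities can plan retirements and procurement around.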

New Mexico’s renewable energy transition is already delivering benefits

New Mexico’s current RES is already successfully driving new renewable energy deployment and delivering economic and environmental benefits throughout the state. Today, more than 1,500 MW of wind and solar power capacity is cranking out clean power for New Mexico’s energy consumers. The wind power developments alone represent more than $1.8 billion in investment and provide up to $5 million annually in land lease payments for local residents.

Another 1,500 MW of wind and nearly 1,400 MW of solar are either under construction or in various stages of development in the state. When completed, these projects combined with those already operational will exceed the state’s current renewable energy targets. Further diversifying New Mexico’s power supply with additional renewable energy can provide much needed investment and tax dollars to local economies and the state government’s struggling budget coffers.

Combined, the wind and solar industries are supporting 4,000 to 5,000 good paying jobs in the state, and that number continues to grow. Earlier this month, Albuquerque-based solar manufacturer SolAero Technologies announced plans for a $10 million expansion that will add more than 100 jobs. New Mexico’s excellent and affordable solar energy resource is also an important reason that Facebook decided to build a new data center in the state. An investment of $45 million will fund three new solar facilities that will fully power the new facility and create hundreds of new jobs.

Photo Source: U.S. Department of Interior

In addition to jobs and local economic benefits, New Mexico’s existing renewable energy development is helping to curb power sector carbon emissions—the principal contributor to global warming—and other air pollutants like sulfur dioxide and particulates that harm state residents. These toxic pollutants are responsible for numerous health problems including aggravated asthma attacks, breathing problems, heart attacks, and premature deaths, especially in vulnerable and disadvantaged communities closest to the sources.

In strong contrast to fossil fuel generation, wind and solar power generation also use virtually no water, an incredibly valuable benefit in a water-constrained state like New Mexico. The American Wind Energy Association estimates that in 2015, the state’s wind projects avoided the consumption of 264 million gallons of water.

All of these economic and environmental benefits are poised to grow substantially if SB 312 is adopted and New Mexico accelerates its shift away from a heavy dependence on coal for power generation.

Joining the 50 percent (plus) club

New Mexico is not alone in its pursuit of a cleaner, safer, and more affordable energy system. Several states—including California, Oregon, New York, Vermont, Massachusetts, and Hawaii—have already expanded their RES targets to at least 50 percent (100 percent, in Hawaii’s case), and are implementing effective solutions to reliably integrate significant amounts of renewable energy on their power systems. Nevada is considering similar RES expansion legislation this year as well.

With a new federal administration seemingly determined to stay stuck in the fossil fuel age, this kind of state leadership is needed now more than ever. New Mexico should adopt SB 312 and set a course to fully embrace its renewable energy future. Doing so will deliver significant rewards for the state’s residents and set an example for other states to follow.

The Man Who Sued the EPA is Now Running It. What Does That Mean for the Environment?

Voting largely along party lines, Congress just confirmed Scott Pruitt as Administrator of the Environmental Protection Agency (EPA)—an attorney who has spent his professional career suing the EPA to stop the agency from performing its fundamental mission of ensuring clean air and water for all Americans. This confirmation marks a sharp break with precedent; most EPA Administrators from both parties have come to the office with a demonstrated commitment to the EPA’s mission.

One might even say that this vote signals the end of an era of bipartisan congressional support for a strong federal role in protecting our environment, as this newly confirmed Administrator is likely to dismantle the safeguards that both parties have supported since the 1970s.

What that means for all of us who care about clean air and water and the protection of our environment is this: It is up to us to monitor carefully what happens next, and to be prepared to spring into action as needed.

Here are some of the key developments I’m watching for:

Will Scott Pruitt recuse himself?

As repeatedly noted in his nomination hearing, Pruitt has represented the State of Oklahoma in numerous lawsuits against the EPA. Many of these cases are still active today, directed at major EPA regulations, including the Clean Power Plan (which limits carbon emissions from power plants); national air quality standards; mercury emissions from coal plants; methane limits for oil and natural gas extraction; and a Clean Water Act rule that clarifies federal jurisdiction over bodies of water.

During the nomination hearing, Pruitt did not commit to recusing himself from these cases, but he did say he would rely on advice from the EPA ethics counsel. Common sense tells us that he cannot possibly be impartial on these issues, and conflicts of interest abound. For example, the state attorneys general who joined him in the suit against the Clean Power Plan have written a letter to the Trump Administration, asking the President to issue an executive order declaring that the rule is unlawful. Responding to this request would, in the normal course of business, require EPA input, since it is an EPA regulation. How can Scott Pruitt possibly participate in any review of that request given that, just a few weeks ago, he himself was one of the attorneys general making this claim?

He must recuse himself, as thirty senators have made clear in a recent letter.

Will Scott Pruitt cut federal law enforcement?

As a candidate, Mr. Trump pledged to dismantle the EPA. He lacks a filibuster-proof majority to change the laws that created the EPA, such as the Clean Air Act and the Clean Water Act. But he could cripple the EPA with budget cuts, which are much harder for a minority to stop.

By wide margins, most Americans favor enforcement of laws that protect our air and water. Cutting EPA enforcement will therefore be unpopular—but Scott Pruitt is likely to argue that we can rely on states to enforce environmental laws, so cutting the EPA’s budget won’t do any real harm.

This is a dangerous myth.

Having served as a state environmental commissioner, I know from personal experience that state environmental agencies are already strapped. They typically lack the technical experts employed at the EPA, and stand in no position to take on additional enforcement responsibilities shed by the EPA.

In Massachusetts where I served, for example, my former agency’s staff was cut nearly in half between 2002 and 2012 due to budget cuts, even as the agency’s responsibilities grew. That occurred in a state well known for its strong commitment to environmental protection. As a result, my agency was forced to cut back on important and effective programs, such as water sampling to locate sources of bacteria that pollute rivers. If the EPA’s budget is cut, it will mean even fewer resources for states, because states now receive a significant share of the EPA’s budget to cover enforcement activities.

Second, state environmental agencies sometimes experience political pressure against enforcement that might harm a large employer or impose significant costs on residents. We saw some of this in play in Flint, Michigan, where a state agency did not enforce a law requiring corrosion treatment of pipes to reduce lead contamination; it took an EPA staffer and outside scientists, as well as the residents themselves, to blow the whistle on lax state enforcement.

Third, states are not equipped to deal with the widespread problem of interstate pollution. To cite one of the most egregious examples, the state of Maryland could shut down virtually all in-state sources of air pollution and yet still not be in compliance with health-based air quality standards due to pollution from neighboring “upwind” states. A strong federal law enforcement presence is needed to address the simple fact that air and water pollutants do not honor state boundary lines.

We and others stand prepared to fight crippling budget cuts at the EPA, and explain that the protection of our air and water requires both federal and state environmental law enforcement.

Scott Pruitt will likely gut the Clean Power Plan; what will he replace it with?

Photo: Gage Skidmore/CC BY-SA (Flickr)

During the campaign, President Trump called for abolishing the Clean Power Plan, the EPA regulations that limit carbon emissions from power plants. And as noted, Administrator Pruitt sued to block it. It now seems nearly inevitable that he will move to drastically undermine the plan.

The question is, what will he propose to replace it? The EPA does not have the option of doing nothing. The United States Supreme Court ruled in 2007 that the EPA has a duty to regulate greenhouse gases under the Clean Air Act if it makes a determination that such gases endanger public health and the environment. In 2009, EPA made such a finding (which Mr. Pruitt fought, though unsuccessfully).

Thus, EPA remains obligated to regulate carbon dioxide emissions in general, and in particular with respect to power plants, which are among the nation’s largest sources of these emissions.

One predictable approach would be a revised regulation that reduces emissions, but by a much smaller percentage. The current litigation over the Clean Power Plan could serve as a roadmap for a diminished rule. The Clean Power Plan relies on three strategies to reduce emissions—improving efficiency of coal plants, switching from coal to gas, and switching to renewables. During the litigation, Scott Pruitt conceded that the EPA had the authority to require improvements to coal plant efficiency, but claimed that the other two strategies, which go “beyond the fenceline” of an individual source, were unlawful.

Thus, one might expect that a revised rule will mirror what Mr. Pruitt called for in court. If so, rather than cutting carbon emissions by approximately 32 percent by 2030, the rule would result in barely noticeable emission reductions.

If this happens, litigation will be necessary. The court that mandated the EPA to address greenhouse gas emissions should not be satisfied with a rule that does little to cut one of the nation’s largest sources of CO2 emissions.

How about vehicles?

The second biggest carbon-cutting program of the Obama Administration is the UCS-backed fuel economy standards for cars, which are estimated to roughly double fuel economy between 2012 and 2025. Those standards were agreed to by the automakers at the time. They are projected to cut billions of tons of CO2, reduce oil use by billions of barrels, and save consumers an average of $8,000 over the lifetime of a vehicle.

When the standards were put in place, they included a “mid-term review” provision under which the EPA would assess whether changes in technology, costs, or other factors might warrant a change to the standards. The review was to be completed by April 2018, but the Obama administration finished it in its closing days and determined, based on a thorough assessment, that there was no reason to change the standards: automakers are ahead of schedule in meeting them, and at a lower cost than originally predicted.

Some automakers are calling for this determination to be re-opened, presumably so that the rules can be modified and perhaps weakened. And one can justifiably be anxious that they could offer something that the Trump administration is keen to secure—a commitment to increased manufacturing in the United States—in exchange for relaxing these standards.

It would be a disaster for these historic standards to be rolled back, and we’ll fight any such rollback along with many allies.

How about science?

As I wrote recently, Mr. Pruitt’s record shows little evidence of deference to scientists. After all, he sued the EPA for relying upon the world’s most prominent climate scientists, including many employed by the federal government, in finding that greenhouse gases endangered the environment. And he has claimed that climate change, and the role of human activity in causing it, is still an open question for debate.

As EPA Administrator, he will hear from EPA scientists whose expert judgment will not align with his deregulatory agenda in some cases. Will these scientists’ findings be suppressed or disregarded?

We call on Mr. Pruitt to declare that scientific integrity is a core guiding principle for the EPA, that he will abide by the existing EPA scientific integrity policy, and even look for ways to improve it, as recommended by UCS.

Vigilance required

Scott Pruitt comes to his new position with the heavy baggage of having devoted a good part of his career to opposing EPA, not to mention the apparent antipathy of his boss towards the agency. The Trump transition team, composed of career ideologues, further fueled anxiety over the EPA’s fate, with threats of gag orders on agency scientists, deletion of climate data from the website, and draconian budget cuts. This is why we see, for example, hundreds of career civil servants risking their jobs by publicly protesting Mr. Pruitt’s confirmation.

Scott Pruitt has a chance now to push the reset button, and position himself as an open-minded and principled conservative, rather than a deregulatory ideologue. Most helpful to him will be to invest significant time in hearing from the agency’s talented scientists, engineers, policy analysts and attorneys.

No matter what, we will be watching his actions vigilantly and stand prepared to fight to retain key protections of Americans’ health and safety at the agency he now oversees.

Photo: justice.gov

Learning from Oroville Dam Disaster: State Water Board Proposes Climate Change Resolution

Earlier this week, while areas downstream of Oroville Dam were still under an evacuation order, California’s State Water Resources Control Board (State Water Board) released a draft resolution for a comprehensive response to climate change. It resolves that the agency will embed climate science into all of its existing work, both to mitigate greenhouse gas emissions, and to build resilience to the impacts of climate change. In doing so, the State Water Board demonstrates how public agencies can respond more proactively to the very real challenges that global warming is bringing our way.

A failure to plan is a plan to fail

After five years of record drought conditions, California has received more rain in just a couple of months than its reservoirs can store. This may seem strange, but it is exactly what climate scientists have predicted for the state since the 1980s: prolonged warm and dry conditions punctuated by intense wet spells, with more rain and less snow, causing both drought and floods.

Despite having a wealth of science at our fingertips describing how our water system is changing due to global warming, too often we have not put this information to use. During the federal relicensing of the Oroville Dam, the California Department of Water Resources (DWR) chose not to assess how climate change might affect the dam’s operation. In response to this “foundational error,” Butte County and Plumas County sued DWR. Their suit argues that the environmental analysis associated with the dam relicensing should be rejected as unscientific:

“Rather than rigorously assessing climate change, DWR’s Oroville FEIR [Final Environmental Impact Report] presumes that hydrologic variability from the previous century ‘is expected to continue in the foreseeable future’ and that it would be ‘speculative’ to further analyze other climate change scenarios…Due to this error, the FEIR is predicated upon a hypothetical future that DWR knows to be dangerously false.”

While we know that the past is no longer a predictor of the future, we continue to plan for the past. It’s easier, and it seems less expensive, but it has huge hidden costs: costs now being borne by the nearly 200,000 residents who were evacuated, by affected counties, and, eventually, by taxpayers who will pay to repair the damage.

This is why it is incredibly important to plan for the future, and particularly for more “extreme” climate conditions. We are on the precipice of giving away almost $3 billion of public money for new water infrastructure without requiring that these new water projects use climate science and existing modeling results to assess how they would fare under more “extreme” climate conditions. We have repeatedly encouraged the California Water Commission to require that new water projects provide a quantitative assessment of the impact of climate “extremes” on project operations. However, in December 2016, the California Water Commission approved regulations without this requirement.

State Water Board commits to using climate science

Mistakes are an inevitable part of life, but we need to learn from our mistakes. The State Water Board has taken an important step forward by drafting this resolution, which requires that the State and Regional Water Boards rely on sound modeling and analyses that incorporate relevant climate change data and model outputs to account for and address impacts of climate change in permits, plans, policies, and decisions.

There are many lessons from the Oroville Dam crisis, including the critical importance of using science to prepare for a future that will be different from the past due to global warming. We applaud the State Water Board for their leadership and hope other agencies will soon follow and commit to making better decisions using climate science.

Solar vs Nuclear: Is this the Last Chapter?

Last year’s solar deployment numbers just came in, and they are, in a word, phenomenal. Utilities bought more new solar capacity than they did natural gas capacity: an astounding 22 states added more than 100 MW of solar each.

At the same time, there is grim news about construction delays and associated cost over-runs for nuclear plant projects in Georgia and South Carolina. SCANA—owner of South Carolina Electric & Gas and sponsor of the VC Summer Nuclear Project—has just reported new delays pushing the in-service dates of its new reactors to 2020. Construction started more than 7 years ago, with energy deliveries promised to begin in 2016.

Neighbors with solar. Courtesy of Grid Alternatives.

Past hopes for a “renaissance” in nuclear power in the United States, with four to eight new nuclear plants projected to come on line between 2016 and 2018, have been overwhelmed by competition. UCS predicted this cost trend many times.

Great solar news

Meanwhile, there is much to say about the solar boom. Just ask one of your 1,300,000 neighbors who have solar on their property.

To put these achievements in perspective, let’s talk about solar jobs and productivity. The solar industry employs more than 260,000 people in the United States. Continuous improvements in construction know-how and in manufacturing drive solar deployment costs down quarter after quarter. Pricing for new solar projects is coming in at 4 cents (Texas) to 5 cents (California) per kilowatt-hour.

In comparison with nuclear, the amount of solar power built in 2016, taking into account how many hours each can operate each day, is the equivalent of more than 3 new nuclear plants.

To dive in a little deeper: let’s use a 25 percent capacity factor for new solar, making the 14,626 MW installed equivalent to roughly 3,650 MW of theoretically perfectly running nuclear plants. The Westinghouse AP1000 units under construction for the last 7-10+ years produce about 1,100 MW each. So, in one year, solar additions equaled what takes more than 7 years to build. The difference in speed of deployment is why UCS is clear that nuclear power isn’t a near-term climate solution.
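That equivalence is easy to check directly. The sketch below uses only the assumptions stated above: a 25 percent capacity factor for new solar, and a roughly 1,100 MW AP1000 running perfectly (100 percent capacity factor):

```python
solar_installed_mw = 14_626    # US solar capacity added in 2016 (from the text)
solar_capacity_factor = 0.25   # assumed average for new solar
ap1000_mw = 1_100              # approximate output of one AP1000 unit

# Energy-equivalent capacity of the 2016 solar additions, expressed as
# perfectly running nuclear capacity.
equivalent_nuclear_mw = solar_installed_mw * solar_capacity_factor
print(equivalent_nuclear_mw)                  # about 3,650 MW, as cited above

# Number of AP1000-sized plants that represents.
print(equivalent_nuclear_mw / ap1000_mw)      # a bit over 3 plants
```

On those assumptions, one year of solar additions delivers the energy output of more than three AP1000-class reactors.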

The demise of the nuclear option

In the energy business, nuclear is fading fast. Struggles to keep existing plants open in competitive markets are roiling the electricity markets. But the recent news about the very few manufacturing firms supplying nuclear construction illustrates how very different the nuclear industry is from solar.

Cost over-runs at the US plants are so large that when state regulators finally put a cap on what South Carolina and Georgia consumers would pay, manufacturer Toshiba (owner of Westinghouse) found itself with $6 billion in losses and the likely end of its nuclear power plant construction business.

The concentration of nuclear component manufacturing in so few companies has already shown its risks: a single quality problem became a “single point of failure” plaguing the entire fleet of French nuclear plants. US policy, meanwhile, has been to shield utility companies from the risks of their decisions to construct nuclear plants, a pattern continuing with the Vogtle plant in Georgia.

Would we ever go 100% solar?

Would we ever build only solar? Maybe, but that’s not the right question. “What can we do with lots of solar?” is a better one.

We can keep absorbing the solar pattern of production with the tools we have. We can plan to adjust to cheap energy in the middle of the day with time-varying rates. And if we can get energy storage further along, we can get to the end of this debate.

Public Source

One Way You Can Help Fight Against Political Interference in Science: Tell Us About It

Since Election Day and into the first weeks of the Trump presidency, we’ve heard a lot about “alternative facts” and clampdowns on the ability of scientists to present scientific evidence or speak to the press. Congress last week signaled its intent to neutralize the Environmental Protection Agency and other federal departments by cutting science out of the way they make policy.


Federal employees can help create an accountable government by reporting political interference in science (even anonymously). More info: ucsusa.org/secureshare.

But together, we can raise the political price of manipulating science or censoring scientists by exposing these actions and publicly communicating their consequences for public health and the environment. Sometimes, this requires people within government or who are funded by government to speak up and share challenges that they experience or perceive.

Learn how to securely and/or anonymously communicate with UCS here. The shortlink is www.ucsusa.org/secureshare.

UCS has many years of experience working with government employees, journalists, and members of Congress to get stories out in a way that protects those with information to share. We want to hear about actions that compromise the ability of science to fully inform the policymaking process—and the consequences of those actions. We also want to hear your stories that describe how government data and government experts protect public health and safety.

Just as there are many steps in the policymaking process, so too are there many ways to attack and politicize science. People often think of the muzzling of scientists, or the censorship of documents. This happens, of course. But there are other, more subtle ways of inappropriately influencing how science is used to make decisions. A partial list is at the end of this post.

Political interference in science can be difficult to assess. It’s often not clear whether a person’s actions are normal or crossing the line—especially within an administration where some don’t want to leave a paper trail. To that end, feel free to share what you’ve heard or what you’ve been told verbally. Our staff are ready and willing to help you figure out the best course of action.

CensorMatic cartoon

You should also consider approaching the official responsible for implementing your agency’s scientific integrity policy for advice. Outside of government, in addition to UCS, Public Employees for Environmental Responsibility, the Government Accountability Project, and the Climate Science Legal Defense Fund are all good resources for learning more about your rights and responsibilities.

Now, here is that partial list of subtle and overt tactics that vested interests have used to undermine or politicize science, in no particular order:

  1. Prevent scientists from publishing research, or delay publication of research (see: former EPA clearance process)
  2. Prevent scientists from presenting at or attending scientific meetings relevant to their work (see: airborne bacteria)
  3. Diminish or destroy agency scientific libraries and library content or similar resources (see: EPA, Department of Fisheries Canada)
  4. Allow agencies with conflicts of interest to second-guess or undermine the work of agency scientists through the inter-agency review process (see: the chemical perchlorate)
  5. Require scientists to manipulate scientific methods (see: lead in children’s lunch boxes)
  6. Restrict the types of information and methods that experts can use (see: attempts to prevent climate scientists from using scientific models)
  7. Manipulate or censor scientific information in testimony before Congress (see: CDC testimony on climate change and public health)
  8. Place misinformation on official government websites (see: breast cancer)
  9. Redefine terms to prevent the successful application of science to policymaking (see: OMB peer review guidelines, critical habitat under the Endangered Species Act)
  10. Promote scientifically inaccurate educational curricula (see: abstinence-only sex education)
  11. Refuse to comply with court-mandated analysis (see: endangerment finding)
  12. Waste scientists’ time with baseless subpoenas or open records requests
  13. Manipulate agency scientific documents before release to create false uncertainty or otherwise change the scientific meaning (see: endangered species)
  14. Limit or prevent scientists from communicating with the media, the public, or Congress, including on social media, or require “minders” to sit in on interviews to ensure that scientists say the “right” thing (see: numerous reports from journalists)
  15. Selectively route interviews away from scientists with inconvenient scientific analysis (see: climate change and hurricanes)
  16. Remove or reduce access to government data sets, tools, models, and other scientific information, or stop collecting data altogether (see: Canada’s Harper Government)
  17. Appoint technically unqualified people or people with clear conflicts of interest to federal science advisory committees (see: childhood lead poisoning)
  18. Use political litmus tests for federal advisory committee membership (see: workplace safety panel)
  19. Threaten, demote, or defund scientists who refuse to change information (see: Vioxx)
  20. Create a hostile work environment that causes scientists to self-censor (see: FDA surveillance)
  21. Disregard the law by not basing decisions solely on the best available science when statutorily required to do so (see: air pollution limits)

Threats to science-based policymaking and public access to scientific information—essential components of democracy—have never been more real. But scientists are also ever more committed to defending the integrity of science in the policymaking process. We depend on sources with knowledge of what’s happening within government to help us prevent a weakening of the federal scientific enterprise and the public protections that science informs.

Once again, that link for reporting what you see: www.ucsusa.org/secureshare.

UCS Founder Kurt Gottfried Wins AAAS Award

Kurt Gottfried, a founder of UCS in 1969 and a guiding spirit and intellect since then, has won the prestigious 2017 Scientific Freedom and Responsibility Award given by the American Association for the Advancement of Science (AAAS). AAAS is the world’s largest general scientific society and publisher of the journal Science.

I can’t think of anyone more deserving of this award, which recognizes Kurt’s lifetime of dedication and achievements. AAAS said the award honors Kurt’s “long and distinguished career as a ‘civic scientist,’ through his advocacy for arms control, human rights, and integrity in the use of science in public policy making.”

Source: UCS

Kurt receiving this award also means a lot to me personally, since he has been one of the biggest influences on my professional life. I first met him in 1978 when I took his quantum mechanics course as a physics grad student at Cornell. He was a wonderful teacher and communicator, and generations of students have learned the subject from his classic textbook (now in its second edition).

But I actually got to know him a couple of years later—early in the Reagan presidency—when we were part of a group at Cornell that brought high-level speakers to campus to talk about the nuclear arms race, which was then heating up. I have been privileged to continue working with him ever since. Kurt’s way of thinking about the world and approaching the problems he worked on has helped shape my own.

Kurt’s history

I would guess that even the people who know him may not be aware of the range of activities Kurt has taken on over the years.

Kurt was born in Vienna, Austria, in 1929. He has had a long and distinguished career as a theoretical physicist. He received his PhD from MIT, became a Junior Fellow at Harvard, and has been a physics professor (now emeritus) at Cornell since 1964.

At the same time, he has dedicated boundless energy to improving the world, in areas including international security and nuclear arms control, human rights, and preventing political intervention in scientific input in policymaking. For example:

Science, International Security, and Arms Control

On leave at MIT in 1968-9, Kurt helped draft a statement encouraging scientists to consider society’s use of technical knowledge, and calling on scientists and engineers across the country to join a national effort to discuss these issues in university classes on March 4, 1969.

Following the success of that effort, Kurt co-founded UCS that same year. His goal was to help scientists bring their expertise to bear on public policy issues that had an important technical component. From the beginning, the vision was to build a research and advocacy organization that combined technical experts with experts in policy analysis, media engagement, and outreach and education for the public and policy makers, while keeping issues of science and technology at the core of its work.

Today, UCS has grown to more than 180 staff members and has an annual budget of more than $27 million. More than 45 years after UCS’ founding, Kurt remains a valuable member of the Board of Directors.

Over the years, UCS not only helped inform debates and shape policy on a wide range of issues, it also helped legitimize the active role of scientists in these debates and created staff positions allowing scientists to work on these issues full time. And it helped engage a broad set of scientists in part-time policy work, educating them about the issues and training them in writing and speaking for policy makers.

Working with UCS, Kurt was among the first people to raise concerns about the development of missile defenses, co-authoring a report on the topic in 1969. Kurt and UCS were particularly active in the debate in the 1980s and 1990s following President Reagan’s “Star Wars” speech. Kurt weighed in with articles and op-eds in Scientific American, the New York Times, the Washington Post, and elsewhere, and co-authored the influential books The Fallacy of Star Wars (1984) and Countermeasures: A Technical Evaluation of the Planned U.S. National Missile Defense System (2000).

Kurt at a 2000 press conference in Washington. Source: UCS

Kurt also worked to prevent the development of anti-satellite weapons and weapons based in space. He wrote and spoke widely about this issue and worked with Dick Garwin to develop a draft treaty banning anti-satellite weapons, which he presented to the Senate and House Foreign Relations Committees in 1983 and 1984.

In addition, he authored or co-authored articles on nuclear weapons, command and control systems and crisis stability, and cooperative security in Nature, the New York Review of Books, and elsewhere. He edited two books on these issues—Crisis Stability and Nuclear War (1988), and Reforging European Security: From Confrontation to Cooperation (1990)—and contributed chapters to several others.

Scientists and Human Rights

Kurt was also very active in human rights issues for many years—activities he undertook outside his work with UCS. During the 1980s he traveled to the Soviet Union to meet with and support refuseniks, and he urged others in the scientific community to actively support these dissidents.

Kurt was a major figure in the American Physical Society (APS) Committee on International Freedom of Scientists (CIFS), which helped oppressed scientists in the Soviet Union and other countries. CIFS described its goal as:

The Committee was formed to deal with those matters of an international nature that endanger the abilities of scientists to function as scientists. The Committee is to be particularly concerned with acts of governments or organizations, which through violation of generally recognized human rights, restrict or destroy the ability of scientists to function as such.

Kurt served as CIFS’ first chair in 1980 and 1981. One of CIFS’ innovations was its use of “small committees,” typically consisting of three or four people, who would pick a persecuted scientist and regularly write to the scientist and his/her family, friends, and local officials.

Even when these letters were intercepted by the authorities, they raised the profile of the scientist and made clear that international attention was focused on this person. By 1983, these committees were writing to 63 scientists, and the number continued to increase through the mid-1980s.

Kurt also helped found the organization Scientists for Sakharov, Orlov, and Sharansky (SOS) to focus attention on three of the most prominent Soviet refuseniks. He served on the SOS Executive Committee from 1978-90. SOS’s call for a moratorium on scientific cooperation with the Soviet Union to highlight concern about the treatment of scientists was joined by nearly 8,000 scientists and engineers from 44 countries, and gained international attention.

Soviet physicist Yuri Orlov was jailed for a decade after founding Moscow Helsinki Watch to monitor the Soviet Union’s compliance with the human rights provisions of the Helsinki Accords, which it signed in 1975. Kurt’s involvement in his case led to Orlov coming to Cornell after his release in 1986 and joining the physics faculty.

Kurt was also instrumental in winning the release in 1978 of the physicist Elena Sevilla, who was imprisoned in Argentina because of political activities by her husband, a newspaper reporter. On her release, Kurt arranged for her to come to Cornell to finish her graduate studies in physics.

Kurt’s work not only helped the refuseniks and other oppressed scientists. His actions over the years have helped inspire others in the scientific community to recognize and act on their ability and responsibility to help scientists who were denied basic human rights.

For his work on these issues, Kurt was awarded the APS Leo Szilard Award in 1992.

Scientific Integrity/Science and Democracy

In the wake of growing evidence that some officials in the George W. Bush Administration were distorting scientific knowledge and the scientific advisory process to an unprecedented degree, Kurt recruited 62 preeminent scientists to sign a statement titled Restoring Scientific Integrity in Policy Making, which was released in February 2004.

The statement charged the Bush Administration with widespread “manipulation of the process through which science enters into its decisions” and called out the administration’s misrepresentation of scientific evidence, appointment of unqualified members of scientific advisory committees, and silencing of federal government scientists—actions that threatened the integrity of science in policy making.

The statement drew wide public attention to these issues and was eventually signed online by more than 12,000 scientists.

Subsequently, Kurt led the effort to create a new program at UCS to work on this issue, which researched examples of abuse, engaged the scientific community on this issue, and worked with administration agencies to reform their practices, including writing draft rules on scientific integrity for these agencies. Kurt was also the force behind evolving that program into the UCS Center for Science and Democracy in 2012, arguing there was a need to address a broader set of issues related to the role of science and evidence-based analysis in democratic society.

* * *

Kurt, Hans Bethe, Dick Garwin, and Henry Kendall at a press conference on missile defense, March 22, 1984 (Source: James J. MacKenzie)

For half a century, Kurt has engaged the scientific community, policy makers, and the general public on important issues related to international security, human rights, and the role of science in democratic society. Moreover, he has encouraged his colleagues to become involved, mentored younger scientists in these issues, and created an organization that has magnified his efforts and will continue this work well beyond his lifetime.

Kurt has been an inspiration to me and other scientists who decided to make a career of applying our technical backgrounds to important policy issues, and helped break the ground to make a career of this kind more possible.

Love Local Food? Here’s a Promising Way to Protect the Local Land that Grows It

Does your heart beet for farmer’s markets? Do you carrot all about protecting the soil? This Valentine’s Day, lettuce dive deeper into a promising solution for simultaneously protecting land for local food production, ensuring more sustainable agriculture, and creating opportunities for beginning farmers: land trusts.

If you heart local food, remember that the farmland that grows it needs protecting, and land trusts are one part of the solution.

Agriculture puns aside, land trusts are nonprofit organizations designed to protect land in perpetuity. Essentially, landowners donate or sell the long-term rights on their property to a land trust—an outside organization that ensures that in the future land is only used for specific purposes, such as for wildlife habitat or agriculture.

There are several reasons why agricultural land trusts can be beneficial. The American Farmland Trust estimates that 40 acres of farmland (roughly the size of 36 football fields) are lost every hour to urban sprawl and development in the United States (that’s over 350,000 acres per year). And there is no shortage of concerns about existing agricultural lands either, including water pollution, soil degradation, and a recent dramatic drop-off in farm incomes. Agricultural land loss and degradation necessitate conservation options such as land trusts.
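The annual figure above is a straight unit conversion from the American Farmland Trust’s hourly estimate; a quick sketch confirms it:

```python
# Convert the American Farmland Trust's estimated rate of farmland
# loss to development into an annual total.
ACRES_LOST_PER_HOUR = 40
HOURS_PER_YEAR = 24 * 365

acres_lost_per_year = ACRES_LOST_PER_HOUR * HOURS_PER_YEAR
print(f"{acres_lost_per_year:,} acres per year")  # 350,400 acres per year
```

At 350,400 acres a year, the “over 350,000 acres” figure checks out.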

Protecting land for beginner farmers and sustainable agriculture

Land trusts, such as the Sustainable Iowa Land Trust (SILT), are non-profit organizations that work with landowners to facilitate arrangements, such as long-term leases or land donations, that legally protect or ensure particular uses of land in the future. Land trusts fill an important need in facilitating the major transfer of land anticipated in agriculture, given that the average farmer is 58 years old and competition for land from urbanization and energy development is growing. Suzan Erem, SILT’s Board President, pointedly reminded me that “the history of the U.S. is that we haven’t seen cities shrink.” Photo: SILT.

One example of an organization with a dedicated focus on sustainable agriculture is the Sustainable Iowa Land Trust (SILT). SILT launched in 2015 with a mission to permanently protect land to grow healthy food, and this is the major distinction between SILT and other non-profit land trusts: the requirement for sustainable food production on their farms. While most land trust agreements include prohibitive language to prevent development-related activities, SILT also adds affirmative language requiring sustainable farming (defined by several different sustainability certifications).

SILT also hopes that more and more landowners will donate or participate in long-term leases through their model to institutionalize affordable land access. This will help make land—particularly land for sustainable food production—available so that it is not just about “where you’re born or sheer dumb luck,” according to Suzan Erem, SILT’s Board President. SILT is proud of its relationships with both national organizations such as the National Young Farmer’s Coalition and statewide programs including Lutheran Services, which assists refugee populations in finding land to launch farm businesses.

That’s another crucial benefit of SILT’s approach: landowners who hope to preserve the integrity of their land are paired with beginner farmers looking for an affordable way to get started. Erem explains that the popularity of programs like SILT is related to the excitement of seeing it “giving people a place and a purpose,” and because they provide opportunity to “redefine what you can do with your legacy.”

Local food demand and supporting midsize farms are further reasons to protect agricultural land near cities

Another important piece of this puzzle is strong consumer demand for local food. Late last year, USDA released the results of its first-ever survey of direct marketing (food products sold by farmers directly to consumers, retailers, institutions, or other local food intermediaries), reporting that total sales generated this way across the country were an estimated $8.7 billion. The survey estimated that 67% of these sales came from farms located in metropolitan counties, that 38% of the producers making these sales were women (a greater proportion than in the general farming population), and that 14% were veterans. As I’ve noted previously, women and veterans are groups that have plenty of room to expand in the agricultural sector.

One component of the most profitable farms—regardless of size—is direct marketing, as Dr. Dawn Thilmany McFadden, a member of our Science Network, explained in a blog post last year. This form of sales is particularly important to protect “agriculture of the middle” or midsize farms and ranches, which have been declining for many decades (a trend likely to worsen under the present tightening agricultural economy). Growing Economies, our 2016 report, similarly noted that more direct sales from institutional food purchasers could be a multi-billion dollar boon for the state of Iowa.

Despite the benefits of protecting local farms and food, it’s important to recognize that local food is certainly not a panacea for all environmental concerns. Tradeoffs with impacts such as greenhouse gas emissions require careful consideration, as another Science Network colleague, Dr. David Cleveland, recently noted on our blog. Still, given the stimulus for local economies, and the need to protect farmland in general, how we protect land for local food deserves to be an important part of the conversation.

And remember for Valentine’s Day, let’s turnip attention to the idea that land trusts and local food make a great pear!

How to Ensure Self-Driving Vehicles Don’t Ruin Everything

Zipcar’s former CEO has cast the self-driving future as a “heaven or hell” scenario, and she has a point. Self-driving cars could save lives, smooth traffic congestion, expand access to jobs or schools—especially for people who can’t drive themselves today—and reduce the number of vehicles on our roads. On the other hand, they could worsen smog and local air pollution, disrupt the US economy by putting millions of people out of work, justify cuts in public transit funding and services, and force urban planners to focus more on providing space for vehicles instead of for parks, bicyclists, or pedestrians.

To maximize the potential benefits of self-driving vehicles and minimize their potential consequences, UCS developed this set of principles that we will be pushing policymakers, businesses, and other stakeholders to follow. Doing so will ensure that self-driving vehicles reduce oil consumption and global warming emissions, protect public health, and enhance mobility for everyone.

Science-based policy will be key for shaping the introduction of self-driving technology

Many are rallying against any regulation of self-driving technology beyond ensuring it’s safe to use. I’ve even heard the claim that overregulating this technology will literally kill people by slowing the speed at which self-driving cars are introduced, thus delaying their potential safety benefits.

To be fair, this argument has merit. Self-driving vehicles are forecast to reduce the tens of thousands of roadway fatalities that occur each year in the US by as much as 90 percent, and could offset the rise of distracted driving that may have caused the biggest spike in traffic deaths in 50 years (though reaching these safety levels will take further advances in the technology and widespread deployment).

But self-driving technology won’t just impact transportation safety. Researchers are forecasting how it will affect traffic congestion, vehicle-related emissions, land-use decisions, public transit systems, data security, and the economy. Unfortunately, the emphasis that many, including the US Department of Transportation, have placed on the safety benefits can distract from the need to consider how policy should address the other, equally significant potential impacts of self-driving technology.

I’m not saying self-driving technology should be regulated to the scrapheap. The technology is highly likely to improve traffic safety and increase access to transportation—both important outcomes. Yet self-driving vehicles will need to be regulated on issues other than safety, as their full breadth of potential impacts won’t be addressed by safety-focused policy or market forces alone.

For example, studies have found that self-driving vehicles could double transportation emissions (already the largest source of climate change emissions in the US), put millions of Americans out of work as automated driving replaces truckers and taxi drivers, and exacerbate urban sprawl.

The race to produce the best self-driving vehicle can still be won even if these negative effects come to pass, and today’s policy frameworks may be insufficient to curtail them. Let’s not forget that automakers have historically opposed regulation (see: seat belts, fuel economy, air bags) and are encouraging policymakers to clear the way for self-driving vehicles not only because they seek to improve transportation safety, but because they see a potential to profit.

So science-based policy covering the broader implications of self-driving cars, including how they affect emissions and our economy, will be needed to ensure the best possible self-driving future, and these discussions need to happen today. To kickstart these conversations, UCS released a set of principles for creating a safe, healthy, and equitable autonomous future. Join the conversation on whether and how self-driving technology should be regulated by checking out our new self-driving vehicle web content and signing up for future action alerts here.

Can Republicans Find Their Voice on Climate Change via a Carbon Tax?

Earlier this week a group of conservative opinion leaders and experts launched the Climate Leadership Council, championing a national carbon tax to cut emissions and help achieve climate goals.

As with any carbon pricing proposal, the politics are complicated and there is no telling how much traction this particular initiative will get. There are also definite concerns about some of the details of the proposal. But it’s very encouraging to see a meaningful solution to climate change put forth by conservatives. I look forward to seeing where this will go, especially with Republican lawmakers and the Trump administration.

Starting from the facts

This proposal begins with recognizing the scientific facts about climate change and the urgency of acting on solutions. To see leading conservatives articulate those basic realities is important, and I hope Republicans in Congress and the Trump administration are listening.

Climate change should not be a partisan issue. There’s no time to waste on the dangerous new types of denial or delay tactics that were in evidence during the nomination hearings for Rex Tillerson and Scott Pruitt, for example.

Just like the near-universal consensus among climate scientists about the facts of climate change, there is an overwhelming consensus among economists that a carbon price is an essential policy tool for driving down carbon emissions. The CLC proposal’s starting price of $40 per ton of CO2, escalating over time, shows its seriousness.

What’s more, the authors of the proposal recognize that we have to act on climate as a global community and that the US must live up to its international commitments under the Paris Climate Agreement. Yes, to meet long-term climate goals countries will have to do a lot more than they have currently committed to, but walking away from the Paris Agreement would be a serious mistake.

Notes of caution

There is obviously room for discussion about ways to improve the policy proposal, as and when it gets serious consideration from policymakers. Some aspects of the proposal that could definitely use further scrutiny include:

  • Regulatory rollbacks that harm public health or undermine key legal protections are cause for concern. The EPA’s authority to regulate global warming emissions is a critical safeguard that cannot be negotiated away. There may be middle ground possible here but further conversations with a wide set of stakeholders, including environmental justice groups, are critical.
  • A carbon price alone will not be sufficient to deliver the deep emission reductions consistent with climate goals; we need complementary policies to address other market failures. For example, policy incentives for innovation in low-carbon technologies are important. In sectors like transportation, a small surcharge on fuel prices won’t be enough to drive the big changes needed in vehicle fleets and the investments in infrastructure for public transit or electric vehicles, so other policies are needed. And we need policies to address non-CO2 emissions, such as methane.
  • What happens with the (considerable) carbon revenues is obviously a hugely important policy choice that must be made in consultation with lawmakers, with the interests of the broader public squarely in mind. Priorities—such as appropriately offsetting the disproportionate impacts of energy price increases associated with a carbon tax; transition assistance for coal workers and coal-dependent communities; assistance for communities facing climate impacts, especially frontline low income and minority communities; and investments in low-carbon infrastructure—require dedicated funding which could come from carbon revenues, or would require appropriations from Congress.

Getting (back) to bipartisan approaches on climate policy

In recent years, views on climate change have become politicized to the point that climate denial has become a form of tribal identity for most conservative-leaning politicians, and one more instance of the ‘just say no’ approach to any issue championed by the Obama administration.

Given the anti-science rhetoric from many Republicans in Congress, it’s hard to remember that there was a time when climate change was not a partisan issue. There was a time when Senators John McCain and Lindsey Graham and other leading Republicans not only openly accepted climate science but worked hard, together with Democrats, to find bipartisan solutions.

We got tantalizingly close to a national climate policy in the form of the American Clean Energy and Security Act of 2009 (aka the Waxman-Markey bill), which passed the House but was never brought to a Senate vote because of insufficient support. The failure of that legislative effort is what led to the EPA’s Clean Power Plan as an alternative. Regulation was not the first choice of the Democrats or of the Obama administration.

There is lots of blame to go around about how and why bipartisan approaches to addressing climate change have failed thus far. But we don’t have the luxury to wallow in past mistakes; we have to break through the partisan divide and act on climate now.

And that’s why I am particularly encouraged by a proposal from conservatives that attempts to bridge that divide, albeit imperfectly.

The future can be different

Call me a delusional optimist, but I fervently hope that Republicans in Congress will now feel free to acknowledge the reality of climate change because that position will no longer be associated with a Democratic administration. And that they will work to advance solutions that can help meet the urgency of the challenge we face.

Even during the Obama years, there were some who stepped out of the party line, including a group of Republicans who joined the bipartisan Climate Solutions Caucus in the House and those who signed on to the Gibson Resolution.

Yesterday, along with the news of the CLC carbon tax proposal, we also heard news of four new members added to the bipartisan Climate Solutions Caucus. The Caucus now has 12 Republican members and 12 Democratic members.

Maybe these types of bipartisan efforts will grow in strength and size and we will get to a political tipping point on climate action. Maybe climate science and smart solutions can take center stage instead of partisan politics. One can hope this happens soon…

Actually, no. Hope is simply not enough. We need action urgently.

Republicans (and Democrats) must step up

We cannot afford another four years of denial, obstruction, artful lies, and ‘just say no’ politics, aided by fossil fuel interests. Climate impacts are already imposing harms on Americans and a costly burden on our economy. The recent climate data are stunning and sobering.

Meanwhile, solutions like ramping up wind and solar energy are getting cheaper every year, and bring the promise of huge new economic opportunities IF we accelerate the momentum already underway.

Let’s build that clean energy infrastructure and create jobs. Let’s cut pollution from fossil fuels, which causes numerous health problems, including exacerbating asthma in children, contributing to heart and lung ailments, and even causing premature death. Let’s help coastal communities struggling with flooding worsened by sea level rise.

And let’s put a price on carbon while we’re going about it. There’s nothing partisan about any part of this bright vision for our future.

Still waiting for Republican leadership on climate change

Of course, President Trump must also show leadership from the top. His administration’s threats to dismantle existing climate and energy policies without any clear alternative plan are not a promising start. Thus far, the administration doesn’t show any indication of an interest in helping Americans facing the impacts of climate change, or recognizing the serious consequences of our continued dependence on fossil fuels.

If the president won’t lead, then Congress—including members of his own party—needs to have the courage to hold him accountable and advance their own climate solutions, perhaps along the lines of the CLC proposal.

The future will not be kind to this Congress and this administration if all they do is continue to find new creative ways to deny the science and dodge their responsibility to act on climate. We the people—Democrats, Republicans, and Independents alike—deserve much better from our government.

Electricity Rates Are Sorely Outdated. Let’s Give them an Upgrade.

Last month, to great and enthusiastic email fanfare, my utility presented me with a redesigned electricity bill. One meant to help me better understand the various costs and components that make up the final amount due. In an entirely relatable manner, my household met such news with chortles of joy. What a day!

But the utility’s trick? Colors and a stacked bar chart. They were nice colors, and yet…it proved a letdown. If our electricity bills contained just a bit more of the right information, we could collectively be saving billions of dollars a year, reducing air pollution all around us, and helping to bring ever more renewables online—a true step forward toward our vision of the modern grid. Now tell me that’s not a neat trick.

Shining a light on system costs

So what’s the right information, and how do we get it? Time-varying electricity rates, or rates that go up and down to let us know when it’s costlier and less efficient to be using electricity, and when it’s cheaper and cleaner.

As my colleagues and I explain in a new issue brief, Flipping the Switch for a Cleaner Grid, with that extra information we can make more informed decisions about how and when to use electricity, saving money and cleaning our air in the process.

Right now, most of us get charged the same flat rate for electricity no matter when we use it. But in reality, the actual cost to the system varies widely over times of day, days of week, and even seasons. These fluctuations in price are driven in large part by the need to meet ever-changing customer demand.

In particular, though we can’t see it with flat rates, our last bits of ill-timed load can mean sky-high prices as the system powers up inefficient plants, which we pay to build and maintain even though we use them for just a small amount of time each year. Talk about a wasteful design. By using price signals to mobilize flexible demand, time-varying rates flip this operations paradigm on its head.

Rates as guides

Time-varying rates use price signals to encourage customers to use electricity at some times and not others. Credit: UCS.

Time-varying rates are designed to encourage customers to alter when and how they use electricity. Different structures go about it in different ways to target different points of inefficiency. The figure on the right shows three of the most common forms: time-of-use (TOU) rates, critical peak pricing (CPP), and real-time pricing.

  • TOU rates (top right) target daily repeating patterns of peak and off-peak periods,
  • CPP rates (middle right) focus on just those few hours a few days a year when electricity use is at its very highest, and
  • Real-time pricing (bottom right) approximates the actual system cost in 5-minute to 1-hour intervals, which allows interested customers to best take advantage of the dynamic up-and-down swings of prices.
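To make the comparison concrete, here is a small Python sketch of how a household bill might differ under a flat rate versus a TOU rate. Every number here—the prices, the peak window, and the hourly load profile—is invented for illustration; this is not any utility’s actual tariff.

```python
# Illustrative comparison of a flat rate vs. a time-of-use (TOU) rate.
# All prices and the load profile below are hypothetical.

FLAT_RATE = 0.15  # $/kWh, charged in all hours

def tou_rate(hour):
    """Hypothetical TOU schedule: premium pricing from 4 to 9 p.m."""
    return 0.30 if 16 <= hour < 21 else 0.10

def daily_cost(load_kwh_by_hour, rate):
    """Total cost of a 24-hour load profile under a given rate schedule."""
    return sum(load * rate(hour) for hour, load in enumerate(load_kwh_by_hour))

# A simple 24-hour household profile (kWh per hour), heavy in the evening.
typical = [0.5] * 6 + [1.0] * 10 + [2.0] * 5 + [1.0] * 3

# Same 26 kWh of total daily use, but with 1 kWh/hour of the evening peak
# shifted overnight (e.g., by delaying the dishwasher and laundry).
shifted = [1.5] * 5 + [0.5] + [1.0] * 10 + [1.0] * 5 + [1.0] * 3

flat = daily_cost(typical, lambda h: FLAT_RATE)
tou_typical = daily_cost(typical, tou_rate)
tou_shifted = daily_cost(shifted, tou_rate)

print(f"Flat rate, typical use: ${flat:.2f}")
print(f"TOU rate, typical use:  ${tou_typical:.2f}")
print(f"TOU rate, shifted use:  ${tou_shifted:.2f}")
```

Under these made-up numbers, a customer who ignores the TOU signal pays more than under the flat rate, while one who shifts evening load overnight pays less—which is exactly the behavior the price signal is designed to reward.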

Time-based rates are not new; TOU and CPP rates in particular have been around for a long time, especially for commercial and industrial electricity customers. However, only the recent deployment of tens of millions of smart meters has made wide-scale, administratively low-cost programs readily attainable at the residential level.

Still, except for a few places where state-wide implementation of time-varying rates is on the table (see California and Massachusetts, for example), most utilities continue to see these rates as a boutique approach.

Put me in, Coach!

Despite their simplicity, time-varying rates can produce significant results for the grid by shepherding lots of individuals into taking small actions at the same time—in aggregate, all these little contributions can add up to major effects. Take a look at the example below, from New England, to get a sense:

New Englanders move as one when the Patriots are in the Super Bowl–namely, to in front of the TV at start time, and into the kitchen at the half. Credit: ISO-NE.

The left panel shows the load curve, or total electricity demand, for a regular winter Sunday in 2012; the right shows Super Bowl Sunday of that year, when New England played New York. Notice the narrowing of the peak and the spikes on the far right of the Super Bowl curve around 6:30, 8, and 10 p.m.? They correspond with the start, half-time, and end of the game, respectively.

Now the half-time spike might look small, but it’s actually in the range of a whole natural gas generator needing to come online. Time-varying rates provide a mechanism for coordinating that type of chip-and-dip-refill fervor in our everyday lives.

In practice, the options for shifting demand run from simple to high-tech. For example, doing something like pressing the “delay start” button on a dishwasher (or just waiting to press start) is an easy, no-upgrades-required fix. On the other hand, some forms of flexibility require a technology intervention before they can be used, like turning water heater tanks—commonly a large residential electricity load—into energy storage devices that heat water during off-peak periods for use whenever needed. Because these resources can be so valuable to the system overall, it can be worth it for utilities to sponsor some of the upgrades themselves.

Excitingly, the recent mass deployment of smart meters means that many new opportunities for shifting electricity use and responding to price signals are beginning to be explored. In particular, innovation around third-party aggregators controlling electricity-dependent devices—from air conditioners to electric vehicles, in ways that are imperceptible to users—could mean even bigger opportunities for savings.

Still, it’s important to look back at that Super Bowl example to remember that it doesn’t actually take much to make a big difference to the grid, and that what we can do today is already a lot.

Fast-tracking our clean energy future

When we talk about the benefits of flexible demand—including those resulting from time-varying rates—we usually focus on the immediate (and persistent) cost savings that occur from not bringing those last costly power plants online. But such benefits are only the beginning of the story. This is especially the case when we consider the needs our grid will have as we race toward a clean energy future supplied by vast amounts of renewable resources.

Time-varying rates can help support a brighter, cleaner, more joyful wind-powered world. Credit: UCS.

Because wind and solar power production is variable, we need ways to fill the gaps when the wind eases or a cloud passes. Additionally, as more and more solar power comes online, the grid can start to run into challenges when the sun sets; solar resources decrease electricity production right around when people are returning home for the night and starting to use lots of electricity.

To manage this variation, we’ve traditionally relied on fossil-fueled power plants. But that reliance comes with a number of strings attached, and often at the expense of renewables, as my colleagues in California have detailed.

Enter flexible demand. If we can guide electricity use to times when our renewable resources are most abundant—and away from when they aren’t—we can take a vitally important step forward on the path to a clean energy future, and make the many and varied goals of our modern, clean grid easier to reach.

Critically, to ensure that access to these benefits is equitable and widespread, it takes a well-designed, well-considered program, as we lay out in our issue brief and as our peers have been diligently monitoring in California.

Think time-varying rates are neat? Take a peek at all the other wonders of an upgraded grid

Here at UCS, we’re working hard to make sure the electricity grid is ready and able to bolster our vision of a clean energy future. Time-varying rates, and their ability to unleash the incredible power of flexible demand, are but one part of this vision. In the time to come, my colleagues and I will be sharing exactly how we see upgrades to the grid enabling this pursuit; for now, though, allow our new video calling for an upgraded grid to brightly shine a light:

Congress is Trying to Protect Federal Scientists Because President Trump Isn’t

Today members of the Senate, led by Senator Bill Nelson, introduced a bill to strengthen scientific integrity in federal decision making. If ever there was a time such a bill was needed, it is now.

Today, members of Congress introduce a bill to strengthen scientific integrity at federal agencies and enhance protections for government scientists. Photo: USDA

The Trump administration has already revealed its disrespect for the use of science in federal decision-making. From instating sweeping gag orders on federal scientists right out of the gate, to across-the-board hiring freezes and disruptive holds on grants and contracts, early indications suggest that this administration is not likely to be a leader in championing scientific integrity in government decision-making.

Moreover, the administration’s pick to lead the EPA, Scott Pruitt, has expressed limited understanding and respect for the EPA’s scientific integrity policy, noting in his confirmation hearing, “I expect to learn more about EPA’s scientific integrity policies.” In the face of such abuses, a move to strengthen scientific integrity at federal agencies is certainly welcome.

A bill to strengthen federal scientific integrity

Aimed “to protect scientific integrity in federal research and policymaking,” the bill requires federal agencies that fund or conduct science to adopt and implement scientific integrity policies, an idea initially introduced by the Obama administration in 2009. Specifically, the bill compels science agencies to develop scientific integrity policies that include specific provisions to enhance scientific integrity.

Importantly, the bill reinforces key elements of some federal agencies’ scientific integrity policies. It includes provisions requiring agencies to develop procedures that allow scientists to review and ensure the accuracy of public-facing materials that rely on their work, such as reports, press releases, and factsheets. This provision could help safeguard against political interference that might come from political appointees or public affairs staff that edit scientific documents before they are released. This type of political interference happened in several instances under the George W. Bush administration. Julie MacDonald, for example, a political appointee at the Department of the Interior, edited the scientific content of a document that served as evidence for listing the sage grouse under the Endangered Species Act.

A safeguard against such abuse could prove useful under a Trump administration, which has already suggested that it will emphasize uncertainty on climate science on NOAA websites and appears to be keeping a tight control on agencies’ scientific communications. The provision could be made even stronger by granting scientists the right to approve the scientific content of the public-facing materials that rely on their work.

Preventing political tampering

Another provision of the bill requires agencies to develop procedures that “identify, evaluate the merits of, and address instances in which the scientific process or the integrity of scientific and technological information may be compromised.” This is an important inclusion since to date, not all scientific integrity policies at federal agencies have detailed procedures for assessing the validity of and addressing allegations of scientific integrity abuses.

This lack of clarity in current agency policies has had damaging impacts on scientists who raise, or are accused of, scientific integrity violations. A scientist at Los Alamos National Laboratory, for example, appeared to have lost his job over publishing a paper that the Department of Energy didn’t like. When a scientist at the US Department of Agriculture was accused of violating the scientific integrity policy, he was subjected to a long review process that may not have included an independent assessment of the claims. Thankfully, both the DOE and USDA have revised their scientific integrity policies to strengthen the allegation evaluation procedures.  A law requiring all science agencies to make allegation procedures clearer would improve evaluation of scientific integrity violations across the government and give federal scientists fairer assessments.

The bill also requires the National Academy of Public Administration to conduct a study of scientific integrity across the government. This is a great idea and one that was included in our recent policy recommendations to the Trump administration. An independent assessment of the effectiveness of scientific integrity policies would provide illuminating findings on how the relatively new policies and procedures could be further improved.

A positive step in uncertain times

To date, 24 federal agencies have developed scientific integrity policies. The policies vary in quality, but in general they afford federal scientists rights to communicate, and include provisions to safeguard against political interference in science-based decisions. The bill would strengthen these provisions by uniformly applying some basic protections across all science agencies. This raises the floor on scientific integrity in the government.

Kudos to all 26 senators co-sponsoring this welcome legislation. This includes the ranking members of the Senate Environment and Public Works Committee (Sen. Carper), the Senate Energy and Natural Resources Committee (Sen. Cantwell), the Senate Health, Education, Labor, and Pensions Committee (Sen. Murray), the Senate Armed Services Committee (Sen. Reed), and of course, the Senate Commerce, Science, and Transportation Committee (Sen. Nelson).

Advancing Scientific Integrity Through Federal Advisory Committees

Back in October, I provided a comment at a public meeting for a National Academies of Sciences, Engineering, and Medicine (NASEM) advisory committee that was set up to review the process to update the Dietary Guidelines for Americans. Their first charge was to write a report with recommendations on how the Dietary Guidelines Advisory Committee (DGAC) selection process could be improved to provide more transparency, minimize bias, and include committee members with a range of viewpoints.

After some time to assess the DGAC’s process and consider the public feedback they received, the committee released the report last Friday. It includes several important proposals that would be beneficial for the DGAC, and really all federal advisory committees (FACs), to employ. My assessment of the report will come later, but first, I want to talk a little bit more about the importance of FACs, generally.

Quick facts on FACs

FACs play an indispensable role in providing independent science advice for our government’s decision making. The government relies on this technical advice from scientists outside the government on everything from drug approvals to air pollution standards to appropriate pesticide use. There are over 1,000 advisory panels within the federal government, some of which offer technical scientific advice that may be used by agencies to inform key policy decisions. Some advisory committees are mandated by law, while others are created for ad hoc policy guidance. The Federal Advisory Committee Act requires that agencies take measures to ensure transparency and ample public participation, but how and the degree to which these are implemented varies depending on the agency.

In our most recent report, “Preserving Scientific Integrity in Federal Policymaking,” we discuss the opportunity to improve the way in which federal agencies obtain objective technical advice from advisory committees so that conflicts of interest are minimized and fully disclosed. Several studies have shown a positive association between authors’ financial conflicts of interest and recommendations that benefit those vested interests. Likewise, an individual on an advisory committee may choose to sideline the evidence and instead make recommendations that favor his or her special interest, especially if they stand to profit in some way. Federal advisory committees have been co-opted by industry for political reasons before, including when G.W. Bush administration officials pushed existing committee members out and replaced them with appointees in order to reject the prospect of stricter lead poisoning standards.

The DGAC plays the essential role of analyzing heaps of nutrition and epidemiological data and making recommendations to the U.S. Department of Agriculture (USDA) and the Department of Health and Human Services (HHS) to inform the Dietary Guidelines for Americans, which are released every five years. As a lover of food and a student of food policy, I rely on the DGAC to translate science into objective recommendations that will ultimately shape federal nutrition guidance and regulations spanning from school lunches to nutrition facts labels. UCS commended the DGAC on its 2015 report to HHS and USDA, most notably for the way in which it followed the science to recommend that Americans consume no more than 10 percent of daily calories from added sugars.

NASEM’s report challenges undue influence of science

The NASEM committee’s report identified five values upfront that would enhance the integrity of the DGAC selection process, which closely echo the core values we identified for ensuring scientific integrity in federal policymaking:

  • Enhance transparency
  • Promote diversity of expertise and experience
  • Support a deliberative process
  • Manage biases and conflicts of interest
  • Adopt state-of-the-art processes and methods

For the reasons I mentioned earlier, the fourth value could use strengthening to something more like “Minimize and manage biases and conflicts of interest,” to emphasize that conflicts should be avoided, if possible, to maximize objectivity.

Figure: NASEM

As for its concrete guidance, the NASEM committee suggested changes to HHS and USDA’s process (see figure at right), including that when the departments first solicit nominations for the DGAC, they should “employ a third party to review nominations for qualified candidates.” This would add a crucial layer of independent review into the process, especially if, as NASEM recommends, the third party is an “organization without a political, economic, or ideological identity,” and not necessarily an expert in nutrition or dietary guidance. The NASEM committee would also add a public comment period after the provisional committee is selected by the departments, allowing an opportunity for the public to weigh in on any potential biases or conflicts of interest of the proposed members. We strongly agree with NASEM’s assertion that “candid information from the public about proposed members is critical for a deliberative process.”

The report also recommended that the departments create and make public strict policies on how to identify and manage conflicts of interest and mandate that committee members sign a form that captures nonfinancial conflicts of interest and biases, since that is not currently covered by the required Office of Government Ethics form. Additionally, the committee elaborated on what “management” of conflicts of interest looks like in practice and had some helpful ideas like granting waivers in limited amounts (and making them public) depending on the type of conflict, asking that individuals sell stock or divest property to avoid conflicts, excluding members with conflicts from certain discussions and voting, or allowing for a review of potential conflicts of interest to be discussed at the beginning of each meeting. The committee also suggested that a statement be added to the final DGAC report to review how biases and conflicts of interest were managed throughout the advisory committee’s work.

Overall, the report managed to cover most of the recommendations I made in my public comment, but one thing that I hope the committee explores in its future deliberations is the prevention of undue influence from department leadership after the DGAC report has been submitted, since that is where the translation of science into policy is most critical. DGAC is solely advisory and should not have a role in writing the final Dietary Guidelines report, but it would be appropriate for former DGAC members to have a role in peer review, ensuring that the report language fairly considers the best available science and aligns with DGAC’s recommendations. This last part of the process proved controversial in the most recent version of the Dietary Guidelines: the DGAC recommended that environmental sustainability concerns be included in the DGA, because the overall body of evidence points to a dietary pattern higher in plant-based foods and lower in meat, but the final report did not include these important concerns.

NASEM should follow its own advice on conflicts of interest

In light of this report, it seems that NASEM should follow its own advice as it considers itself to be a purveyor of nonpartisan, objective guidance for policymakers, but has been recently scrutinized for conflicts of interest on its own panels. This past December, the New York Times reported that NASEM put together a committee of 13 scientists to make recommendations on regulation of the biotechnology industry, and failed to disclose the clear conflicts of five of the committee members. In fact, the majority of committee members had conflicts (7 out of 13), and the NASEM study director was applying for a job at a biotechnology organization while he was putting together his recommendations for committee members. If that isn’t egregious enough, three of the committee members he recommended for the NASEM biotech panel were actually on the board of the organization at which he was seeking employment. This level of undisclosed conflict is completely inappropriate and should have been caught in the early stages of the committee selection process, not uncovered after the final report had already been released. NASEM should strive to “promote diversity of expertise and experience,” as the committee identified as a core value, rather than stack committees with individuals that have similar industry experience and connections.

Ode to independent science

Independent science at its core must be free from undue political or financial pressure. We of course acknowledge that all policy decisions are not made based on science alone, but in order to create the best possible government policies, the relied-upon science must be independent. We appreciate the work that this committee is putting into advising DGAC on how best to ensure the process facilitates truly objective science advice, because FACs are vulnerable to politicization or interference if not carefully managed. This report should be considered by all federal agencies and other entities, including NASEM itself, that seek to provide scientific advice to policymakers for the benefit of us all.

Restoring America’s Wetland Forest Legacy

Like many white, middle-class, suburban kids, I grew up with one foot in the forest. To me, that small woodlot, a green buffer along a half-polluted tributary, was a paradise unmatched by any other forest in the world. Unfortunately, like many other tracts of land across the United States, my childhood forest is gone—cleared for a housing development.

Wetlands, including wetland forests, are the “filters” of our natural system, combating pollution, removing excess nutrients, and securing fresh drinking water for surrounding and downstream communities. Photo: Dogwood Alliance.

Wetland forests offer massive economic benefits

Even small forests across the United States work to provide “ecosystem services”—non-monetary benefits like clean water, clean air, carbon sequestration, hunting, fishing, and yes—recreation for children. Ecosystem services may sound like “lip service” to the natural world, but they’re not. New York City chose to spend $500 million to protect and preserve its upstream watershed (and the resulting water quality), to avoid the $3–5 billion price tag of a new water supply system on the Hudson River. Forests in the U.S. offset about 13% of our yearly carbon emissions. In 2002, southern forests supported over a million jobs in the recreation/tourism sectors, generating $19–76 billion in annual revenue. All of these services require healthy, standing forests across the landscape.

As our country continues to grow, we are increasing the pressures on our forests. We need clean air and clean water, but we also need wood products, food, and housing. As Research Manager at Dogwood Alliance, I work every day with other organizations and communities to improve the quality and quantity of southern forests. Much of my day-to-day is focused on coordinating and organizing a new initiative, the Wetland Forest Initiative, to conserve, restore, and improve southern wetland forests.

Cypress-tupelo forests, also known as bottomland hardwood forests, can contain trees that live for over a thousand years. Photo: Dogwood Alliance

Wetland forests are the best of both worlds. You can visit during a dry season to walk beside ancient trees, or explore during the wet season by kayaking in submerged habitat, teeming with aquatic invertebrates, migratory birds, fish, reptiles, and amphibians. “Wetland forest” describes so much of the American landscape—from forests edging creeks and the culturally treasured bayous; to coastally influenced forests, which somehow survive the onslaught of the ocean. Wetland forests span 35 million acres across 14 southern states, and provide twice the ecosystem services value of upland forests.

Taking action to save our wetlands

Yet, with a majority of wetland forests lost—cleared for agriculture, drained for commercial or residential development, even cut and converted to fast-growing commercial pine plantations—we are at a fork in the road. Will we allow our wetland forests to dwindle to less than one percent of their original range, like we did with longleaf pine? Or will we take action now to conserve these vital ecosystems, before it’s too late?

Wetland forests are home to many endemic species, found nowhere else on earth. This photo was taken during a flyover search for the swallow-tailed kite, a bird native to southern wetland forests. Photo: Maria Whitehead

The Wetland Forest Initiative is working to conserve, restore, and improve these habitats. In special places, we will work to protect the legacy of rare, threatened, and endangered species, ensuring that they will have habitat for decades to come. In places where wetland forests have been degraded by lack of management, changes in hydrology, or pollution, we will work with local groups and governments to restore the land to ecological function. Beyond the tree line, we will work with politicians and government agencies to ensure that landowners receive fair compensation for their restoration and conservation efforts. And perhaps most importantly, we will work with communities to educate them about the beauty and importance of what’s happening on the ground in their local wetland forests.

Although I never thought I would leave academia, I am happy to spend my working hours on a project that has the potential to impact 35 million acres across 14 states. Despite the differences in opinion that some of our member organizations may have, it is inspiring to see so many people from different walks of life (academic, community, environment, forestry, government, land management, landowners, social justice, and tribal) come together and create meaningful change. I am excited for the future of our southern wetland forests.

I encourage you to head over to the Wetland Forest Initiative website to learn more, endorse the platform, and get your organization or university involved.

Sam Davis is Research & Program Manager at Dogwood Alliance. A life-long treehugger, Sam earned a Ph.D. in Environmental Science in 2015 at Wright State University, and completed a postdoc at University of California Merced before leaving academia for greener forests. Sam is thrilled to be translating science into action with Dogwood Alliance. On the weekends, Sam enjoys hiking, home improvement, and gaming with friends and family.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Oregon’s Climate Check-Up Offers Serious Prognosis Without Preventative Action

Each January, I journey to my doctor’s office for my annual physical. She briefly reviews my medical history before conducting an examination, and we end our visit by discussing key risk factors and a plan to manage them.

Well, just in time for the start of the 2017 legislative session, Oregon received its periodic “climate physical.” The results are sobering, and the treatment plan involves further action to put the Beaver State on the path to a low-carbon, climate-resilient economy – a path to good “climate health.”

Oregon, like other states, is already experiencing climate change

The Third Oregon Climate Assessment Report by the Oregon Climate Change Research Institute (OCCRI) incorporates findings from recently published studies on climate science and impacts in Oregon.

Hotter and drier conditions caused by climate change contribute to increased wildfire risks and other key impacts in Oregon. Source: UCS

The legislatively mandated report reaffirms what scientists have been telling us. Oregon is already experiencing the impacts of climate change, and human activity has played a key role. It’s a stark contrast to statements by several of the Trump administration’s cabinet nominees.

According to the authors, global emissions of heat-trapping gases are largely responsible for the overall increase in average annual temperatures in the Pacific Northwest over the past century. (Yes, despite an unusually cold winter, the statewide average temperature for 2016 was still much warmer than average.) They found additional signs of human-caused global warming in the 2015 record-low snowpack, more acidic waters off the Oregon coast in 2013, and wildfire activity over the past three decades.

A future of more extremes in every region of the state

Oregonians will face more severe impacts in the future if we continue on our current global carbon emissions trajectory. As shown in the table below, annual temperatures could increase by an average of 8 degrees Fahrenheit by the century’s end compared to the late 20th century.

Average temperatures will continue to rise in Oregon compared to the late 20th century under both low and high emissions pathways. Source: Oregon Climate Change Research Institute

Rising temperatures will mean a shrinking snowpack, earlier snowmelt, and diminished summer water supplies, as well as increased wildfires and more acidic oceans that affect coastal ecosystems. Sea-level rise will lead to more coastal flooding and erosion. There will also likely be overall negative impacts on agriculture over time.

The 100+ page report provides detailed information and projections for each of these impacts. One of the most striking findings is that higher temperatures and a record-low snowpack despite normal precipitation levels – conditions that led to the devastating 2015 snow drought – could become commonplace by mid-century.

Another key takeaway is that climate change will affect every region of Oregon. It will also disproportionately impact tribal communities, as well as low-income and rural residents and communities of color. The assessment divides the state into four regions, with snapshots of anticipated climate impacts over the rest of the century:

  • The Coast: Due to rising sea levels, thousands of homes and more than 100 miles of road face a greater risk of inundation. Warmer and more acidic oceans will affect near-shore fisheries and hatcheries, endangering the local shellfish economy and the workers who rely on that industry. Wildfires in coastal forests will likely become increasingly common as well.
  • The Willamette Valley: Heat waves will grow in frequency and intensity as temperatures continue to climb, increasing heat-related illnesses and deaths among the region’s residents. Studies project increasing summer water scarcity and growing wildfire risks that could significantly expand burn areas.
  • The Cascade Range: Precipitation will increasingly fall as rain instead of snow, affecting the ski industry and water supplies. At the same time, forests will likely become even more vulnerable to wildfire, insect infestations, and disease. Increased risk of wildfire-related respiratory illnesses is a key health concern for Jackson County.
  • Eastern Oregon: As snowpack shrinks, water supply will be a concern, especially for residents in the John Day basin with no man-made water storage capacity. Drought is a key health risk for Wasco, Sherman, Gilliam, and Crook counties. The Blue Mountains will also likely experience higher tree mortality and wildfire activity.

Ambitious climate action is the prescription

The Third Oregon Climate Assessment Report includes good news for Oregonians. The worst climate impacts can be avoided through ambitious efforts to curb global carbon emissions.

The Beaver State has already taken significant steps to decarbonize its economy, yet it’s still not on track to meet its near-term 2020 emissions goal. Two key next steps for Oregon are ensuring that any transportation funding package helps reduce global warming emissions from the transportation sector, and putting a price on carbon. A carbon price is an important tool in the overall portfolio of critical policies for cutting heat-trapping pollution.

The Oregon legislature should show continued leadership by heeding the experts’ prognosis and taking further preventative climate action today to ensure its climate health tomorrow!

