Reinventing Biotechnology Regulations
Barbara J. Evans - University of Florida

Clayton Christensen’s theory of disruptive innovation saw new technologies as mere enablers of transformative change. Making a splash requires two more things: business model innovation and new value networks that deliver the technologies in ways that are profitable, affordable, safe, and accessible to consumers. New business models, in turn, require new regulatory models. Gene editing promises to cure rare forms of hereditary blindness, but at prices above $500,000 per eye if approved under the FDA’s last-century biologics regulations. Can the blind afford that?

Instead of passing biotechnology-specific legislation, the United States adopted a Coordinated Framework for Regulation of Biotechnology (CF) in 1986, with revisions in 1992 and 2017. The CF tapped existing federal regulators like the Food and Drug Administration, Department of Agriculture, and Environmental Protection Agency to oversee new biotech products using legal powers they already had. The CF agencies are safety regulators, protecting consumer, patient, environmental, agricultural, workplace, and other types of safety. But the CF ignores other regulatory concerns, including infrastructure policy.

Regulatory scholar José Gómez-Ibáñez defines “infrastructure” as “networks that distribute products or services over geographical space.” Biology-based manufacturing demands diverse new infrastructures: data commons, facilities, and service networks, some exhibiting economies of scale that make shared assets superior to fragmented efforts.

The United States relies on private-sector infrastructure but subjects it to economic regulations that incentivize investment, promote responsible operation, and capture economies of scale while controlling monopolies to ensure fair access and pricing. Our current infrastructure regulatory model emerged in 1887 to regulate railroads and went on to foster national infrastructures as varied as stockyards, telecommunications, electricity, and aviation. Modest reforms late in the 20th century harnessed market incentives to sweeten command-and-control tactics. Can this 134-year-old regulatory model call forth vast new infrastructures to support biology-based manufacturing? Perhaps, but not until policymakers recognize infrastructure as a crucial biotechnology policy issue.

Even as a safety framework, the CF falls short. It is a patchwork of antiquated statutes, some dating to the early 20th century and not designed for today’s biotech industry. There are legal gaps where novel products slip through with no safety oversight. Fortunately, the CF agencies are nimble in interpreting old laws in new ways to enable basic safety and environmental oversight for many products. The real problem lies not with the occasional gap in an otherwise well-functioning regulatory model, but with the regulatory model itself. The concept of “safety regulation,” as practiced in the last century, was designed for industries with a fairly small number of large manufacturers selling mass-marketed products and operating at a scale that covers the costs of generating evidence to support detailed premarket review.

Today’s biotech industry upends old regulatory models in ways also seen in the sharing economy, exemplified by people who rent out their homes through platforms like Airbnb. When millions of people rent out their homes for a night, how can a hotel regulator find them to regulate them? The platform that holds the information regulators need (who rented a room?) does not itself provide lodging and is beyond the reach of hotel regulators. New biotechnology business models pose analogous challenges for CF agencies like the FDA. Business functions once integrated are now split across multiple players, some beyond the reach of CF regulators. It is neither practical nor cost-effective to inspect thousands of decentralized facilities and service providers. The ponderous regulatory reviews of the past could destroy the economic viability of precision-manufactured products.

Regulatory models tailored for the last Industrial Revolution will not fit this one. The CF was a bridge. Biotechnology crossed it to a new shore. New models and types of regulation are now required.

Need a National Program to Scale Up
Douglas Friedman - BioMADE

Imagine a world with a closed loop of sustainably manufactured products, all made within 200 miles of your home. Now imagine those products are made with waste from local communities and organic matter that would otherwise be landfilled. This is the future that distributed industrial biomanufacturing can enable.

The bulk of industrial chemical production (by volume) in the United States comes as byproducts of energy-intensive petroleum refining. These chemicals are produced in billion-dollar refineries and then transported throughout the country to manufacturers that bring higher-value goods to market. The drivers of this system are inefficient: fuel consumption, not chemical use, drives energy markets, and shortages and surpluses are commonplace because chemical supply and demand are decoupled.

Distributed biomanufacturing can help avoid these problems. First, with the capital cost of a biomanufacturing facility a fraction of that of a petroleum refinery, we can build them in a distributed network throughout the country — bringing jobs to communities. Just think of the number of craft breweries that have appeared in nearly every major city over the last two decades. Manufacturing facilities can use local feedstocks that don’t need to be shipped long distances: corn in the Midwest, sugar beets in Michigan, switchgrass in Virginia, and almond hulls in California’s Central Valley. Even trash from municipal waste is contemplated as a biomanufacturing feedstock. These facilities then produce products for businesses in the region and can be directly responsive to local demand.

Three things are holding us back. One is an inability to predictably reach commercial-scale production. A second is a lack of a national bioeconomy strategy. Last is a state of regulatory confusion.

Biology can be used to make almost any molecule imaginable, but often only ounces at a time. Commercial industrial products often require tons of material. The Pentagon recently awarded a $275 million cooperative agreement to establish a nonprofit bioindustrial manufacturing innovation institute; I am proud to serve as CEO of the result, BioMADE. BioMADE will focus on developing the technologies necessary to achieve scale more predictably, investing not only in discrete scale-up opportunities but also in downstream processing and data analytics. Companies, universities, and nonprofits, as well as environmental health, safety, security, and other professionals, will work together to establish an ecosystem responsive to this challenge.

Predictable technology development is not enough; a national strategy is critical. To get there, the federal government should develop a National Bioeconomy Strategy focused on transitioning our world-leading biotechnology capabilities from research into economic development. In the last Congress, the Bioeconomy Research and Development Act of 2020 passed out of Senate committee. That legislation would have established such an initiative and created an office to coordinate these national objectives. The new Congress should take up these ideas again and establish a national program to expand biotechnology for the bioeconomy.

Finally, the regulatory environment for products of biotechnology is confusing and a challenge for new entrants into the market. The Coordinated Framework for the Regulation of Biotechnology, which explains the complex set of laws and regulations that apply to biotechnology, was originally published in 1986. Despite the efforts of the executive branch to clarify the legal environment for biotechnology products, the underlying legislation has not kept up with advances in science and engineering and should be reviewed. I make no statement about what the result should be, other than that it be clear to companies in the industry so they can develop their products within a known set of parameters.

Bioindustrial manufacturing is at an inflection point. Global competition is intensifying, and the United States is well positioned to compete in the new bioeconomy, but it will take concerted action to do so. It is not too soon to start.

Bioengineering the Future
David Rejeski - Environmental Law Institute
Mary E. Maxon - Lawrence Berkeley National Laboratory

In 1898, the British chemist William Crookes gave a talk before the British Association for the Advancement of Science entitled simply, “The Wheat Problem.” Crookes is best remembered for his work on vacuum tubes and on lenses that were precursors to today’s sunglasses, so his focus on wheat production probably startled his audience, especially since his thesis was alarming: wheat was extracting more nitrogen from the soil than we could replenish, which resulted in ever lower yields and “a life and death question for generations to come.”

It took another decade, but in 1908, the German chemist Fritz Haber (later referred to as the “father of chemical warfare”) provided a solution to the wheat problem by demonstrating that ammonia, the main component for nitrogen fertilizers, could be synthesized. The manufacturing of ammonia for fertilizer is one of the great innovations of the 20th century. Some researchers estimate that its introduction in agriculture has since supported over 40 percent of global births.

But, as has been the case for many technological leaps, there were downsides. Today, the synthesis of ammonia accounts for a quarter of the annual greenhouse gas emissions of the entire chemical sector, while agricultural run-off laden with nitrogen fertilizer increasingly pollutes waterways. Other options are being explored — from synthesizing ammonia using plasma to low-temperature electro-catalysis — but the most intriguing solution is biological.

Some plants, mainly legumes like beans, have microbial partners with an amazing capability to extract and “fix” nitrogen directly from the atmosphere for immediate use by the plants. What if that genetic function could be transferred to crops like corn? That is exactly what is happening. Hundreds of millions of dollars are being poured into new approaches, and firms like Pivot Bio are developing products that they hope will one day make crops self-fertilizing, addressing both environmental and food security challenges.

Over the last decade, over $12 billion has been invested in new biotech startups and existing companies, with around $4 billion in 2018 alone. The pandemic has riveted our attention on health care applications, but as a recent report from McKinsey notes, “More than half of the potential direct economic impact from biological technologies . . . is outside of health care, notably in agriculture and food, materials and energy, and consumer products and services.”

Some of these emerging applications you may have already heard about, or even tasted. Memphis Meats and Mosa Meat, just two of the more than 80 companies now working on cultured meat and seafood protein products using a process broadly referred to as cellular agriculture, are growing beef, pork, chicken, and even duck meat from cell cultures in the lab. These approaches are being applied to a broad spectrum of dietary products. Finless Foods, for example, is applying cellular agriculture technologies to grow fish cells in the lab. It isolates cells from fish tissue, feeds the cell cultures with nutrients to grow and multiply, and structures them into seafood products — all in local facilities, which further reduces transportation-related environmental impacts.

As another example, researchers at the Joint BioEnergy Institute, funded by the Department of Energy, have recently developed a plant biomanufacturing platform that was used to synthesize a new-to-nature biopesticide with novel antifungal properties. This suggests that plants can be used to sustainably manufacture molecules that are not accessible through traditional chemical methods.

All this is the tip of the revolution in what is termed engineering biology, and it signals a shift from chemical to biological synthesis — to a new manufacturing paradigm. An inventory maintained by ELI to track emerging biotech products and applications now contains over 300 examples stretching across almost two dozen categories, from food to fuel to threat detection.

People are beginning to build with biotechnology. The sustainable building materials startup bioMASON combines microorganisms with sand in an aqueous solution to create bricks and other construction materials, a process that is not only faster than the traditional kiln-fired process but also releases no carbon because it requires no fuel or heat. Traditional brick making not only emits CO2 and other gases into the atmosphere, but often involves the removal of agriculturally productive topsoil, which can reduce agricultural yields by 60-90 percent. Another innovative and sustainable materials startup, Cruz Foam, uses one of the most abundant natural polymers on Earth, chitin from shrimp shells, to sustainably manufacture packaging materials, automotive parts, and consumer electronics.

Novel solutions to tackle indoor air pollution are in the pipeline. Researchers at the University of Washington have inserted a mammalian gene (CYP2E1) into ivy plants to increase their detoxifying potential. The gene “codes” for an enzyme that breaks down some of the volatile organic compounds found in homes. The researchers estimate that a biofilter made of these genetically modified plants could deliver clean air at rates similar to commercial home filters.

Next-generation biotech firms are exploring new avenues to address old, intractable environmental challenges. A new effort at Allonnia, backed by Gates Ventures and the Battelle Memorial Institute, will search for enzymes or microbes that could tackle the long-lasting risks from so-called “forever” chemicals — per- and polyfluoroalkyl substances found in thousands of nonstick, stain-repellent, and waterproof products.

Biotech is starting to provide promising solutions aimed directly at the global carbon cycle that could help address the 37 gigatons of carbon dioxide released annually into the atmosphere — creating carbon-neutral or de-carbonization options for a number of economic sectors, such as agriculture, construction, and some forms of transportation — aviation, for example — that are less amenable to traditional carbon-neutral strategies. Aviation currently accounts for 2 percent of global carbon emissions. Unfortunately, weight restrictions on aircraft eliminate many of the other carbon-neutral options being considered for the transportation sector, such as batteries or fuel cells. But researchers at the University of Manchester in England have re-engineered the genome of a bacterium (Halomonas) that grows in seawater to produce next-generation bio-based jet fuels.

Research is also targeting direct interventions in the carbon cycle, by increasing the carbon capture efficiencies of plants and trees. Today, around 120 gigatons of carbon is removed by terrestrial photosynthesis on an annual basis, so even small improvements could have large impacts: a 1 percent efficiency gain would represent on the order of a gigaton of additional carbon drawn down each year, while simultaneously improving crop yields and food security. Research is underway to redesign photorespiration and CO2 fixation pathways, optimize light reactions during photosynthesis, and transfer carbon-concentration mechanisms from algae and bacteria into other plant chloroplasts.

Biotech is creating new avenues for climate change adaptation — for instance, the engineering of drought- and disease-resistant crops. Researchers at the Innovative Genomics Institute at Berkeley have developed cacao plants engineered to thrive as the climate warms and dries the rain forests where the crop normally grows. As many as 50 million people worldwide make their living from the cacao industry.

Long term, biology can be a key to creating a circular economy, where decentralized and distributed biomanufacturing systems are designed to use a variety of inputs. These include chemicals from industrial off-gases; syngas generated from municipal solid waste, organic industrial waste, forest slash, and agricultural waste; or reformed biogas. These systems provide a variety of outputs, from fuels to food or vaccines. This kind of production flexibility is one objective of the new BioMADE initiative developed by the Department of Defense and the Engineering Biology Research Consortium. The seven-year award includes $87.5 million in federal funds and is being matched by more than $180 million from non-federal sources, including state governments.

This future rests on the increasing ability to engineer biology to enable what researchers at the firm Zymergen have coined biofacturing. Jason Kelly, the CEO of Ginkgo Bioworks, predicts, “As we get better at designing biology, we’ll use it to make everything, disrupting sectors that the traditional tech industry hasn’t been able to access.”

Old biotech was messy, expensive, and imprecise. It would often take large companies hundreds of millions of dollars and years to change the properties and behavior of one molecule. No more. To paraphrase the economist Paul Romer, the new biology is about better recipes, not just more cooking.

Today’s biology goes beyond the “study of complicated things,” as the British evolutionary biologist Richard Dawkins once put it. Over a decade of significant investments by organizations like the National Science Foundation, the Department of Energy, and the Defense Advanced Research Projects Agency have turned biology into what some have termed a Type 2 innovation platform, similar to the Internet, which “consists of technological building blocks that are used as a foundation on top of which a large number of innovators can develop complementary services or products.” Think of today’s biology not as a science, but as a precision-manufacturing platform — digitally interconnected, increasingly automated, flexible, and cost-effective.

These novel biological engineering approaches share one critical characteristic: the ability to run experiments quickly, testing hypotheses, learning, adjusting — what some have termed the Design-Build-Test-Learn cycle. Making things faster has been lauded as the single most important determinant of manufacturing productivity and was historically a critical focus of companies such as IBM (via Continuous Flow Manufacturing), Motorola (Short Cycle Management), and Westinghouse (Operating Profit Through Time and Investment Management). Jack Newman, a co-founder of the biotech firm Amyris, observed that the DBTL cycle “was transformational, allowing the operational translation of fundamental science into stuff.”

These new capabilities have spawned radically new business models, allowing the disaggregation of the historical value chains that have long dominated medical and agricultural biotech. This is happening even as large first-wave biotech firms tend toward consolidation bordering on monopolistic aggregation, as in the recent mega-merger of Bayer and Monsanto. But simultaneously, what some term de-verticalization is creating viable business niches in new economic ecosystems, where many new firms work to design the molecules that can be scaled by larger firms downstream in the value chain.

But going to scale remains a large challenge facing the bioengineering community. This will mean moving from a few milligrams of a novel microbe in the lab to kilograms, kilotons — and beyond in the case of commodity products. Going from lab to commercial-scale production will require a bridge, a distributed and sharable infrastructure that can be co-developed with industry. It will need a new workforce with the necessary skills to engineer large-scale, distributed, and flexible production facilities and the ability to build life cycle and sustainability considerations into manufacturing processes and their associated supply chains.

And going to scale with potentially hundreds or thousands of large-capacity bioreactors will bring the new biotechnology face-to-face with the public and media, raising questions about safety, security, and governance. Moving forward, there is an urgent need for regulatory and policy reinvention. There is an old adage in Silicon Valley that innovation requires a combination of “rich people,” “nerds,” and “risk taking.” That may not be enough. There are some important ways in which biology differs from other innovation platforms. The most crucial are the regulatory, security, and public perception barriers that may hinder the introduction of new products into the market.

Despite these challenges, over a decade of progress and emerging business opportunities have motivated many countries to develop bioeconomy strategies designed to expand their industrial base and accelerate the commercialization of biotech innovations. There are now nearly 60 bioeconomy strategies for nations and for a number of macro-regional areas like the European Union and East Africa. Thousands of people now attend the biennial Global Bioeconomy Summit held in Berlin (virtual this year). The United States was an early leader, developing a government-wide National Bioeconomy Blueprint in 2012 under the Obama administration. It emphasized the role of the biosciences and biotechnology in creating new economic opportunities.

The 2012 Blueprint was the first, and for the better part of a decade the only, bioeconomy strategy that featured biotechnology as a critical platform technology to drive economic benefits in the biomedical, agricultural, environmental, energy, and industrial sectors. The Blueprint promoted making strategic and non-overlapping research and development investments, facilitating transitions from lab to market, increasing regulatory efficiency, enabling public-private partnerships, and supporting strategic workforce development. In the years that followed its release, the Obama administration realized a number of outcomes relating to all five of these strategic objectives.

For instance, significant research investment enabled the discovery of CRISPR/Cas9, a genome-editing technology that has dramatically accelerated the ability to quickly and precisely edit the genomes of microbes, plants, and animals. The Department of Agriculture expanded the BioPreferred Program, the federal biobased procurement system that aims to provide market certainty for the growing industry sector. Then in 2015, Executive Order 13693, titled Planning for Federal Sustainability in the Next Decade, required federal agencies to set biobased procurement targets. The Office of Science and Technology Policy convened the Food and Drug Administration, EPA, and USDA to execute the 2017 Update to the Coordinated Framework for the Regulation of Biotechnology, which aimed to increase transparency, ensure safety, streamline regulatory processes, and accelerate the translation of bioinventions to market. A successful public-private partnership between LanzaTech and Pacific Northwest National Laboratory resulted in the development and testing of the first bio-jet fuel, used to power a Virgin Atlantic flight from Orlando to London. Finally, the Engineering Biology Research Consortium, a public-private partnership that launched a technical roadmap in 2019, has advanced the Blueprint’s workforce objective by establishing a four-month industry internship program for Ph.D. candidates to help train the next generation of the engineering biology workforce.

Since the National Bioeconomy Blueprint was released, a number of additional important advances have occurred. In 2019, the House of Representatives passed the Engineering Biology Research and Development Act of 2019, which aimed to direct the Office of Science and Technology Policy to implement a national engineering biology research and development program coordinating relevant federal agency investments and activities. The Senate followed with the Bioeconomy Research and Development Act of 2020, with a similar aim. Also in 2020, the National Academies of Sciences, Engineering, and Medicine released a study, “Safeguarding the Bioeconomy,” that articulated — for the first time — the value of the U.S. bioeconomy, which it estimated at $959 billion annually. The report argued that the United States needs a White House-level standing committee of scientists, economists, and national security experts to develop a strategic plan to promote and protect the nation’s biology-based industry.

These actions portend a future wherein a strategic, coordinated federal effort is possible. Toward this end, additional steps are needed. For instance, the Biden administration should consider creating an office to coordinate interactions between the government and businesses, large and small, on bioengineering. It should be a one-stop shop — similar to what the National Nanotechnology Coordinating Office did for the National Nanotechnology Initiative.

To realize a strategic, coordinated U.S. bioeconomy, policymakers will need to advance not only authorization for a national engineering biology research and development program, but also appropriations to fund it. Any appropriations should be linked to regular evaluation of program impacts and proactive anticipation and management of emerging risks to help ensure public confidence in new and novel products and applications. A recent meta-analysis of the national bioeconomy strategies found that, “Only a minority . . . even mention the potential negative consequences of bio-based transformations.”

Significant strategic infrastructure investments are needed. For example, a new constellation of state-of-the-art, networked biomanufacturing facilities, positioned near sources of biomass, could not only maximize the use of renewable resources but also create high-tech jobs in rural areas. Facilities in Iowa, for instance, could use agricultural waste from corn as a feedstock, those in southeastern states could utilize switchgrass, and coastal production plants could take advantage of marine species such as seaweed and various kelp varieties. This biomanufacturing “commons” could also serve to reduce greenhouse gas emissions and the generation of toxic waste as compared to traditional chemical manufacturing. And it would create value from problematic wastes such as forest slash and agricultural residues.

Building on the progress started by the National Bioeconomy Blueprint developed during the Obama administration, the incoming Biden team has a tremendous opportunity to renew the nation’s commitment to the U.S. bioeconomy as an important pillar of its commitment to climate action. Its new “Made in All of America” effort is aimed at revitalizing domestic manufacturing with inclusive policies and environmental stewardship.

Working together with the 117th Congress, the new administration has the potential to realize a Clean Manufacturing Act aimed at mobilizing the diverse talent of the American workforce, accelerating sustainable manufacturing innovation, maximizing the use of the billion tons of sustainable, renewable biomass the United States is able to produce, and significantly reducing the negative environmental impacts of manufacturing.

As nearly 60 countries around the world refine their bioeconomy strategies to include biotechnology to help reboot economies crippled by the coronavirus pandemic, the United States has little time to waste in developing strategies to keep its leadership position in biomanufacturing. Over a decade ago, Neri Oxman at MIT’s Media Lab observed that “the biological world is displacing the machine as a general model of design.” That revolution has happened. The future of manufacturing has arrived. TEF

COVER STORY A sustainable, circular economy may depend on solutions coming from life itself. So think of today’s biology not as just a science, but as a precision-manufacturing platform — digitally interconnected, increasingly automated, flexible, and cost-effective.

Oil and Toil, Double Trouble
James Barrett - Barrett Economics

Perhaps more than any other energy source, oil is an icon. While we use lights, heating, and cooling on a daily or hourly basis, for most people oil (in the form of gasoline) is probably the only fuel we ever handle directly. The same goes for price: we confront gas prices every time we fill up our cars or drive by a filling station, and those of us old enough remember the shortages and long lines at gas stations in the 1970s. Our connection to oil is so strong that rising prices can influence presidential elections. Not coincidentally, politicians at almost all levels have some position on where, how, and whether we should drill for more in an effort to lower costs and boost job growth.

The reason that oil plays such a central role in our energy system is its dominance in moving goods and people. It is better than most alternatives at being easily portable in large enough quantities to reach a destination, and service stations dot the roadside, providing quick and easy refueling. The transportation sector accounts for over 70 percent of all the oil consumed in the United States, and oil provides over 90 percent of the energy used in transportation. No other fuel or energy source dominates such a large swath of our economy. And not only is oil the lifeblood of our transportation sector, oil markets can have deep and wide-reaching impacts on society as a whole. Swings in oil prices affect the national economy at multiple levels, from job growth to air travel and family trips to household budgets and the cost of shipping from online vendors.

It should be no surprise, then, that turmoil in oil markets can cause turmoil in the larger economy. Since 1972, almost every major spike in unemployment has been preceded by a major spike in oil prices. While correlation is not the same as causation, and it can be difficult to tease out how large a role these spikes in oil prices have played in causing unemployment, there is no doubt that they have at the very least contributed to and exacerbated periods of high unemployment over the last five decades. Even for the financial crisis of 2008 and the recession that followed, which have clear roots in financial and real estate markets, there is credible evidence of a link between spiking oil prices and the beginnings of the housing crisis.

What consumers likely remember most from these episodes are high prices at the gas pump, but for the economy as a whole the issue is not the level of prices themselves. Between January 1974 (the earliest date for which monthly oil prices are available) and December 2019, before the pandemic hit, the unemployment rate averaged 6.3 percent. Over those 46 years, the unemployment rate either hit or passed through that average eight times, at least once each decade. In those months of average unemployment, the average price of oil (as measured by composite acquisition costs at U.S. refiners) was about $56 per barrel in 2019 dollars, not far from the average oil price over the entire 46 years of about $54.

However, the range of oil prices during months of average unemployment is anything but average. A short ride through the data demonstrates this. Even over short periods, the relationship between oil prices (all prices in this article are inflation-adjusted to 2019 dollars) and the contemporaneous unemployment rate has varied substantially. Unemployment was 6.3 percent in February 1978 and again in January 1980, falling as low as 5.6 percent in between. But the price of oil in February 1978 was $39, where it stayed for a year before rising to $68 by January 1980, a huge increase. Oil prices then hit a peak in early 1981, and unemployment rose steadily, peaking at 10.8 percent at the end of 1982, by which time oil prices were back down to $70 and falling. The next time oil prices reached as high as $70 was mid-2005, when unemployment was a lowly 5 percent.

What’s clear from this pattern, or lack of one, is that oil prices and unemployment don’t actually have a clear relationship. The economy can perform well both when oil prices are high and when they are low. This is an important point: While no one particularly enjoys paying more for energy, policies and politicians focused on bringing oil prices down are missing the point.

The real problem with oil, from an economic standpoint, is not price but price volatility. We make decisions and investments based on current oil prices and our best guess about what they are likely to be in the future. When oil prices shoot up over a short period, many of those decisions can become uneconomical almost overnight. Since 1970, there have been five distinct periods during which the unemployment rate spiked. In each case, the jobless rate increased by 50 percent or more over a period of two years or less. Each of these spikes was preceded by an increase in oil prices of 100 percent or more within that two-year window. This has occurred when prices jumped from low to moderate levels, as when they peaked at $39 per barrel in mid-1974, and when they went from moderate to high levels, as when they rose from $61 to $150 per barrel in 2007 and 2008.
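That 50 percent/100 percent pattern is simple enough to check directly against the published series. The sketch below is a minimal illustration in Python, under stated assumptions: the function name is ours, the numbers are hypothetical placeholders, and real inputs would be Bureau of Labor Statistics unemployment rates and inflation-adjusted refiner acquisition costs from the Energy Information Administration.

```python
# Minimal sketch of the spike rule described above: a "spike" is a rise
# of at least `pct` within a trailing window of `window` months.
# All numbers below are illustrative placeholders, not official data.

def find_spikes(series, pct, window=24):
    """Return (start, end) month-index pairs where the series rises by
    at least `pct` (0.5 = +50%) within `window` months."""
    spikes = []
    for start in range(len(series)):
        for end in range(start + 1, min(start + window + 1, len(series))):
            if series[end] >= series[start] * (1 + pct):
                spikes.append((start, end))
                break  # keep only the earliest qualifying rise per start
    return spikes

# Hypothetical monthly series (illustrative only)
unemployment = [5.6, 5.8, 6.3, 7.5, 8.9, 10.8]    # percent
oil_price = [30.0, 34.0, 55.0, 62.0, 70.0, 65.0]  # 2019 dollars per barrel

print(find_spikes(unemployment, pct=0.5))  # unemployment spikes (+50%)
print(find_spikes(oil_price, pct=1.0))     # oil price spikes (+100%)
```

Run on the real series, the claim above is that every unemployment spike found this way is preceded by an oil price doubling within the prior two years.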

The focus on policies to lower oil prices largely reflects an understandable desire to control our own energy and economic fate. However, policies aimed at increasing oil production as a way to control oil prices are doomed to fail. The ability of oil price volatility to whipsaw our economy, combined with the fact that we import so much of it, has given some people the idea that the solution is to increase domestic oil production. “Drill, Baby, Drill” can be heard in most every election cycle. Policies based on that rationale have been around since long before the 2000s. In particular, the notion that increasing domestic oil production will ease the pain of oil prices has been used to support a wide array of policy proposals, often against the wishes of local communities, including increased offshore drilling, the use of fracking, and opening the Arctic National Wildlife Refuge to exploration and exploitation.

The political argument in favor of increasing domestic oil production is typically couched in terms of “energy independence.” This is a catch-all phrase that isn’t terribly well defined, most likely because it’s not particularly meaningful, at least in this context, because energy prices are determined on a world market, as supply and demand work their wonders. Promoters of energy independence seem to be concerned with economic self-determination, national security, or both. The economic argument tends to be couched in terms of oil prices: it suggests that if we only produce enough of our own, we will not be subject to high prices imposed on us by other countries. As discussed above, the actual level of oil prices is far less important to the economy than their volatility. Regardless, if producing more oil domestically would reduce our economy’s exposure to oil price spikes, then the economic stability it provides might help justify more drilling. Unfortunately, the dynamics of oil markets mean that increasing oil production will have little or no impact on either the price we pay or price volatility.

One of the fundamental features of oil markets is their global nature. Some types of oil produced in some regions are better suited for refining into particular products (gasoline vs. diesel, for example), and demand for those products differs from country to country, so certain countries are more dependent on oil from specific regions of the world. Despite this, the price of crude in essentially all of its forms is set globally. Different types of crude sell for different prices, but they move up and down together depending on worldwide market forces. As a result, there is no direct benefit to U.S. oil purchasers from having more domestic oil in the market. American producers do not give American buyers a discount: between early 2016 and mid-2018, the United States increased its crude oil output by nearly 25 percent, becoming the largest oil-producing country in the world. Over that same period, the average price paid by American oil refiners went up by over 130 percent, following the same trend as the global benchmark Brent crude, which went up by 121 percent.

The reason for this is that in a world of globally set prices, American producers act as what economists call “price takers,” which means more or less what it sounds like: American producers look at what oil is selling for (or is predicted to sell for) and decide how much oil to produce accordingly. This is not true for other major players in the oil market. Russia and OPEC, with Saudi Arabia as its lead actor, look at where oil prices are, where they would like them to be, and calculate how they might change their production to push prices in the direction they prefer. They can do this because they are large producers (Russia, Saudi Arabia, and the United States are the top three in the world, accounting for over 40 percent of production) and because the Russian and OPEC governments can essentially control how much oil their countries produce. American oil production, on the other hand, is determined by how much oil all the individual drillers decide to produce. So while Russian and OPEC production levels are set with a mix of economic and geopolitical targets in mind, American oil production is just the sum of how much each individual company chooses to produce based on its own interest.

To see why this matters, we only need to take a look at some recent history: March and April of last year. Before the COVID-19 pandemic really took hold of the global economy, oil prices were falling, and Russia and Saudi Arabia had failed to agree on production limits to prop them back up. Challenging the Saudis’ leadership, Russia decided to increase rather than decrease production, and the price slide continued. In response, Saudi Arabia and OPEC decided that they too would increase production and push prices even lower in an attempt to punish Russia economically and force Moscow back to the bargaining table.

Prices were on their way further down when the pandemic began impacting countries around the world, slowing global economic activity and cutting oil demand sharply just as supplies were rising. This combination of surging supply and shrinking demand sent prices plunging even further, with oil futures briefly turning negative. The situation was clearly untenable, and Russia and Saudi Arabia got back together over Easter weekend and agreed to steep production cuts of about 10 million barrels per day, almost a quarter of combined OPEC and Russian output. Coincidentally, this amount is almost the equivalent of all U.S. oil production in 2018.

Perhaps the most interesting part of this remarkable turn of events is the player that was largely absent from it: the largest oil-producing country in the world. Though the Trump administration worked through diplomatic channels to try to help make the deal happen, the United States didn’t actually agree to reduce oil production, for the simple reason that it can’t. The U.S. government has no control and little direct influence over the production decisions of the 10 or so major oil companies and the literally thousands of smaller ones that together produce the vast majority of American oil.

This reveals the central problem with increasing oil production to control oil prices or advance our strategic interests: no matter how much oil American companies produce, they can’t move prices to suit their own economic interests, much less national geopolitical strategies. In the best of times, this leaves our economy vulnerable to volatile swings in global oil prices that are often unpredictable and always uncontrollable. In the worst of times, it leaves us vulnerable to deliberate manipulation by other global powers that do control their domestic producers, using their oil pumping as a strategic geopolitical tool.

Improving national security through increased oil production is an equally problematic idea. The notion that we should not support unfriendly regimes by buying their oil makes some intuitive sense. Of course, the reason why we import oil (and refined oil products) is that we use more than we produce. In 2018, when the U.S. became the world’s leading producer of oil, we pumped nearly 11 million barrels of crude oil per day. That same year, the United States consumed just over 20 million barrels per day. Clearly, at these levels of production and consumption, we need to supplement our supplies with significant levels of imports. Increasing production would seem to be one way of avoiding that. However, increasing production can have little or no impact on our level of imports: Between 2016 and 2018, as domestic production rose by over 2.1 million barrels per day (a 24 percent increase), our imports of crude oil fell by just over 1 percent.

Part of the reason for this seeming contradiction is that consumption of petroleum products continues to increase. Unfortunately, apples-to-apples comparisons between consumption and production are difficult because we refine crude into products of varying densities, so comparing the number of barrels of finished products we consume for transportation with the number of barrels of crude we produce or import for all purposes is not as straightforward as we would like.

If increased production had little impact on imports and was not justified by large increases in consumption, it must be explained by the other piece of the puzzle: oil exports. Over this same time period, as U.S. crude production increased by 2.1 million barrels per day, our exports increased by almost 1.5 million barrels per day.

As with crude prices and volatility, increasing oil production does little to enhance energy independence from a security standpoint. As U.S. crude production increased dramatically in 2016-18, we shipped most of the extra output abroad and continued importing more or less as before, so the main impact on our oil trade was an increase in crude exports. With minimal change to import levels, we remained just as dependent on foreign oil suppliers as before. Oil imports from OPEC and Russia did fall over this period, by roughly 625,000 barrels per day (out of total imports of 7.8 million barrels per day), but if the objective of increased oil production is to deprive our geopolitical rivals of oil revenue, it was a failure. Our reduced purchases from OPEC and Russia account for less than 1.5 percent of their total sales, and together they were able to hold their production steady at roughly 48 million barrels a day as global prices more than doubled.

The underlying issue, once again, is that U.S. oil production comes from thousands of individual producers each setting their production levels according to what suits them best economically. OPEC and Russia, on the other hand, set production levels nationally based on a combination of factors, including both economics and geopolitics. Our production levels have little influence on OPEC and Russia’s decisions in the normal economic sense that if we produce more, we will capture more of our domestic or the global oil markets, leaving them to produce less. In fact, the opposite may actually be true: If OPEC and Russia see that increased U.S. production poses a threat to their influence on oil markets, they may actually increase production themselves in an attempt to force us out of the market by lowering prices to levels that are too low for U.S. producers to withstand.

This is exactly the kind of brinkmanship that helped set up the market turmoil this past spring, and while the main combatants in that confrontation were OPEC and Russia, the impacts on U.S. oil producers have been profound. Shale oil companies have cut output dramatically and many are filing for bankruptcy as they can’t continue operations, let alone turn a profit, at current oil prices.

Looking at national security more broadly to include our involvement in the Middle East, there is no question that maintaining secure oil supply lines is an important contributor to U.S. strategic interests in the region. Once again, however, the dependence of those interests on our consumption of Middle Eastern oil is unclear. It is also unlikely that either our support for Israel or the difficulty of its relationships with its neighbors would change significantly if we reduced our oil imports. And as described above, even if it did, increased oil production may have little or no influence on imports.

Even if the United States produced enough oil to meet all of its demand and only bought oil from domestic producers, exposure to price volatility and geopolitical influence would remain as a result of global oil pricing. Even if OPEC could not maintain discipline within its ranks or forge agreements with Russia to move prices strategically, volatile Middle Eastern politics would still create volatile oil markets that would impact our economy. Increasing oil production could only insulate us from global markets if we banned both imports and exports of oil and refined products. Only a complete severing of our economy from global oil markets, creating a separate domestic-only market, would have this effect.

Short of such a drastic, and highly unlikely, policy, the only way to reduce our dependence on global oil markets is to reduce our dependence on oil, regardless of its source. Given the deeply intertwined relationship between oil and our transportation sector, this is a tall order. The alternatives are not new, nor are they free of complications of their own. An obvious first step is to increase the efficiency of our transportation system. Though far from perfect, Corporate Average Fuel Economy standards have helped slow the growth of oil consumption dramatically. CAFE standards were first implemented in 1978. Between 1950 and 1978, gasoline consumption per dollar of GDP was fairly constant, fluctuating no more than 8 percent above or below its mean of 425,000 barrels per billion dollars of GDP. Since the introduction of CAFE standards, that figure has fallen steadily, to 177,000 in 2019. A number of factors contributed to this large reduction in the gasoline intensity of our economy, but there is no doubt that CAFE standards are a major contributor. In terms of reducing the influence of oil markets on the U.S. economy and promoting energy independence, it is likely that no government policy has contributed more than CAFE standards. Trump’s rollback of CAFE improvements is almost certain to increase our exposure to oil price volatility and the economic swings that come with it.
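For concreteness, the intensity metric in the preceding paragraph is just annual gasoline consumption divided by real GDP. Here is a minimal sketch with hypothetical inputs; the function and the rounded figures are ours for illustration, while the official series come from the Energy Information Administration and the Bureau of Economic Analysis.

```python
# Gasoline intensity: barrels of gasoline consumed per year per billion
# dollars of real GDP, the metric discussed above. The inputs below are
# rounded, illustrative figures, not official statistics.

def gasoline_intensity(barrels_per_day, real_gdp_billions):
    """Annual barrels consumed per billion dollars of real GDP."""
    return barrels_per_day * 365 / real_gdp_billions

# Roughly 1970s-scale inputs: ~5.8M bbl/day against ~$5,000B real GDP
print(round(gasoline_intensity(5_800_000, 5_000)))   # ~423,000

# Roughly 2019-scale inputs: ~9.3M bbl/day against ~$19,000B real GDP
print(round(gasoline_intensity(9_300_000, 19_000)))  # ~179,000
```

The point of the metric is that it nets out economic growth: total gasoline consumption rose over the period, but the economy grew far faster, which is the efficiency story CAFE standards helped drive.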

One way or another, all efforts to reduce our exposure to global oil markets are rooted in reducing our consumption of oil. In addition to increasing efficiency, introducing competitive alternatives to petroleum as a transportation fuel is critical. Oil-based products have the advantage of being energy-dense and a well-established part of our energy infrastructure. Electric cars are an obvious and rapidly growing alternative, though they carry challenges related to battery range and charging infrastructure, as well as battery manufacture and disposal. Hydrogen produced with renewable electricity can avoid many of the environmental issues associated with oil, and without the relatively long charge times now associated with electric vehicles, but hydrogen vehicles face challenges too, including range, infrastructure, and cost, as well as how to store hydrogen fuel both at stations and onboard vehicles. Alternative liquid fuels like ethanol, biofuels, and synthetic fuels all require some form of energy feedstock, and they face challenges relating to production costs and compatibility with our current energy infrastructure, including engine technology.

Energy independence is a vague and ill-defined notion that has perhaps inevitably yielded ill-conceived policies aimed at achieving it. By focusing on oil prices, rather than oil price volatility and our exposure to it, policies aimed at energy independence are guaranteed to miss their target. By naively targeting domestic oil production as a method to reduce global oil prices, such policies have not only chosen the wrong target but also the wrong tool. In the face of all of this, the fact that U.S. policy is largely ineffective at moving global oil prices is likely to be more of a blessing than a curse. If we could, through increased production, push global oil prices downward, we would likely increase our consumption of oil, making our economy that much more vulnerable to the inevitable price swings like those that have damaged Americans repeatedly over the past half century.

To be free of the vagaries of global oil markets, our economy must be free from oil, and that in turn can only be achieved by continuing to advance alternatives that can meet the demands and particular needs of our transportation system. Policies that focus on increasing supply can at best only distract us from that central truth. Until we develop economic and technological competitors to oil, the fundamental dynamics of global petroleum markets will remain — as will the futility of the intuitive but wrong-headed notion of drilling our way to freedom. TEF

CENTERPIECE The data show U.S. employment and the overall economy can perform well when the price of crude is high as well as when it is low. What upsets policymakers and producers is price volatility, and there are not many ways to insulate the United States from global oil shocks.

Advocating for the Future
John C. Dernbach - Widener University
Irma Russell - University of Missouri-Kansas City School of Law
Matt Bogoshian - University of California, Davis

Carbon dioxide levels in the atmosphere are higher than they have been in three million years. Human influence on the climate is so great that we are probably moving into a new geological epoch, the Anthropocene. The effects of climate change — visible only to trained observers and in computer models several decades ago — are now apparent everywhere. And the best available science tells us we need to reduce greenhouse gas emissions to net zero or below across the globe by 2050, if not earlier, to avoid the worst effects of climate disruption. We have less than thirty years.

In this context, which has no precedent in human history, should lawyers just keep behaving the way we ordinarily behave, counseling clients, drafting legal documents, and litigating cases? Is that good enough? Are we called to do better — to be better?

The severity of climate change effects means that everyone is at risk and everyone should act to combat the threat. While attorneys are not affected more than others, they have the opportunity and, we argue, the responsibility to confront this existential challenge and help achieve a better future. More than 1.3 million lawyers are licensed to practice in the United States. A fairly small share of these attorneys (about 64,000) identify environmental law as a practice area, according to the Martindale-Hubbell directory. What would happen if a larger share of the nation’s million-plus professional counselors took constructive action to combat climate disruption?

There is widespread recognition that technology developers, political leaders, educators, scientists, corporate leaders, planners, engineers, and activists have important roles to play in addressing climate change. And yet, the crucial role of lawyers is less widely recognized — even among lawyers themselves.

Ultimately, dramatic reductions in greenhouse gas emissions and systematic adaptation to climate change are not likely to occur without new and modified laws. Attorneys are needed to advocate for, draft, help implement, and counsel clients about the many laws required at the federal, state, and local levels. Legal changes are also needed in private law and governance — including supply chain contracts as well as certification, auditing, labeling, and reporting programs. Business clients and others rely on lawyers for advice on legal compliance, risk reduction, and other decisions that significantly affect the atmosphere’s carbon load. Members of the bar are also in positions of influence in their communities. Leadership by every kind of lawyer is needed — including those in private and corporate practice as well as those in non-profit organizations, academia, and government.

The obligation to combat climate change is also based on the lawyer’s responsibility for justice. The first sentence of the preamble to the American Bar Association’s Model Rules of Professional Conduct describes an attorney as “a public citizen having special responsibility for the quality of justice.” The worst effects of climate change are, and will increasingly be, experienced by those with the least resources, who are thus among those with the greatest need for legal assistance. As the climate crisis develops, the profoundly negative effects of climate disruption will increasingly challenge the stability of American democracy, the administration of justice, and the legal system that lawyers are sworn to uphold.

Long-standing rules of professional responsibility indicate that attorneys have a duty to explain climate change-related risks and opportunities to their clients. The Model Rules of Professional Conduct state that lawyers “shall exercise independent professional judgment and render candid advice. In rendering advice, a lawyer may refer not only to law but to other considerations such as moral, economic, social, and political factors that may be relevant to the client’s situation.” All states have adopted this rule. When attorneys conclude that climate change-related risks and opportunities exist, they have a duty to educate their clients about them.

Even though many rightly critique the legal profession as being far too slow in enlisting its members to lead in combating the climate crisis, there are some promising recent signs of progress. In 2019, the ABA’s House of Delegates — its chief policymaking body — adopted a resolution urging lawyers “to advise their clients of the risks and opportunities that climate change provides.” The resolution also urges “lawyers to engage in pro bono activities to aid efforts to reduce greenhouse gas emissions and adapt to climate change.”

The International Bar Association — the global counterpart to the ABA — adopted a “Statement on the Climate Crisis” in 2020 that builds on the ABA resolution. The IBA statement “urges lawyers, acting in accordance with their professional conduct rules and the rule of law, to consider . . . taking a climate conscious approach to problems encountered in daily legal practice.” This includes “advising clients of the potential risks, liability, and reputational damage arising from activity that negatively contributes to the climate crisis,” as well as acting “on a pro bono, volunteer or reduced fee basis, for those negatively affected by the climate crisis.”

The IBA statement also “urges lawyers, as influential figures and thought leaders within society, to live responsibly in the face of the climate crisis” by reducing “their environmental footprint” in “everyday actions” and by “supporting positive changes in the workplace, including adoption of more sustainable practices, such as greater reliance on electronic file storage facilities and digital technologies, more energy efficient offices, and more climate-friendly travel and procurement choices.”

Importantly, the ABA resolution and the IBA statement support changes in law. The ABA urges all levels of government as well as the private sector “to recognize their obligation to address climate change and take action” to “reduce U.S. greenhouse gas emissions to net zero or below as soon as possible, consistent with the latest peer-reviewed science.” The IBA statement “encourages lawyers to engage with current and future legislative and policymaking efforts to address the climate crisis.”

Likewise, many of the mainstream businesses that attorneys represent are recognizing these realities and urging a better response. For example, in 2019 the Business Roundtable formally recognized that the purpose of corporations includes “embracing sustainable practices across our businesses” and that customers, employees, suppliers, and communities are to be considered essential stakeholders. In 2020, it issued a statement urging deep cuts in U.S. greenhouse gas emissions.

All of this suggests that professional and business norms are moving toward recognition that lawyers have a responsibility to combat climate disruption. This growing momentum comes from a variety of sources, including client interest in lawyer advisors who have knowledge and capability relating to climate change and sustainability. In addition, market forces are driving down the costs of clean energy and increasing the need for legal help with clean energy projects. Investors are increasing pressure on corporations to disclose the accelerating risks of climate change, and the public is demanding less-harmful consumer products. The election of Joe Biden and Kamala Harris, who campaigned on the most ambitious plan to address climate change yet put forth by any winning presidential ticket, adds to this momentum.

Law students and young lawyers, who want to make a contribution and to work in law offices or other organizations that value their views and aspirations, are an especially important force for recognition of an attorney’s responsibility. In October, an organization created at Yale, Law Students for Climate Accountability, published a first-of-its-kind scorecard on the role of 100 major U.S. law firms in climate change. It analyzed their litigation, transactional, and lobbying work, and assigned each firm a grade from A to F. These 100 firms, the report concluded, worked on 10 times more cases making climate change worse than they did on cases making it better. Only four firms received an A, while 67 received a D or F.

What, then, can attorneys do more of, or do differently, to combat climate change? The following suggestions are addressed to lawyers as professionals, citizens, family members, and members of various communities. These suggestions are illustrative, not exhaustive. They are intended to be both provocative and constructive.

To start, as part of their duty to clients and society, lawyers should work for better governance. It is impossible to address climate disruption and sustainable development unless we govern effectively at the federal, state, local, and international levels as well as in the private sector. Yet the level of partisan rancor and disagreement, particularly at the federal level and increasingly at the state level, makes it harder to address any significant issue effectively.

Better governance starts with adherence to the rule of law as well as the norms and guardrails that keep the machinery of government, commerce, and our legal system running. These norms include mutual respect, a willingness to seriously consider the views of others, respect for facts and science, civil discourse, and a commitment to the common good.

Lawyers should work toward better modeling of norms like these — in both public and private conversations, directly and explicitly emphasizing the importance of adhering to the rule of law and fact-based decisionmaking. They should maintain civil discourse and identify and even challenge those who reject or undermine foundational norms.

In addition to practicing and modeling civility, attorneys can embrace sustainable development to think more fully and clearly about legal options for the best course of action. Sustainable development is a lens or framework that lawyers can use to address nearly any issue, including climate disruption. The lens enables an understanding of the varied and cumulative risks and benefits inherent in a course of action that a client proposes. Rather than limiting the analysis to surface economic factors and clear legal risks, this approach goes deeper to help clients avoid costs and realize benefits, including opportunities to improve quality of life and combat the climate crisis. Lawyers are thus well advised to use the imperatives and facts behind sustainable development to educate their clients and organizational superiors.

When lawyers use sustainable development as a lens or framework, they see a wider range of legal tools to address a particular issue, including but not limited to environmental law. Because environmental law is necessarily and primarily regulatory, it tends to focus on what a client can or cannot do — including what permits are required and what enforcement options are available. Ironically, many have come to believe that environmental regulation unnecessarily limits freedom. To the contrary, by protecting us from health risks and other negative effects of pollution, environmental law helps secure the freedom of every citizen to live a long and fruitful life. The wider lens incorporates environmental law and adds to it, putting more options and tools on the table — which is essential if we are to effectively address climate change. Sustainable development can redirect law to foster economic development along a new conceptual framework that treats development, equity, and the environment as mutually reinforcing rather than oppositional, supplementing and transforming traditional regulation.

How do lawyers acquire the skill, expertise, and knowledge to communicate sustainability choices to clients? A starting point is understanding what their clients do, what they want, and what they need. Counselors who do this work listen carefully to clients and learn the essentials of their clients’ business and long-term goals and interests. They use their research and analytical skills to identify sustainability practices and principles that serve client needs and interests, flag negative effects as well as potential opportunities, and articulate legal options that incorporate sustainability principles. As time passes, and lawyers accumulate experience on these matters, they often find that new clients seek them out.

Lawyers can exercise such “thought leadership” without having decades of experience. Leadership, says James Strock, the first secretary of California’s Environmental Protection Agency, is about “inspir[ing] others to alter their thoughts and actions, in alignment with an empowering vision.” Many law students and younger lawyers aspire to make a difference in some leadership capacity. In the fall of 2020, more than 300 students signed up for a class called “Lawyers as Leaders” at Georgetown University Law Center — the most popular class in the school’s 150-year history.

Attorneys must seriously consider working for, and supporting, legal efforts to reduce greenhouse gas emissions and adapt to climate change in their personal and professional capacities. One opportunity grows out of the comprehensive description and analysis of some 1,500 recommended legal tools to reduce greenhouse gas emissions in Legal Pathways to Deep Decarbonization in the United States, which was published by ELI Press in 2019, with Michael B. Gerrard and John C. Dernbach as editors. Federal, state, and local governments, and the private sector, need to adopt and implement such tools if we are to reduce greenhouse gas emissions to net zero or below by 2050.

The next stage of this project is drafting these recommendations as model laws that can be proposed, adopted, and implemented in various jurisdictions. More than twenty law firms as well as individual attorneys, law professors, and others are participating in this project on a pro bono basis. These model laws, in turn, are posted (with other resources) on a website created for that purpose, https://lpdd.org.

A striking aspect of this project is the participation of lawyers at all stages of their careers. Many retired attorneys, including a judge and a recording industry lawyer, help manage the project; some had not previously worked in environmental or energy law. First-year associates at law firms also participate, helping to draft model laws pro bono. The project is an excellent example of an opportunity for lawyers to establish their careers along the path of thought leadership in addressing climate change. At some law schools, including Widener Commonwealth, Denver, and Vermont, students can take classes in which they draft model climate change and sustainability laws for local and state governments.

Beyond drafting model laws, lawyers can make a difference by getting these or other proposed laws adopted. The Legal Pathways project is active in this as well. This work involves outreach to interested organizations, individuals, legislators, legislative staff, and media. And while Congress is obviously important, so are state legislatures and local governments. In fact, local government is a particularly important avenue for combating climate change and enhancing sustainability because local governments deal with the increasing number of climate-change-induced severe weather events; the need to repurpose commercial property as more people work and shop from home; and growing demand for walkable downtowns, mixed-use neighborhoods, and charging infrastructure for electric vehicles.

An important and sensitive issue is whether, when, and how lawyers should make the case for such actions in moral or ethical terms. To be sure, there are risks of an initial adverse reaction; clients and organizational superiors may have different views. Still, attorneys who listen carefully to their clients or superiors, and who develop good and trusting relationships with them, can carry out their duty to advise by identifying and connecting with specific values to help them maximize enlightened self-interest. Counselors who limit their arguments to law or include only economic, technological, or scientific analysis ignore the persuasive power of widely held ethical and moral norms. Principles such as intergenerational equity that are at the core of sustainable development have dramatic emotional and intellectual power when taken seriously. Pope Francis explains intergenerational equity in a way that resonates within the Catholic religion as well as other faiths and philosophical traditions:

“Once we start to think about the kind of world we are leaving to future generations, we look at things differently,” the pontiff says. “We realize that the world is a gift which we have freely received and must share with others. Since the world has been given to us, we can no longer view reality in a purely utilitarian way, in which efficiency and productivity are entirely geared to our individual benefit.”

Other widely held values include the injunction to do no harm. Professor Victor Flatt, who teaches at the University of Houston Law Center, summarizes the moral norm underlying environmental law in this way: “no person should be allowed to harm another person for profit or benefit.” Additional powerful norms — many held by some traditional conservatives — include national security, conservation, prudent stewardship, accountability for the consequences of one’s actions, and the precautionary approach. Separately and together they can also help pave the way for political consensus.

Lawyers should consider building such principles into their explanation of various climate change options to clients, and in their advocacy on behalf of clients as well as to their organizational superiors. Again, context and audience matter, and sensitivity and mutual trust are required. Ultimately, clients have the final word, but many care about these principles. (For example, they care deeply about their own children.) Decisions that reduce greenhouse gas emissions and foster sustainability are more likely when decisionmakers understand the relationship between their choices and the principles they hold close.

When members of the bar engage politically, they should include people with different perspectives, building alliances based on a shared understanding of the common good. Sustainable development should be capable of bridging the partisan divide, and there is evidence that it is already doing so, even where the term is not used. Because sustainable development seeks to further environmental protection, equity, and economic development at the same time, it requires the active and constructive participation of business — something that many in the business community recognize and appreciate. In fact, much environmental lawmaking over the past several decades has been directed at environmentally sustainable economic development. Statutes that require or encourage increased use of clean energy can be understood as climate change laws, but they can also be understood as economic development laws. These laws not only foster business growth, they create well-paying jobs.

Today, lawyers must also “walk the talk” on reduced greenhouse gas emissions and sustainable development. It is one thing to persuade clients and others to reduce their greenhouse gas emissions, and yet it is quite another to do it in one’s own office and at home. Such actions have both persuasive power and reputational benefit.

Attorneys, law firms, businesses, and organizations of all types should consider participating in one or more of the many efforts for walking the talk that already exist. For example, the Law Firm Sustainability Network, a nonprofit organization made up of firms as well as legal departments of major corporations, has launched the American Legal Industry Sustainability Standards, “a self-assessment tool that measures law firms’ implementation of environmentally friendly practices that promote energy efficiency, conservation of energy and resources, recycling, and related measures.” The organization also fosters information sharing on best practices.

The ABA Section on Environment, Energy, and Resources has partnered with LFSN to foster awareness of the standards and programs. The California Lawyers Association offers a set of model law office sustainability guidelines for reduced paper and energy use as well as purchase of more sustainable products and services. In 2018, Lawyers for a Sustainable Economy, a partnership among 14 large private firms, Stanford Law School, and Stanford’s Precourt Institute for Energy, “committed to delivering $23 million worth of free legal services by the end of 2020 to advance sustainability in energy, transportation, and land use.” For lawyers, walking the talk also means making similar efforts at home and through other organizations, and considering climate change and sustainable development in supporting political candidates and community initiatives.

Bar associations can do more to advance this work through resolutions, educational programming, organizing, and advocacy — at the international, national, state, and local levels. In 2009, the Oregon State Bar Association created the Sustainable Future Section — “the first state bar association section devoted to the relationship between sustainability and law.” Organizing and advocacy to accelerate the transition to a decarbonized and sustainable world would be particularly valuable leadership contributions.

Finally, lawyers should spread the word and organize others in ways appropriate to their circumstances. The challenges of climate change and sustainability are considerable, and so are the opportunities. But there is a finite and rapidly closing window for effectively addressing these issues. Like all professionals and all people, attorneys have a responsibility to preserve our planet and our quality of life. In our specific professional roles and, more broadly, as lawyers, we have a special responsibility for the quality of justice and the public good. We need to talk about these issues and how to address them, and encourage other lawyers of all races and backgrounds to participate. We also need to share legal tools and approaches that work, and support each other in doing so. It is not enough to do these things in our work or at home; we should step up our engagement with others.

On December 24, 1968, in lunar orbit, Apollo 8 astronaut William Anders photographed the Earth from roughly 240,000 miles away. The picture, dubbed “Earthrise,” shows the gray, lifeless Moon in the foreground with the Earth above and behind it — blue and green and alive and surrounded by dark space. More than fifty years later, Earth is still the only place we know where life exists. Notwithstanding remarkable success in implementing environmental laws, we face an existential threat from climate disruption. What will we as lawyers do, or do more of, or do differently, in response? TEF

LEAD FEATURE Attorneys in our varied roles need to step up and address the climate crisis for the sake of every person and for the public good. All lawyers must be sustainability lawyers now.

Modernizing Regulatory Review for Energy, Environmental Policy
Author
Joseph E. Aldy - Harvard Kennedy School
Harvard Kennedy School
Current Issue
Issue
2
Joseph E. Aldy

President Joseph Biden tasked all executive branch agencies with developing recommendations for modernizing regulatory review in one of his very first actions in office. Regulatory review has included estimates of benefits and costs dating back to the Reagan administration. The current framework primarily reflects President Clinton’s 1993 executive order and the Office of Management and Budget’s 2003 guidance on regulatory review.

Prior to the regulatory interruption of the Trump administration, energy and environmental regulations represented more than 80 percent of the benefits and about two-thirds of the costs of all significant federal rules. Efforts to modernize review must therefore account for the implications for energy and environmental policy. Recent advances in economic data, methods, and research can inform such efforts and ensure an evidence-based foundation for future regulatory evaluation.

Since the intent of review is to improve information about regulatory proposals, a value-of-information lens can guide modernization. The government could prioritize those changes to regulatory review that would yield information whose benefits justify the costs of producing it — the information that policymakers and the public would find most useful in understanding the impacts of regulatory proposals. Let me offer three illustrations.

First, quantifying the economic benefits and costs of regulations could inform not only a determination of whether the benefits justify the costs, but also an assessment of the distributional impacts of rules. The increasing interest in environmental and energy justice highlights the value of understanding who benefits from and who bears the costs of proposed regulations.

The emergence of big data and of tools for evaluating such information has dramatically lowered the cost of undertaking distributional analyses. Large datasets on individual behavior and economic activity are now available, with sources including smartphones, IRS tax data, and the JPMorgan Chase Institute. Other sources include administrative records on health expenditures and outcomes, such as Medicare; air pollutant concentrations, such as census-tract-level measures from satellites; and plant-level data, such as the Annual Survey of Manufactures. Government agencies now have the opportunity to characterize the distributional impacts of environmental and energy regulations across socio-demographic categories, industries, and regions.
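As a hedged illustration of what such an analysis involves, the sketch below tabulates a rule’s net impacts by income quintile and region. Every column name and dollar figure is invented, standing in for the kinds of household-level estimates the data sources above could support.

```python
# A minimal sketch of a distributional tabulation. All values are
# hypothetical; real analyses would build household-level estimates
# from sources like tax records, health claims, and satellite data.
import pandas as pd

households = pd.DataFrame({
    "income_quintile": [1, 1, 2, 3, 4, 5, 5, 3],
    "region": ["South", "West", "South", "Midwest",
               "Northeast", "West", "South", "Midwest"],
    "benefits_usd": [420, 380, 310, 260, 200, 150, 140, 270],
    "costs_usd": [120, 110, 130, 150, 170, 210, 220, 140],
})
households["net_usd"] = households["benefits_usd"] - households["costs_usd"]

# Average net impact by income quintile and by region: who gains, who pays.
print(households.groupby("income_quintile")["net_usd"].mean())
print(households.groupby("region")["net_usd"].mean())
```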

Second, accounting for the timing of benefits and costs of regulations requires the application of a discount rate. These annual percentages are intended to reflect the preferences individuals reveal concerning the value of consumption today versus consumption in the future (3 percent), or the return an individual would expect for undertaking an investment (7 percent). These rates were set in 2003 under the George W. Bush administration, and recent empirical work indicates that the 3 and 7 percent values are too high given the investment returns and consumption choices individuals now reveal; a lower discount rate would be appropriate.

Moreover, uncertainty about long-term discount rates — for example, over a century or more, as is relevant in the context of climate change — would be better represented by a lower discount rate or by a rate that declines over time. As evident in previous work on the social cost of carbon, reducing the discount rate from 3 percent to 2.5 percent would increase the monetized benefits of reducing carbon dioxide emissions by about 50 percent.
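A back-of-the-envelope calculation illustrates the mechanics. The sketch below discounts a purely hypothetical damage stream (one dollar of annual damage, growing 2 percent per year over 200 years); actual social-cost-of-carbon figures come from integrated assessment models, not from a toy example like this.

```python
# Why a half-point change in the discount rate moves long-horizon climate
# benefits so much. The damage stream is hypothetical: $1 of annual damage,
# growing 2 percent per year for 200 years.

def present_value(rate, growth=0.02, years=200):
    """Present value of a growing stream of annual damages."""
    return sum((1 + growth) ** t / (1 + rate) ** t for t in range(years))

pv_3_0 = present_value(0.030)
pv_2_5 = present_value(0.025)
print(f"PV at 3.0 percent: {pv_3_0:.0f}")   # ~88
print(f"PV at 2.5 percent: {pv_2_5:.0f}")   # ~128
print(f"Increase: {100 * (pv_2_5 / pv_3_0 - 1):.0f} percent")  # ~45
```

With these assumed inputs, the half-point reduction raises the present value of the damage stream by roughly 45 percent, the same order of magnitude as the 50 percent figure cited above.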

Third, the evaluation of regulatory performance could significantly improve our understanding of the impacts of rules in practice and provide the evidence to inform public debates about the need for future regulatory actions. Over the past two decades, a rich literature has evolved that rigorously estimates the causal impacts of energy and environmental regulations, as opposed to mere statistical associations. Insights about how to approach ex post review, including the statistical framework and data needs, can be drawn from that research to inform planning for regulatory performance evaluations. Designing and integrating such ex post reviews at the regulatory development stage — and subjecting evaluation plans to both public comment and interagency review — can ensure that policymakers and the public will learn of the impacts of rules, including their net social benefits and the distribution of their impacts, under full implementation.
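To make the flavor of that literature concrete, here is a minimal sketch of one common design, a difference-in-differences comparison; every number in it is invented for illustration.

```python
# A stylized difference-in-differences estimate. Regulated ("treated")
# plants are compared with similar unregulated ("control") plants before
# and after a rule takes effect. All figures are invented.

# Mean annual emissions (tons) by group and period.
treated_before, treated_after = 100.0, 70.0
control_before, control_after = 100.0, 90.0

# The control group's change stands in for what would have happened to
# the treated plants absent the rule (the parallel-trends assumption).
effect = (treated_after - treated_before) - (control_after - control_before)
print(f"Estimated causal effect of the rule: {effect:+.0f} tons per plant")  # -20
```

The same logic, scaled up to plant-level panel data and regression form, underlies many of the causal estimates in the literature noted above.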

Evaluating and making public the economic impacts of energy and environmental rules — their net social benefits and the distribution of benefits and costs — can inform policymakers and build trust among the public in policymakers’ decisions. Such transparency can mitigate the influence of special interests on regulatory decision-making, while demonstrating how the government’s actions represent sound investments on behalf of the public.

To “Build Back Better,” Biden Must Undo Trump’s Obstruction Legacy
Author
David P. Clarke
Current Issue
Issue
2
David P. Clarke

Given his stated commitment to an aggressive climate agenda as a top priority, it came as no surprise that on his first day in office President Joe Biden signed an executive order to rejoin the Paris Agreement on climate change.

But now the hard work begins: rebuilding credibility and crafting what experts at a Columbia University webinar said must be a realistic plan to decarbonize the U.S. economy by 2050 to avert dangerous atmospheric warming.

During the January 19 online event, Tufts University professor Kelly Sims Gallagher said the Biden administration’s first step in preparing to participate in the upcoming November meeting of the climate convention parties in Glasgow must be the adoption of a robust domestic climate policy. Under Trump, the United States failed to double its investment in clean energy research and development, didn’t meet its pledge to the Green Climate Fund, and isn’t on track to achieve its 2025 climate targets, she said.

The new administration will also have to review the legacy of Trump’s regulatory rollbacks, which included weakening more than 125 environmental rules and erecting barriers to further actions. Biden’s administration must advance its environmental agenda while undoing much, if not all, of Trump’s deregulatory broadside.

Jump-starting the process, Biden chief of staff Ronald Klain in a January 20 memorandum to heads of federal agencies called for a “regulatory freeze pending review.” Biden’s appointees must have an opportunity to review and approve new and pending rules to determine whether they raise “substantial questions of fact, law, or policy.” The outcome of that review should be well worth watching.

If agencies want it, there’s no shortage of advice. The Wilderness Society highlighted Trump’s auctioning off the Arctic National Wildlife Refuge to drillers, though lease sales drew only three bidders. But in Trump’s closing months oil companies also applied for over 3,000 drilling permits on western public lands, according to the Bureau of Land Management.

Two environmental groups in a “First Things to Fix” report listed five immediate priorities, including rejoining Paris, restoring stricter automobile emissions standards, and undoing far-reaching limitations on National Environmental Policy Act environmental impact assessments — and the list goes on.

Although every administration adopts 11th hour rules to cement its agenda, Trump has “ushered in an unusually large number of energy and environmental policies,” according to a Washington Post analysis, including more than two dozen since losing the election in November.

Among Trump’s closing actions, EPA’s January 6 Strengthening Transparency in Regulatory Science Rule qualifies as particularly egregious, an official with an environmental group said privately. It applies to “the majority of what EPA does” — water, climate change, toxic chemicals, and other protections.

Describing the regulation as an internal “housekeeping rule,” EPA leaders made it effective immediately. Small wonder that the Environmental Defense Fund and other plaintiffs on January 11 filed a lawsuit asking the court to declare EPA’s rule unlawful and enjoin its enforcement until at least 30 days after Federal Register publication.

The rule’s “entire purpose,” the plaintiffs argue, is to constrain EPA’s “discretion to consider scientific research” when underlying data are not publicly available. Yet data underlying human studies that are critical for developing health standards are barred for legal and ethical reasons from public disclosure. The rule is a “substantive restriction,” they contend, not mere “housekeeping.”

EPA’s own Science Advisory Board, in its final comment letter, recognized the rule’s significance but said that in some cases it could “reduce scientific integrity.” On February 1, a U.S. district court accepted a Biden administration request to vacate the regulation.

And it’s not just 11th-hour rules Biden’s team will have to review. In January 2020 Trump’s DOE published a “process rule” that Biden “will have to unwind” because it hampers upgrading energy efficiency standards for dishwashers, refrigerators, and other domestic and commercial appliances and building equipment, says Joanna Mauer, a senior researcher with the American Council for an Energy-Efficient Economy. As a result of the Trump DOE’s failure to meet more than 20 deadlines for updating efficiency standards, years of carbon emission reductions have been foregone, and economically justifying upgrades is now slower and more complex.

As Trump left the White House to the sound of the military band he had demanded, it was clear that, at least to some degree, the new president would be deflected from building his future-facing agenda by the need to undo much of Trump’s backward-moving deregulatory measures.

An Insightful History, 40 Valuable Prescriptions
Author
G. Tracy Mehan III - American Water Works Association
American Water Works Association
Current Issue
Issue
2

No one will miss 2020, but two books deserve mention as we bid farewell to that annus horribilis. The Yale Environmental Dialogue, under the leadership of professor Daniel C. Esty, pulled together a cavalcade of experts from every discipline and field imaginable, and from varied political perspectives, to produce a comprehensive collection of essays on every conceivable topic relating to sustainability — ecology, environmental justice, Big Data, public health, land protection, agriculture, economics, urban policy and, very prominently, climate change, all with an emphasis on actionable recommendations.

Edited by Professor Esty, A Better Planet: 40 Big Ideas for a Sustainable Future features contributions by such luminaries as Nobel Prize-winning economist William Nordhaus; Jane Lubchenco, former director of NOAA; Thomas Lovejoy, the “father of biodiversity”; and Susan Biniaz, the former lead climate lawyer for the State Department, who helped negotiate the Paris Agreement. Your reviewer was honored to contribute an essay on water reuse (“Found Water: Reuse and the Deconstruction of ‘Wastewater’”).

Indy Burke, Dean of Yale’s School of Forestry & Environmental Studies, has described the urgent need for seeking common ground amidst current “political division and deep disagreements over core principles” in order to meet contemporary environmental challenges. Says Dean Burke, “We have to do the hard work of bridging these divides.” That is the rationale behind Esty’s Yale Environmental Dialogue and the publication of A Better Planet.

While the Yale project is exciting, forward-looking, and innovative, William and Rosemarie Alley seek to document the historic successes of and current challenges to EPA, the world’s premier environmental agency. In The War on the EPA: America’s Endangered Environmental Protections, the authors write that “in point of fact, never in the EPA’s history has there been a time when anything was simple.” In other words, EPA has always been engulfed in controversy and, given the nature of its role as regulator, caught in a perpetual cross-fire between environmentalists and various regulated sectors in an endless round of regulation, litigation, re-regulation, and legislative interventions. “Virtually everything that the EPA has accomplished has come out of the crucible of intense controversy,” observe the authors. “Even in the best of times, it’s remarkable that anything gets done.”

The Alleys have written a well-researched, articulate, and wide-ranging survey of environmental issues spanning the entire history of the agency. They combine William’s scientific expertise (he was chief of the Office of Groundwater for the U.S. Geological Survey) with Rosemarie’s professional writing skills to offer the reader a very fine and fluid narrative through technically and legally dense subject matter. It would be great supplemental material for an environmental policy or law course. Lawyers looking for a broader perspective, beyond their specialty, and a brief history of environmental regulations and the battles over same, would also benefit.

The Alleys manage to say something interesting on a long list of topics: wastewater and drinking water issues, Superfund and the Resource Conservation and Recovery Act, clean air issues and New Source Review, Waters of the United States, unregulated agricultural nonpoint-source pollution (“a wicked problem”), toxic chemicals, the Clean Power Plan, “secret science,” you name it. Their overview of the complex interaction between geology, groundwater, and toxic chemicals in the environment — along with a succinct description of the evolution from endless “pump and treat” remediation of contaminated groundwater to more effective bioremediation and in situ treatment — is informed yet intelligible to the non-specialist reader.

It is no criticism to say that they have also written a polemic targeting, in order, President Trump, his former EPA administrators Scott Pruitt and Andrew Wheeler, and several Republican presidents and members of Congress, including but not limited to Ronald Reagan, Newt Gingrich, and George W. Bush. They do not entertain substantive counter-arguments to EPA’s positions or take them seriously. There are good guys and bad guys, period. Still, one may not agree with the polemic but recognize its power and efficacy in making a policy or political point. Cicero and Augustine would approve. But the reader should be forewarned.

The authors lament that science, EPA, and environmental regulation are embroiled in controversy and even disfavored in many quarters. They note the budgetary pressures the agency has experienced over many years, especially the lack of support in the Trump administration. (Congress declined the more extreme cuts.) They want to re-invigorate EPA’s regulatory agenda and deal with a variety of issues: toxic chemicals, “forever chemicals,” climate change, and a moribund Superfund program.

“The long arduous course of scientific study requires considerable time and patience. . . . For Superfund (and other EPA programs) to be effective, the agency not only needs good scientists and lawyers, but also good communicators, listeners, and decisionmakers with high ethical standards,” claim the Alleys. “To accomplish all this, the bottom line is that EPA needs adequate funding and a favorable work environment to attract a capable and committed work force.”

The authors of The War on the EPA describe a daunting set of circumstances having as much to do with the American public’s current skepticism about the federal government as with the agency itself. According to the Pew Research Center, “During the . . . George W. Bush administration and the presidencies of Barack Obama and Donald Trump, the share of Americans who say they trust the [federal] government just about always or most of the time has been below 30 percent. Today, 20 percent say they trust the government.” Additionally, “While the share of Republicans who trust the government has increased during Trump’s time as president, only 28 percent say they trust the government, compared with 12 percent of Democrats.”

Different constituencies distrust the feds for different reasons. But this is in stark contrast to 1958, the first year Pew inquired on the matter, when 73 percent expressed trust in the federal government. This is a fundamental shift in the American psyche, and EPA is collateral damage.

This distrust is aggravated by political polarization. As reported by Max Rust and Randy Yeip in The Wall Street Journal (“How Politics Has Pulled the Country in Different Directions,” November 10, 2020), “If it feels like Republicans and Democrats are living in different worlds, it’s because they are.” Rust and Yeip say, “There are few places left in America where one tribe of voters is likely to encounter the other.”

What, if anything, can be done about this state of affairs, at least as it relates to environmental policy? Returning to A Better Planet, Daniel Esty’s essay “Red Lights to Green Lights: Toward an Innovation-Oriented Sustainability Strategy” may be helpful. While recognizing the undoubted success of the command-and-control regulatory strategies of the 1970s and 1980s — red lights for polluters — that “framework has proven to be incomplete. It has failed to offer signals as to what society needs businesses to do, including what problems to solve, what research and development to undertake, and what investments to make.”

Moreover, the original paradigm came at a price. It was slow and inefficient “insofar as the government does almost all of the environmental work.” Indeed, “This over-reliance on government as the central (and often sole) actor also leads to high costs, avoidable inefficiencies, constant litigation over standards, and disincentives for innovation,” argues Esty.

The old approach did not spur transformative change or engage the business community and financial markets as problem solvers. The red-light model does not drive entrepreneurial zeal.

What is needed to address contemporary challenges is “a systematically designed structure of incentives to encourage innovation and problem solving. In short, we need to complement our system of red lights with an expanded set of green lights,” writes Esty. This entails adoption of the polluter-pays principle and the “end to externalities,” i.e., “those who inflict environmental harms on society must pay for them.” Polluters need to be charged for their emissions or other negative impacts. Such “harm charges” would send price signals for the need to remake products or production processes.

Just as those generating negative externalities should pay, those generating positive externalities, or benefits to society, should be compensated, e.g., private landowners whose property provides habitat for endangered species.

Esty’s idea is not new, but it needs to be recalled and taken to heart by policymakers.

With a new administration taking over the executive branch, will Congress be able to come to grips with an environmental statutory regime almost a half century old and provide EPA and other agencies with the tools they need to turn red lights into green? We hope for the best.

Preparing for Climate Disasters
Author
David J. Hayes - NYU School of Law
Jessica Grannis - National Audubon Society
Sarah Greenberger - National Audubon Society
NYU School of Law
National Audubon Society
National Audubon Society
Issue
1

The United States knows how to respond to disasters. When calamity strikes, the Federal Emergency Management Agency swings into action, opens the national coffers, and leans on state governments to deliver relief supplies to affected communities. The execution occasionally falters, but the playbook is familiar and, by and large, sound.

The dramatic increase in the frequency and severity of natural disasters, however, requires writing a new chapter in the playbook. Climate change has converted what were formerly 100-year and 500-year storms and floods into common events, triggering fiscally irresponsible repeat spending on disaster after disaster. The United States is beginning to acknowledge this new reality and chart a path toward more deliberate preparation for climate events by engaging in pre-disaster planning and investing in resilient infrastructure that can absorb big hits — saving money, life, and limb in the process.

Congress has recognized that fiscal prudence demands this result. With studies showing that every dollar spent on hazard mitigation saves six dollars in future disaster costs, Congress has quietly been accompanying billion-dollar post-disaster relief appropriations with more limited (but not insignificant) pre-disaster mitigation funding. Then, in a breakthrough, Congress in 2018 enacted the Disaster Recovery Reform Act, which anticipates that FEMA will set aside up to six percent of all money appropriated for disaster relief to support investments in pre-disaster planning and infrastructure. This new program, called Building Resilient Infrastructure and Communities, appropriately known as BRIC, is just gearing up now. When fully launched, the program will direct billions of dollars toward making our communities more resilient and more capable of responding to disastrous storms and floods.

While this new program is necessary and appropriate, the United States is not set up to maximize its beneficial impact. Unless it provides much more significant planning and execution assistance than is now available, FEMA could transfer billions in ill-conceived hazard mitigation grants to states, tribes, local communities, and territories, representing a massive lost opportunity.

The incoming Biden administration must not allow this to happen. We know what can go wrong. FEMA and other first responders who are expert in implementing post-disaster efforts are not as skilled in identifying and evaluating long-term resilience solutions. Too often, pre-disaster hazard mitigation money flows to familiar, off-the-shelf engineered projects — whether they represent the best long-term solutions or not. Traditional economic tools ignore or undercount ecosystem benefits, community preferences, and other less easily monetized benefits. And frontline communities that face disproportionate risks from disaster events may not even be consulted and, if they are, have limited resources or capacity to participate in the decisionmaking process.

Unless addressed, these shortcomings will haunt FEMA’s new program. Without an organized repository of information on existing threats and emerging best practices, a regularized structure that solicits and evaluates promising investment alternatives, and systematic follow-up that tests and records whether resilience projects provide promised benefits, billion-dollar mistakes will continue to be made.

For vulnerable coastal areas, these missing programmatic elements mean that gray infrastructure projects like sea walls and other armoring techniques will continue to scarf up a disproportionate share of resilience spending. FEMA already has demonstrated its propensity to be stuck in the do-loop of rebuilding communities in flood zones again and again — particularly after the Trump administration rescinded the Federal Flood Risk Management Standards that would have required communities to consider future flood risk when rebuilding.

This article reviews these issues and offers recommendations for how to maximize the effective disbursement of the billions that FEMA and other federal agencies will be spending to improve the resilience of vulnerable coastal resources and infrastructure. We focus on investments in coastal infrastructure because we know that under its new BRIC program, FEMA will be providing large pre-disaster mitigation grants to states that are being hammered by sea-level rise and climate-infused mega storms. Also, in the aftermath of recent hurricanes and the Deepwater Horizon oil spill, a large number of coastal resilience projects already are underway along the Eastern Seaboard and in the Gulf of Mexico, providing a rich source of new data and experiential learning that can and should inform the future direction of FEMA’s new grant program and other pre-disaster resilience spending.

After Hurricane Sandy, for example, the Department of Housing and Urban Development led the innovative National Disaster Resilience and Rebuild by Design competitions, while the Department of the Interior and Fish and Wildlife Service invested more than $300 million in coastal resilience projects across the Northeast and Mid-Atlantic. Likewise, after Hurricane Harvey, Congress provided over $28 billion to HUD to support recovery in lower-income communities, and set aside $12 billion to help these neighborhoods mitigate risks from future disasters. The National Oceanic and Atmospheric Administration has ramped up its National Oceans and Coastal Security Fund to invest in myriad restoration projects following other hurricane disasters. And the Deepwater Horizon oil spill disaster spawned settlements that have allocated a jaw-dropping $16 billion toward coastal and ocean-related environmental and economic restoration activities in the gulf.

Although our focus is on coastal investments, the principles identified in our article have equal bearing in other disaster-prone contexts, including wildfires, inland flooding, tornados, and other extreme weather events made more common by climate change.

The year 2020 was a record-breaking period of natural disasters, closing out two decades in which the United States saw hundreds of billions of dollars in economic losses and thousands of fatalities. Warmer coastal waters and rising seas are increasing the intensity of weather events, driving storm surges further inland, and unleashing biblical-scale rain events in communities like Houston. And these disasters are disproportionately affecting low-income neighborhoods and communities of color. Economic inequality, lack of investment, and proximity to pollution all exacerbate threats from climate-related disasters, which was demonstrated this year with deadly consequence as a triad of storms successively hit Southwest Louisiana — a region that is already overburdened with pollution due to its prevalence of heavy industry. Scientists predict that more devastation is in store as the planet continues to warm, estimating that without action the United States stands to lose nine percent of its GDP by 2060 — equal to the economic losses recorded during the COVID-19 pandemic, but repeated year after year.

These statistics expose the need for new approaches to how communities recover from and rebuild after disasters. We know that because of the impetus to quickly “get things back to normal,” federal funders and grantees make numerous mistakes, including often turning to outdated, environmentally harmful approaches to protect coastal communities and resources — like rebuilding in harm’s way and relying heavily on gray infrastructure such as levees and sea walls.

The challenge and opportunity provided by the upcoming flood of federal dollars for pre-disaster hazard mitigation activities call for a much more systematic and disciplined approach. FEMA and other federal funders should be pushing states and other governmental entities to engage in planning so they can effectively deploy resilience dollars as soon as they become available. They also should be encouraging their grantees to adopt a variety of additional reforms, including increased use of nature-based resilience strategies. Another measure is the development of metrics to measure and verify the effectiveness of resilience projects. Funders also need to provide a clearinghouse and mapping services to help grantees understand their vulnerabilities and learn from the experience of others, while ensuring that all communities — especially disadvantaged ones — have a strong voice in deciding how hazard mitigation dollars are spent. Finally, agencies need to expand private-sector support for these strategies.

It is intuitively obvious that states that have thought ahead and developed sound plans to protect coastal resources will be better positioned to take advantage of FEMA’s BRIC grants and other funding opportunities. Experience backs up this supposition. Louisiana, for example, has developed a comprehensive Coastal Master Plan that provides a science-based frame for directing investments. The plan includes innovative nature-based strategies for enhancing coastal ecosystems and reducing flood and sea-level rise risks.

Louisiana has turned to its CMP to effectively direct funding to well-planned projects, including multiple million-dollar resilience projects that are being funded from the Deepwater Horizon oil spill settlement. For example, Louisiana is building sediment diversions that will redirect silt and sand being carried down the Mississippi River to rebuild protective marshes in areas like the Barataria Bay and Breton Sound. Other gulf states that did not have mature state plans in place had to scramble to come up with appropriate projects for Deepwater Horizon settlement funds.

Although states and local governments must develop hazard mitigation plans to receive FEMA funding, they should be encouraged to develop more comprehensive climate resilience plans — and governments with such plans should be prioritized for federal funding. Federal agencies should also provide technical assistance and guidance to help states and communities develop robust plans and design resilience projects that can be implemented quickly in the aftermath of a disaster. Such plans should consider future climate risks and identify strategies for building long-term resilience. Plans also should consider risks to both the natural and built environments and identify ways of preserving and restoring natural assets as a means of reducing risks and enhancing resilience for both people and wildlife. Needless to say, states with robust resilience plans will be better able to quickly deploy scarce resources and deliver more cost-effective projects that generate multiple benefits for vulnerable communities.

The Obama administration’s launch of a $1 billion National Disaster Resilience Competition after Hurricane Sandy provides evidence that supporting state planning and project design efforts can pay dividends. The Rockefeller Foundation brought in a range of experts who trained and helped state and local applicants develop innovative resilience projects that addressed multiple community challenges. In New Orleans, for example, the city is reintegrating nature, using green infrastructure approaches to improve stormwater management, reduce flooding, and create recreational amenities in underserved neighborhoods.

States should be encouraged to leverage other funding sources and take regulatory and incentive-based approaches to complement federal investments. FEMA and other federal agencies should consider using the carrot of federal funding to incentivize state and local governments to adopt proactive approaches. Some states already are putting in place new programs that consolidate disparate funding sources and operational capabilities to focus on resilience priorities. For example, South Carolina recently passed legislation to create an Office of Resilience to coordinate planning activities, and a Resilience Revolving Fund that will combine state and federal resources and support investments in floodplain buyouts and restoration efforts that reduce flood risks in communities. FEMA should use its BRIC program to reward states that put some of their own skin in the important resilience game.

Another important step is to advance nature-based approaches. FEMA and other federal agencies making coastal resilience grants should insist that states propose the deployment of natural infrastructure projects in concert with, or as an alternative to, gray infrastructure. The federal government should explicitly acknowledge and credit the additional benefits that typically accompany such nature-based solutions. By signaling its preference, the feds will encourage states to take such projects seriously.

To achieve this end, we need to remove outdated federal rules that present roadblocks to natural infrastructure approaches. Although FEMA recently changed its policy to allow for the inclusion of ecosystem service benefits in required benefit-cost analyses — with the specific goal of “allow[ing] for easier inclusion of nature-based solutions into risk-based mitigation projects” — that is easier said than done. Ecosystem service benefits are notoriously difficult to monetize. Modeling that is out of reach for most applicants is often needed to demonstrate the risk-reduction benefits of nature-based projects. Even then, without more aggressive retooling of benefit-cost expectations and discount rates, federal agencies are likely to continue to overvalue environmentally harmful short-term solutions like shoreline armoring while not fully accounting for the environmental and social benefits delivered by natural solutions.
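A simple net-present-value comparison, with invented figures, shows how discounting conventions can tilt the scales. In the sketch below, a gray project delivers flat protection benefits from day one, while a nature-based project starts smaller and grows as the habitat matures; which one “wins” depends heavily on the discount rate.

```python
# An invented benefit-cost comparison illustrating the discounting point.
# A "gray" sea wall yields constant flood-protection benefits; a
# nature-based project starts smaller but its benefits grow as habitat
# matures and ecosystem services accumulate. All figures are hypothetical.

def npv(benefit_stream, upfront_cost, rate):
    """Net present value of annual benefits against an upfront cost."""
    pv = sum(b / (1 + rate) ** t for t, b in enumerate(benefit_stream))
    return pv - upfront_cost

YEARS = 50
gray = [10.0] * YEARS                          # $10M per year, constant
green = [6.0 + 0.3 * t for t in range(YEARS)]  # grows as habitat matures

for rate in (0.07, 0.03):
    print(f"rate {rate:.0%}: gray NPV {npv(gray, 80, rate):6.1f}, "
          f"green NPV {npv(green, 80, rate):6.1f}")
```

With these assumed numbers, the gray project narrowly wins at a 7 percent rate, while the nature-based project wins comfortably at 3 percent — exactly the dynamic described above.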

One way to address this problem is for FEMA and other federal agencies to fund demonstration projects that test the efficacy of these approaches while providing case examples of monetized ecosystem service benefits, meanwhile overhauling benefit-cost rules that present roadblocks. For example, FEMA could set aside a significant percentage of its mitigation funds for natural infrastructure projects, similar to the green project reserve that directs 20 percent of certain water infrastructure funding to green stormwater management approaches. This would help demonstrate the effectiveness of nature-based approaches for reducing flood risks and also build capacity at all levels of government to design and implement natural infrastructure projects.

When tens of millions are being invested in improving coastal resilience from climate and other impacts, federal funders and local communities alike need to know that taxpayer dollars are being spent wisely. Accordingly, it is important to identify and estimate the social, economic, and environmental benefits that may be associated with various coastal resilience strategies and then to evaluate, over time, whether projects are delivering the expected benefits.

As noted above, this exercise can require the development of sophisticated methodologies that credit the “natural capital” benefits that may accrue from nature-based coastal solutions (such as improved water quality, fishery and avian benefits, blue carbon, etc.), as well as social and economic benefits that may be tied into community preferences and job opportunities. With the assistance of federal scientists, NGOs, academic centers, and grantees, FEMA and other federal agencies need to develop methodologies and metrics that can be used to measure and verify the full range of potential project benefits.

Significant attention also needs to be paid to monitoring project performance. FEMA’s soon-to-be well-funded BRIC program is well-positioned to provide money for the deployment of monitoring protocols for resilience projects. Other federal authorities engaged in approving resilience projects, like HUD, Interior, and NOAA, need to be doing the same. Monitoring data and adaptive management experience must then be made available to agencies at all levels to identify gaps, learn from experience, inform investment and reforms, and establish best practices that can be scaled and replicated.

While the federal government should not use its funding leverage to dictate state decisionmaking, the feds can, and should, use their unique vantage to disseminate information about successes and failures of resilience planning and execution practices all around the United States. This will help planners identify best practices and benchmark their projects.

The federal government can contribute this valuable service by building upon approaches like the U.S. Climate Resilience Toolkit and NOAA’s Digital Coast. The concept would be to create a comprehensive adaptation clearinghouse that catalogues and provides benefit-cost and performance data regarding hundreds of coastal protection (and other climate-related) resilience projects. Such a clearinghouse would include detailed information about projects in an accessible format that would enable users to efficiently gather and test out ideas that have been tried in different jurisdictions. It also would facilitate direct contacts with state sponsors and project managers, obviating the frustrating sense that jurisdictions are being left to reinvent the wheel on their own.

As a helpful step for affected agencies and communities, the feds should augment a climate adaptation clearinghouse with a centralized GIS-based mapping service that pulls together data so that states and other interested governments and individuals can evaluate the vulnerability of their communities as well as explore resources in their regions that can reduce climate-related impacts.

In doing mitigation right, it is vital that communities have a voice in decisionmaking. Failures to meaningfully engage with neighborhoods regarding pre-disaster planning and post-disaster responses have led to disparities in recovery outcomes for low-income communities and communities of color. Research has shown that communities of color often never fully rebound after experiencing impacts from extreme weather and are often left worse off than their White neighbors, who have an easier time accessing aid and receive higher disaster payments.

FEMA and other federal agencies involved in funding hazard mitigation planning have a responsibility to provide the resources that communities need to fully consider socioeconomic vulnerabilities and to direct investments to frontline communities that face the greatest threats from climate impacts. To ensure that marginalized communities have a voice in decisionmaking, agencies should provide sustained funding for community-based organizations that can lead planning processes and support meaningful engagement between government decisionmakers and residents.

Louisiana’s Strategic Adaptations for Future Environments initiative provides an example of this type of effort. LA-SAFE is implementing a range of innovative coastal resilience projects that were selected after an extensive, year-long community engagement process that was co-designed and directed by residents of six coastal parishes that were hard hit by Hurricane Isaac in 2012. These community-driven projects were only made possible because of significant funding from HUD and the support of regional philanthropy and academic and nonprofit partners.

Thus, FEMA’s BRIC program and other federal sponsors should provide funding that enables underserved communities to participate fully in hazard mitigation planning and implementation. More generally, Congress should consider increasing the federal cost share for mitigation and resilience investments in economically underserved communities to ensure that resources reach the most at-risk places. Economically distressed and tribal communities often struggle to raise the required match, which limits their ability to leverage federal funds for important investments. While FEMA mitigation programs offer a 90 percent federal cost share for smaller rural communities, larger economically distressed communities still must raise a 25 percent match. Congress should extend the more favorable federal cost share to economically disadvantaged communities of any size.
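The difference is substantial. On a hypothetical $10 million project, a small rural community eligible for the 90 percent federal share would need to raise $1 million locally, while a larger economically distressed community facing the standard 25 percent match would need to raise $2.5 million, two and a half times as much.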

Siloed decisionmaking at the federal level also limits the ability of states and communities to identify and implement holistic projects that comprehensively address community challenges. By coordinating funding, permitting, and environmental reviews, federal agencies can support better projects on faster timelines.

Experience confirms that this can be done. After Hurricane Sandy, technical coordinating teams were established to improve coordination among federal agencies administering disaster relief funds and their state counterparts. These teams helped align funding to support a more comprehensive disaster recovery in affected communities. Teams also were established to coordinate permitting across federal agencies and to work with project leads to help them navigate rules and more quickly advance projects. A similar model is being employed in the San Francisco Bay region to coordinate state and federal permitting agencies and speed implementation of restoration and resilience projects that are being funded through Measure AA — a regional parcel tax that is generating $25 million annually to support investments in ecological restoration around the San Francisco Bay.

State and local governments cannot do this work alone. The private sector, including philanthropy, nonprofits, academia, and businesses, also has important roles to play in supporting and enhancing public-sector efforts to build climate resilience. The federal government should support and encourage the private sector to dedicate resources and talent to collective efforts to address the climate crisis.

The Rockefeller Foundation’s collaboration with HUD and its support for the National Disaster Resilience Competition is one model. Conservation land trusts like the Katy Prairie Conservancy in the Houston region are also supporting flood resilience initiatives by acquiring flood-prone properties and restoring natural ecosystem functions. And nonprofits like Audubon are working with community-based partners to provide technical support and assistance that helps neighborhoods design natural infrastructure enhancing climate resilience for both people and wildlife. Audubon California, for example, is working with Shore Up Marin City, a multi-racial environmental coalition in a lower-income Marin County community, to design and restore tidal wetlands that will reduce flood risks while providing other environmental and recreational amenities. Federal programs should remove barriers to, and create incentives for, this kind of private-sector support of state and local resilience efforts.

With significant new resources flowing to efforts to reduce risks before disasters strike, the Biden administration has an important opportunity to ensure that state and local governments have the funding and capacities needed to create effective projects that will meet multiple community needs. By implementing the common-sense approaches above, the Biden administration can ensure that communities have the tools and resources needed to build a better future. TEF