Alternative Methods Weaken Protection
Author
Jennifer Sass - Natural Resources Defense Council

Federal agencies have invested heavily in the development of “new approach methodologies”—NAMs for short—to help set exposure limits, health advisories, and cleanup policies for toxic chemicals in air, water, soil, and consumer products. The 2016 amendments to TSCA direct EPA to encourage the “use of scientifically valid test methods and strategies that reduce or replace the use of vertebrate animals.” This is where NAMs come into play.

But such novel assays must “provide[] information of equivalent or better scientific quality and relevance that will support regulatory decisions under this title.” So far, except for a limited number of acute toxicity endpoints, most NAMs remain unvalidated for determining important health effects and have not yet met the TSCA criteria for reducing or replacing animal studies.

Public interest groups have serious concerns with EPA’s shift from reliable rodent toxicology methods to unproven NAMs, and with the agency’s failure to adopt clear policies that follow scientific best practices in the regulatory use of these methods.

Rodent studies yield an exceptional return on cost by providing robust information to inform risk assessment and prevent future harm. Established policies and guidance, which have undergone extensive peer review and public comment, support the use of these tests. EPA’s NAMs Work Plan notes, “The scientific confidence associated with the traditional toxicity tests comes from the decades of experience in their development and application.”

So EPA needs a clear, legally defensible framework laying out how it will use NAMs, one that meets scientific best practices, accounts for data gaps and uncertainties, and supports replicable evaluations. But we have some questions. How will the agency use NAMs to find health risks? How many positive test results should support a toxicity finding? Will tests count equally or be weighted? What about marginal results?
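To make these questions concrete, the sketch below (in Python) is a purely hypothetical illustration of the kind of decision rule such a framework would have to pin down. The assay names, weights, and thresholds are invented and are not EPA’s; the point is only that the same battery of NAM results can support, or fail to support, a hazard finding depending on whether tests are counted equally or weighted, and on how marginal results are treated.

```python
# Hypothetical illustration only -- not EPA's framework. Assay names,
# weights, and thresholds are invented to show what a decision rule
# must specify before NAM results can be used reproducibly.
from dataclasses import dataclass

@dataclass
class AssayResult:
    name: str
    outcome: str   # "positive", "marginal", or "negative"
    weight: float  # relevance weight a framework would have to justify

BATTERY = [
    AssayResult("receptor_binding", "positive", 1.0),
    AssayResult("cell_viability", "negative", 0.5),
    AssayResult("neurite_outgrowth", "marginal", 2.0),
    AssayResult("zebrafish_behavior", "positive", 1.5),
]

def equal_count_call(battery, positives_needed=2, count_marginal=False):
    """Count every assay equally; a fixed number of positives triggers a hazard call."""
    hits = sum(1 for a in battery
               if a.outcome == "positive"
               or (count_marginal and a.outcome == "marginal"))
    return hits >= positives_needed

def weighted_call(battery, threshold=3.0, marginal_credit=0.5):
    """Weight assays by relevance; marginal results earn partial credit."""
    score = sum(a.weight * (1.0 if a.outcome == "positive"
                            else marginal_credit if a.outcome == "marginal"
                            else 0.0)
                for a in battery)
    return score >= threshold

print("equal counting:", equal_count_call(BATTERY))                  # True (2 positives)
print("weighted, marginal = half credit:", weighted_call(BATTERY))   # True (score 3.5)
print("weighted, marginal ignored:",
      weighted_call(BATTERY, marginal_credit=0.0))                   # False (score 2.5)
```

Which of those answers is the right one is precisely what a transparent, peer-reviewed framework would have to settle in advance.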

The agency’s recent proposal to weaken its hazard evaluation of acephate—a pesticide derived from nerve gas—illustrates alarming flaws with its use of NAMs. Based on mixed positive and negative indicators in the tests, the agency proposes to eliminate the default child protection factor mandated by the Food Quality Protection Act, and rule out developmental neurotoxicity. This will surely increase harmful exposures to acephate, previously determined by EPA to be too toxic to use.

Yet, NAMs often yield false negatives and exclude various factors affecting toxicity, including metabolites, microbiomes, epigenetic effects, and hormonal effects. EPA’s Children’s Health Protection Advisory Committee and international science policy experts, including at the OECD, have cautioned against using NAMs to weaken protections.

EPA must ensure its use of NAMs will provide adequate and reliable data for chemical hazard assessments and achieve the same or greater health protection as rodent studies. The International Agency for Research on Cancer is a good model for integrating mechanistic information into its chemical evaluations, not using it to rule out hazards or weaken health protections.

EPA’s use of NAMs only to exclude, not to affirm, hazards means chemicals triggering concerns would still require more testing, including on animals. If the agency is serious about reducing animal testing, it could regulate to restrict toxic chemicals and advance health protective policies with existing information. For example, it could restrict classes of related chemicals instead of single chemicals, use adjustment factors for data gaps and potential false negatives, and eliminate unnecessary uses of hazardous chemicals.

At present, discussions of NAMs are limited to a narrow set of stakeholders, sidelining the public and impacted communities from the important debate over how to assess and regulate chemicals. Yet, the continued prevalence of dangerous pollution is rooted in policies that reflect systemic racism and classism. Addressing these realities when evaluating NAMs and other tools for determining chemical risks is essential for effective public health-centered policies.

Regulatory Toxicology 2.0
Author
Sherman McFarland - Johns Hopkins University Bloomberg School of Public Health
Rebecca Critser - Johns Hopkins University Bloomberg School of Public Health
Paul Locke - Johns Hopkins University Bloomberg School of Public Health
Lab Report

Scientific information about the toxicity of chemicals and radiation plays a central role in environmental protection. Key statutes that protect human health and ecology—the Clean Air Act, the Clean Water Act, the Safe Drinking Water Act, the Toxic Substances Control Act, the Federal Food, Drug, and Cosmetic Act, and the Federal Insecticide, Fungicide, and Rodenticide Act—demand scientifically credible data on the toxicity of pollutants and commercial products. These data are used for both setting regulatory standards and backing enforcement actions. Federal agencies’ ability to protect humans and the environment is dependent on this one-two combination. Laws and their implementing regulations provide authority to control hazards, and science provides the evidence about when to act.

Regulators usually rely on toxicity studies, especially those using animals, for this information. Many of these studies use rodents (purpose-bred mice and rats), but other species are also employed, including dogs, cats, rabbits, guinea pigs, hamsters, birds, fish, and non-human primates such as macaques. In particular, TSCA and FIFRA are data hungry, and many animal studies are conducted to satisfy their regulatory needs.

Toxicity studies using animals and the field of regulatory toxicology first came into widespread use in the early 1940s, after consumer and medical products had led to several tragedies involving death or disfigurement. One of those products was an antibacterial drug designed for children called elixir of sulfanilamide. To enhance ease of administration, the manufacturer added a raspberry flavor to, and prepared a liquid version of, the drug. Because it is difficult to get sulfanilamide into liquid form, it was dissolved using 70 percent diethylene glycol, which is poisonous. More than 100 children lost their lives. A consumer product called Lash Lure also led to loss of life and blinding. Lash Lure was an aniline-based compound that was marketed to beauty parlors as a dye for eyebrows and eyelashes. When applied to human skin, aniline compounds can cause severe allergic responses, and at least one death and several cases of blindness were reported due to Lash Lure.

The loss of life and health for which these products were responsible led Congress to pass laws that required pre-market testing. The scientific community rose to the challenge and developed tests that could be used to evaluate drugs and consumer products. Later in the 20th century, when Congress began to pass environmental laws, legislators demanded pre-market testing of some new compounds (such as pesticides in FIFRA) and granted EPA the authority to require post-market testing of some compounds already on the market (such as in TSCA). This was regulatory toxicology 1.0.

Today, the need to test environmental chemicals for hazard and risk is much greater. A more nuanced and sophisticated view has emerged about how humans are impacted by the environment. Thanks in large part to the above laws, there are few, if any, immediate deaths or illnesses caused by exposure to environmental chemicals and radiation; however, there is strong evidence that chronic exposure to less-than-lethal levels of environmental chemicals damages health and diminishes quality of life. In addition, studies in human populations have shown that environmental exposures can combine with lifestyle factors such as active smoking, poor diet, stress, and legal and illicit drug intake to degrade well-being and welfare.

Consequently, more data is needed on a greater number of chemicals and over a broader range of health endpoints. While many environmental risk assessments are focused on cancer, there is an increasing recognition that neurological endpoints such as Parkinson’s and Alzheimer’s disease, as well as cardiovascular diseases, are associated with environmental factors. Cumulative exposures to several contaminants also create health problems that need to be studied. Unfortunately, modern data needs far outstrip data production. The current system of regulatory toxicology and related testing must evolve if it is to meet demand. This evolution must start with a fresh way of addressing data gaps and needs, acknowledging that a transition must be made, and charting a course for that evolution. To demonstrate progress made and the challenges that confront us, it is useful to examine in detail two studies from the National Academies of Sciences, Engineering, and Medicine published 16 years apart.

NASEM studies are frequently commissioned by federal agencies to shed light on difficult scientific issues. The study committees bring together experts who review the literature, take testimony from other experts, and prepare a consensus report. Of the two reports discussed below, the first was commissioned by EPA and the second was commissioned by the National Institutes of Health at the direction of Congress.

As amended in 2016, TSCA is the primary U.S. law that regulates chemicals in commerce. One of the major criticisms of the original TSCA was that it was not effective at generating the toxicological knowledge that EPA, citizens, and the business community needed to make decisions about the hazards of chemicals commonly manufactured. It has been estimated that of the approximately 85,000 chemicals in commerce, we have solid toxicological information on fewer than 1,000.

In 2007, in part to address this toxics ignorance gap, NASEM prepared the first of the two reports, entitled “Toxicity Testing in the 21st Century: A Vision and a Strategy.” The report, usually known for short as TT21C, is a detailed critique of the current animal-centered toxicity testing paradigm. It sets out a plan for how the toxicity testing system should change to meet data and decisionmaking needs. It recommends that the current testing paradigm be changed in a way that allows for the development of improved predictive, human biology-based science that supersedes what can be gleaned from animal studies.

TT21C reached several conclusions. First, the report pointed out that animal studies are time consuming and expensive. Second, it noted that animal studies are not always predictive of human responses. Third, looking ahead, it called for toxicity testing for regulatory purposes to be built around the use of in vitro systems that use human cells and tissues. Last, it outlined how and why systems biology—a scientific approach that integrates information from cells, tissues, organs, and population studies—and pathways of toxicity will provide better science for regulatory decisionmaking.

Since the report’s release, EPA has expended considerable effort to make its vision and strategy a reality. These efforts have been spurred, in part, by changes in the law, including provisions in the amended TSCA that explicitly address new methods and encourage the use of non-animal alternatives when scientifically appropriate. New TSCA also requires the agency to publish a work plan to reduce the use of vertebrate animal testing and to increase the use of what the agency calls “new approach methodologies,” or NAMs, which it defines as technologies, methods, approaches, or combinations of these techniques that can provide information on chemical hazard and risk assessment to avoid the use of live-animal testing. EPA also publishes a list of available NAMs. At the time this article was written, that list contained about 30 tests across a variety of endpoints such as skin and eye damage, and a few for endocrine disruption. These methods continue to be developed and this list will expand.

The TT21C report, the changes incorporated into TSCA in 2016, and new in vitro and in silico technologies are catalysts to increase the pace of change in the practice of regulatory toxicology. A 2020 review paper by Daniel Krewski, the former chair of the TT21C committee, and several committee members concluded that “overall, progress on the 20-year transition plan laid out . . . in 2007 has been substantial. Importantly, government agencies within the United States and internationally are beginning to incorporate the new approach methodologies envisaged in the original TT21C vision into regulatory practice.” 

The 2016 TSCA rewrite contains provisions that specifically encourage the use of non-animal alternatives in place of vertebrate animals; in conjunction with improvements in scientific technique, these developments have led to progress. Because of TT21C and the provisions in the amended TSCA, EPA has made a good start on replacing vertebrate animals in its risk assessments. However, as the next NASEM study demonstrates, considerable work remains before we can replace animals completely.

Since the beginning of the COVID-19 pandemic, the scientific community has complained that there is a shortage of non-human primates, or NHPs, for research. The United States uses about 70,000 NHPs per year for research on the brain, infectious diseases, and aging. China stopped exporting NHPs in 2020, contributing to a shortage in the United States, which at that time imported 60 percent of its NHPs from China. In particular, the United States experienced a 20 percent drop in imports of long-tailed macaques, a species used by private industry for drug and vaccine research. The second of the two NASEM reports, commissioned by the National Institutes of Health and published in 2023, is “Nonhuman Primates in Biomedical Research: State of the Science and Future Needs.” It points out that “although nonhuman primates (NHPs) represent a small proportion—an estimated one-half of 1 percent—of the animals used in biomedical research, they remain important animal models due to their similarities to humans with respect to genetic makeup, anatomy, physiology, and behavior. Remarkable biomedical breakthroughs, including successful treatments for Parkinson’s and sickle cell disease, drugs to prevent transplant rejection, and vaccines for numerous public health threats, have been enabled by research using NHP models. However, a worsening shortage of NHPs, exacerbated by the COVID-19 pandemic and recent restrictions on their exportation and transportation, has had negative impacts on biomedical research necessary for both public health and national security.”

On November 16, 2022, the Department of Justice announced an indictment of eight people for smuggling long-tailed macaques into the United States, and for conspiracy to violate the Lacey Act and Endangered Species Act. Two of those who were indicted for smuggling and conspiracy were officials of the Cambodian Forestry Administration. Long-tailed macaques are an endangered species. The indicted persons allegedly removed wild, long-tailed macaques from national parks and other protected areas in Cambodia, then took the macaques to breeding facilities where they were provided fraudulent export permits, which falsely stated that the macaques were bred in captivity. As a result, Cambodia stopped exporting NHPs, which is expected to significantly impair drug development in the United States. In fact, as of September 2022, Cambodia accounted for 60 percent of American imports of NHPs. Additionally, Inotiv, the United States’ largest commercial monkey dealer, decided to halt sales of all Cambodian NHPs in its possession in the United States. Interestingly, in a report filed with the Securities and Exchange Commission, Inotiv stated that, following January 13, 2023, it “has shipped a select number of its Cambodian NHP inventory; however, [Inotiv] is not currently shipping Cambodian NHPs at the same volumes that it was prior to the [indictment].” In that same report, Inotiv expressed that it “expects to establish new procedures before it will resume Cambodian NHP imports.”

Further contributing to the NHP research shortage is the issue of transportation. Since 2000, an increasing number of airlines have refused to transport NHPs for use in research, primarily in response to public pressure. In 2018, the National Association for Biomedical Research filed a complaint against the airlines with the Department of Transportation. As of 2022, DOT had not ruled on the case, and many believe its prolonged silence amounts to a rejection of the complaint. Consequently, the number of NHPs available for research in the United States has declined drastically in recent years. Considering the airline transport factor along with the reality that Chinese and Cambodian NHPs are unavailable to the United States, it is important that the federal government treat the shortage as an opportunity to advocate for and dramatically increase funding for non-animal test methods. Funding directed this way will enable the United States to replace NHPs more rapidly in research and drug development.

To answer questions about the need for, and value of, NHPs in research, Congress directed NIH to sponsor a NASEM study. The 2023 report explores the state of biomedical research using NHPs and their future roles in NIH-supported research. It also assesses the research and development status of new approach methodologies, such as in vitro and in silico models, and their potential role in reducing the use of NHPs.

The new report reached the following conclusions. Research using NHPs has contributed to numerous public health advances, and continued research using NHPs is vital to the nation’s ability to respond to public health emergencies. More NHPs will be needed in the future. The shortage has gotten worse, and without financial and other support, the ability of the NIH to respond to public health emergencies will be severely limited. Based on the current state of NAMs, no alternative approaches can totally replace NHPs, although select NAMs can replicate certain complex biological functions, and it is reasonable to be optimistic that replacement might become possible in future years. To reduce reliance on the animals, additional resources are needed, along with a collaborative effort between those developing NAMs and those who currently use NHPs, and a plan and process to validate these new NAMs.

This report does endorse a strategy to acquire more NHPs for scientific research. While that is where much of the news reporting has focused, there are important lessons about the need to build NAMs that can replace the use of NHPs. The report makes it clear that NIH and researchers should not view the shortage of NHPs available for research as a problem that needs to be fixed simply by acquiring more NHPs. As discussed, there are substantial roadblocks to acquiring a greater supply, and several of them, such as issues associated with transportation from one nation to another, are beyond the control of the scientific community. Therefore, it makes sense to view the NHP shortage through a different lens—as an opportunity to enable research by heavily encouraging the use and development of, and dramatically increasing funding for, human-relevant, non-animal test methods.

Despite its general support for the use of NHPs, one of the conclusions reached in the NIH report was that even though there are currently no alternatives that can fully replace NHPs, there is reason to be optimistic because new approach methodologies continue to advance rapidly. In addition, the study concluded that “development and validation of new approach methodologies (in vitro and in silico model systems) is critically important to support further advances in biomedical research. This may reduce the need for nonhuman primate (NHP) models in the future, and/or enhance their utility. Additionally, this may help to mitigate shortages in NHP supply and the high cost of NHP research.”

As non-animal test methods such as organs-on-a-chip and organoids are developed, refined, mass-produced, and become affordable for U.S.-based laboratories to use, American research can progress without being hampered by a lack of NHPs, or a reliance on the countries that supply them. Furthermore, studies indicate that, compared to traditional regulatory toxicology tests, non-animal test methods are cheaper, can be performed faster, are reliable, and do not subject animals to confinement and suffering. Moreover, the shortage of NHPs should induce the federal government to fund and promote the use of non-animal test methods that are available to laboratories, including stem cells, in silico models, and microdosing.

As noted above, the NIH report states that NHPs “remain important animal models” and that the NHP shortage “has had negative impacts on biomedical research necessary for both public health and national security.” While these problems could be addressed in the short term by increasing the NHP supply, a long-term solution must focus on the underlying science and on how to obtain comparable information without NHPs. As this report notes, NAMs are not sufficiently available to completely replace animal models. To reduce reliance on NHPs, additional resources are needed including a process for NAM validation.

Taken together, these reports show a way to improve regulatory toxicology, increase scientific knowledge, and substantially reduce animal use. The TT21C report created a movement at EPA toward the discovery and deployment of NAMs in the TSCA program. During the past 16 years, progress has been made in moving away from vertebrate animal testing toward a regulatory toxicology regime that is more human-centric, faster, more reliable, and potentially more useful for EPA risk assessments. Provisions added to TSCA in 2016 catalyzed EPA’s movement toward NAMs by requiring an agency workplan and a list of acceptable NAMs. On the other hand, the NIH report demonstrates some of the barriers and challenges that remain in completing the transition to the vision outlined in TT21C. While the NIH report endorses the value of NHP research and recommends increasing the supply for science, its discussion and analysis of NAMs should not be overlooked. Although replacing NHPs might not be possible in the short term, it should be a long-term goal. To achieve it, federal government agencies and Congress should consider taking the following actions.

First, there must be an investment of resources in the development of NAMs, especially for more complex health endpoints. EPA’s NAMs list is populated mostly with tests for simpler biological endpoints. While important, the real challenge will be to develop NAMs for complex endpoints such as reproductive and developmental impacts, cancer, and neurological diseases. Models for some of these endpoints—such as brain organoids—are in use now and have provided important insights. For example, a brain organoid model was used to demonstrate that COVID-19 could infect human brain tissue. This is consistent with the current state of the science, as described in the NIH report, which notes that NAMs have provided useful information but cannot yet fully replace NHPs in research. One possible way forward is to dedicate a small portion of the NIH budget to NAMs. A 2 percent solution would draw researchers to the field. Training on how to use NAMs should also be a target of this funding.

Second, Congress should consider applying the TSCA NAMs approach to other statutes so that more federal agencies are required to mount a NAMs program. While such a program could take different forms depending on the agency and its mandates, two key federal organizations should be prioritized. As the dominant research funder in the nation, NIH should be required to develop a strategic plan for NAMs. Although NIH is not primarily a regulatory body, its funding supports thousands of researchers throughout the country, including tens of thousands of students. Part of its strategic plan could be a list of NAMs used by those it funds, and an assessment and description of these NAMs. Bills have been introduced in Congress that seek to prod NIH on alternatives. For example, in 2021 Representative Vern Buchanan (R-FL) and the late Alcee Hastings (D-FL) introduced the Humane Research and Testing Act. Among other things, this bill would have established a new institute dedicated to NAMs housed under NIH. Another important agency is the Food and Drug Administration, which could play a much larger role in the transition to NAMs. FDA published a predictive toxicology roadmap in 2020 that calls out NAMs as promising new technologies. Last year, new legislation was enacted called the FDA Modernization Act 2.0. This law removed a requirement for animal testing in new drug protocols and replaced it with language that allowed FDA to consider the best science. Despite these changes, though, the FDA has yet to accept data from a non-animal alternative in the drug development process. Congress could require FDA to establish a work plan for alternatives to accelerate their use in the agency’s regulatory processes.

Third, federal agencies must clearly explain how NAMs should be validated and, for regulatory agencies, must outline the steps to be taken so that NAMs can be listed as satisfactory-for-use in decisionmaking. The NIH report touches on some of these issues. In chapter 4, the report explains the concepts of context-of-use, qualification, and validation. Validation is of particular importance. It is explained in the NIH report as “the process by which the reliability and relevance of a technology or approach is established for a defined purpose using specific criteria,” which is adapted from the definition of validation provided by the Organization for Economic Cooperation and Development. Once validated, a methodology is deemed to have been blessed by an agency as one that produces robust and reliable data. This conceptual explanation needs to be transformed by regulatory agencies into a procedure that NAM developers can follow. A method reaches regulatory acceptance if it is routinely used by the agency for decisionmaking. Without clarity on how these concepts are operationalized, however, it will be extremely difficult to design NAMs for decisionmaking.

Regulatory toxicology 1.0 emerged after it became clear that the nation needed a way to assess the dangers of exposure to chemicals and radiation. It served its purpose well in the 20th century, but it is now time to recognize that the environmental protection challenges of the 21st century require more. Regulatory toxicology 2.0 is evolving. It is centered on human biology and uses far fewer non-human animals. Further, it is potentially less expensive and faster, and it holds the promise of being more predictive and protective of human and environmental health. The two NASEM reports discussed here capture key features of 21st century regulatory toxicology, as well as demonstrate current gaps and how to address them.

LAB REPORT Chemical testing must evolve if it is to meet growing demand for data. A new system centered on human biology will harm fewer animals and be less expensive, faster, and more predictive. Testing using non-animal, human-relevant methods would also better protect public and environmental health.

Forever Chemicals: Wherein Law, Science Confront the Same Problem
Author
Craig M. Pease - Former Law School Professor

Last fall, the U.S. Supreme Court denied cert in Abbott v. DuPont, litigation involving cancer and perfluorooctanoic acid, or PFOA. That chemical is a PFAS, one of a class of compounds used in non-stick and water-repellent goods and known as “forever chemicals.” Considered narrowly, Abbott is an insignificant law-and-science backwater; the issue at bar was nonmutual offensive collateral estoppel, and it had a rather idiosyncratic posture.

Abbott’s legal and scientific genealogy traces back to the Leach v. DuPont class action, alleging defendant’s release of PFOA from its Parkersburg, West Virginia, facilities along the Ohio River caused cancer and other diseases in nearby residents. In their 2005 settlement, the parties agreed to sponsor further epidemiological studies.

The science panel so appointed completed its work in 2012, finding a link between PFOA and kidney and testicular cancer. For most diseases, it “[did] not find a probable link.” Those no-link conclusions seem reasonable enough. Even so, one needs to be cautious about extrapolating them more broadly:

The study sample size of about 30,000 seems large and adequate. Yet, as is typical in such epidemiological studies, the number of people in the sample with any particular disease was often quite small. For example, the birth defect studies had positive-result sizes of 7, 12, and 449. And those 449 positives relied on reconstructing blood PFOA levels, rather than measuring them directly.

Beyond that, the science panel made only passing reference to a smattering of evidence from genetics, and cell and developmental biology. Yet knowledge of toxin mechanisms of action has continued to burgeon in the decade since the Leach studies, and is every bit as informative as epidemiological evidence.

The courts have consolidated the kidney and testicular cancer cases from the original class action into multi-district litigation, a so-called MDL. The Abbott action arose out of an MDL settlement carve-out.

Viewed more broadly, no class action or MDL toxics case is insignificant. Going back decades, tort defendants have a solid record of successfully defending class actions by challenging class certification; consider tobacco and cancer litigation. MDL has therefore shouldered much of the burden: hundreds of thousands of toxics cases, consolidated in various MDLs, are currently pending in federal courts, including asbestos, glyphosate (RoundUp), and PFAS cases.

In class actions and MDL, the judiciary must somehow efficiently resolve large numbers of somewhat similar cases. Both devices accomplish this by grouping cases, so that each case does not have to be decided entirely on its own merits. When a court certifies a class or applies collateral estoppel, it decides how narrowly, or broadly, to group those similar cases.

Yet at the very core of the PFOA science are decisions, made by scientists, to group data, or not. Testicular cancer, for example, is not one single disease, but rather consists of various cancer subcategories and stages. And testicular cancer is itself only a small part of broader disease categories like cancer or even cell growth disorders, which might reasonably be construed to also encompass birth defects.

Decisions about how narrowly or broadly to group data have a huge impact on whether science reports a link between PFOA and disease. Group too narrowly and, because the sample sizes become too small, no pattern will ever be statistically discernible. Group too broadly and a cause-effect relation that is actually present risks being swamped by dissimilar cases. This is in no way a criticism of the Leach science panel, which made such decisions consistent with standard practice.
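To see the trade-off in numbers, here is a minimal simulation sketch in Python. The cohort sizes, baseline disease rates, and the doubling of risk with exposure are invented assumptions, not Leach panel data, and the significance test is a crude two-proportion normal approximation used only for illustration. The same true effect is statistically invisible when the grouping is too narrow (a handful of cases) or too broad (the signal is diluted by unrelated cases), and detectable in between.

```python
# Illustrative sketch only -- all rates and cohort sizes are invented,
# not Leach panel data. Two cohorts of 15,000 people (exposed vs.
# unexposed); a rare disease subtype whose risk truly doubles with
# exposure; and a broad, mostly exposure-independent category that the
# subtype can be pooled into.
import math
import numpy as np

rng = np.random.default_rng(0)
N = 15_000  # people per cohort

def two_sided_p(cases_exposed, cases_unexposed, n=N):
    """Crude two-proportion z-test (normal approximation), for illustration only."""
    pooled = (cases_exposed + cases_unexposed) / (2 * n)
    se = math.sqrt(2 * pooled * (1 - pooled) / n)
    z = (cases_exposed - cases_unexposed) / (n * se) if se else 0.0
    return math.erfc(abs(z) / math.sqrt(2))

def simulate(risk_unexposed, risk_exposed, label):
    cases_exp = int(rng.binomial(N, risk_exposed))
    cases_unexp = int(rng.binomial(N, risk_unexposed))
    p = two_sided_p(cases_exp, cases_unexp)
    print(f"{label:7s} exposed={cases_exp:4d}  unexposed={cases_unexp:4d}  p={p:.3f}")

# Narrow grouping: baseline 3 per 10,000; risk truly doubles with exposure,
# but the handful of resulting cases rarely reaches statistical significance.
simulate(0.0003, 0.0006, "narrow")

# Broad grouping: the same excess cases are pooled with a common category
# (baseline 5 per 100) unrelated to exposure, so the real signal is swamped.
simulate(0.0503, 0.0506, "broad")

# Intermediate grouping: enough cases to see the doubling, still dominated
# by the exposure-related disease.
simulate(0.003, 0.006, "medium")
```

Where to draw the category boundaries, with a fixed study population, is exactly the judgment the panel had to make.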

Scientists group data for exactly the same reason the law groups cases: science is expensive and scientists have limited resources, just as those constraints bind the judiciary. Superficially, law and science appear to be distinct inquiries, each with its own domain-specific knowledge, rules, and procedures. In one sense, that is true. Yet one lesson of this PFOA litigation is that the law does not—and indeed cannot—ever really circumvent hard decisions by passing them off to science.

When the law referred the Parkersburg PFOA data to the Leach science panel, the core problems of limited resources and the need for economic efficiency did not magically disappear. That referral entailed that those profound and unavoidable issues be resolved in substantial part by scientists, using the protocols of science, as the scientists made decisions about how to group the scientific data, or not.


Environment Silent Victim of War in Ukraine, for Decades to Come
Author
Stephen R. Dujack - Environmental Law Institute

One of the silent victims of the war in Ukraine is the environment. Artillery shells and rockets fired by both sides are exploding in the thousands every day, distributing pollutants into the soil, water, and air. Secondary releases include asbestos and other harmful substances from the industrial facilities and civilian buildings that Russian shells and rockets target. These armaments have leveled cities and critical infrastructure, including water and power supplies, and disrupted entire ecosystems. Tanks and other tracked vehicles chew up the countryside and take out trees that impede them. Lately, Ukraine’s tanks have been firing armor-piercing shells containing depleted uranium, a heavy metal that poisons the soil.

All told, munitions, targeting choices, and collateral damage in Ukraine will leave “a toxic legacy for generations to come,” in the words of the UN Environment Programme. “The mapping and initial screening of environmental hazards only serves to confirm that war is quite literally toxic,” says UNEP Executive Director Inger Andersen. “The environment is about people: it’s about livelihoods, public health, clean air and water, and basic food systems. It’s about a safe future for Ukrainians and their neighbors, and further damage must not be done.”

Russian troops have seized the Zaporizhzhia Nuclear Power Station in southeastern Ukraine, stoking fear among locals and the world community that the invaders may cause a deliberate nuclear crisis. Meanwhile, Russia has launched attacks from the facility, effectively weaponizing civilian nuclear energy generation.

And Russia is presumed to be behind the bombing last spring of the Kakhovka hydroelectric dam in southern Ukraine, an action that could run afoul of the law of war. It was “an outrageous act of environmental destruction that will have profound humanitarian consequences,” according to the U.S. State Department. “The Russian Federation’s willful disregard for civilian life and infrastructure has been a hallmark of its military campaign since the very beginning of its war on Ukraine.”

After the dam burst, “thousands of people were displaced by flooding from one of the world’s largest reservoirs, which was vital for irrigating farmland considered the breadbasket of Europe,” according to the New York Times. Adding to that problem is the fact that Moscow failed to renew the agreement that allowed grain vessels free passage to Ukraine’s Black Sea ports, pushing up global wheat prices and exacting an economic toll on the world’s poor, for whom grain is a dietary staple.

Worst of all is the possible use of nuclear weapons. Russian President Vladimir Putin has on several occasions since the invasion began threatened to deploy “tactical” atomic devices. NATO, including the United States, has carefully calibrated its support of the regime in Kyiv to not force Moscow’s hand. The use of a tactical nuclear bomb—a weapon designed for the battlefield, not to take out enemy cities and thereby force a surrender—would be a first in history. These weapons were developed by the United States starting in the 1950s for use in an all-out land invasion in Europe by the Warsaw Pact. The Soviets too had thousands deployed in Eastern Europe for a showdown with the West. The end of the Cold War eased tensions, and there had been no nuclear saber-rattling until Putin’s army got bogged down.

At this writing, Kyiv’s long-heralded counter-offensive is making little progress against dug-in defenses of seized territory in eastern Ukraine. The Russians have used thousands of mines to slow down any advance by enemy infantry and armor. The Anti-Personnel Mine Ban Convention proscribes the use of these devices, which are difficult to disarm, making land unusable for decades. Kyiv too has made use of mines, but only in a limited manner. Ukraine is a party to the mine convention; Russia and the United States are not. The convention does not bind either of the two non-signatories.

Also worrisome are U.S. shipments of cluster munitions to Ukraine. Cluster bombs break apart and release dozens of submunitions, which fly off in all directions and explode on their own. In addition to the immediate effect on enemy soldiers, many of the submunitions fail to detonate but are still live, creating an environmental danger for generations to come. Because of this threat, the Convention on Cluster Munitions came into force in 2010 and has been ratified by 110 countries—but not Ukraine, Russia, or the United States. Again, there is no binding law on the combatants here.

Experts predict it will take literally centuries for teams of sappers to rid Ukraine of these deadly munitions. According to the Washington Post, “President Biden had to invoke special national security grounds to bypass U.S. law that prevents the transfer of cluster bombs with a dud rate—that is, the share of unexploded bomblets that may remain in the ground—greater than 1 percent.” The newspaper notes that “current shipments are at a dud rate of 2.35 percent, though some experts believe the true rate is much higher.”

Even as the environmental damage grows, there are unprecedented efforts to document it. With the blessing of the UN General Assembly, more than 40 European states, the United States, Japan, and Canada are creating a “register of damage.” Stay tuned: the United States has pledged to rid Ukraine of unexploded cluster submunitions, and there will be more developments to hold Russia accountable for wrongful environmental damage.

Mea Culpa

“Something Was Messing With the Earth’s Axis. The Answer Has to Do With Us.” I have to take that headline in the New York Times personally, as a potentially responsible party more at fault and liable for damages than most.

“Accelerated melting of the polar ice sheets and mountain glaciers had changed the way mass was distributed around the planet enough to influence its spin,” according to the article. Worse, scientists have now “identified another factor that’s had the same kind of effect: colossal quantities of water pumped out of the ground for crops and households.” We are all part of the problem, of course, but what followed singled me out: “Scientists knew the planet’s centerline could move. But it took a sharp turn sometime around the start of the 2000s.”

You see, I had stood exactly at the Earth’s axis and had been witnessed giving it “a sharp turn.” It was also exactly at the start of the 2000s. I was visiting the South Pole to celebrate the millennium. It is the only place on Earth where you can select which time zone you want to bring in the new year, and do so in broad daylight. Indeed, all the time zones come together there, and if you want you can celebrate 24 times.

Surprisingly, there are ordinarily three South Poles to pick from, and on New Year’s Eve, when I arrived, for a few hours there are four. All are confusingly within yards of each other and the nearby housing, labs, and offices of the U.S. National Science Foundation.

The least important of the South Poles is the only impressive one. It is a three-foot-tall segment of barbershop-striped cylinder stuck in the ice, topped by one of those mirrored spheres sold as lawn ornaments. At a respectful distance around this odd monolith are flag poles displaying the banners of the principal members of the Antarctic Treaty. The problem is that it isn’t really at the South Pole; another lies a very short distance away and has a better claim to authenticity. Rather, it’s just the colorful spot you go to for the usual snapshots.

The real South Pole, the spot where all lines of longitude converge, is marked by two sections of pipe sunk in the snowpack about four feet apart. The pair hold a sign commemorating the arrival of the first humans to that spot. The longitudinal confluence lies vaguely between these two pipes. So it’s not a very satisfying marker.

And that lack of precision was not all that was wrong with this alleged South Pole: the Antarctic ice cap, which is 8,000 feet thick at that point, moves laterally by an inch a day. So this marker only lasts a few hours as the legitimate South Pole. And in fact there is an annual tradition to respot the pipes and sign on New Year’s Day.

We had arrived in our little Twin Otter plane eight hours before the millennium. We found a group of technicians using a highly accurate GPS device with a huge parabolic antenna to mark where the South Pole was at exactly that time. When they were done they pushed a foot-long bamboo stick into the snow, with a red ribbon tied to its top. It was indeed a weak reed for such a momentous occasion.

After midnight passed, as technicians moved the metal sign and its two pipes to straddle the bamboo stick, a physicist told me he understood my concern about not finding an accurate and impressive marker for the convergence of longitude. And then he showed me the fourth South Pole—more real than any of the others. It was twenty feet away and consisted of a foot-high tapered metal spike stuck in the snow. It marked the axis of the Earth’s rotation, which no longer coincides with the lines of longitude laid down centuries ago due in part to the wobble resulting from human activities.

I put my hand on the top of the spike and gave it a twist.

Notice & Comment is the editor’s column and represents his opinions.

A Striking Indicator That Climate Change Is in Progress

Over the past several Major League Baseball seasons, home run numbers have climbed dramatically, including Aaron Judge’s record-breaking 62 homers for the New York Yankees in 2022.

[This outburst has] prompted speculation from the media about the connection between climate change and home runs.

But while scientists . . . have shown that balls go farther in higher temperatures, no formal scientific investigation had been performed to prove that global warming is helping fuel baseball’s home run spree – until now. . . .

In our study, published in the Bulletin of the American Meteorological Society . . . we used data from over 100,000 Major League Baseball games and 200,000 individual batted balls, alongside observed game day temperatures, to show that warming temperatures have, in fact, increased the number of home runs. —The Conversation

The start of industrial-scale seabed mining to extract car battery metals from the bottom of the Pacific Ocean has been delayed after the international agency charged with overseeing the work concluded . . . that it needed more time to finalize mining rules. The action by the International Seabed Authority . . . came after pressure from environmentalists and nations that oppose the effort. —Eric Lipton/New York Times


The Marriage of Toxic Chemicals and Plastic
Author
Boma Brown-West - Environmental Defense Fund

Long before plastic waste litters our land and waterways, it has already caused serious environmental and human health impacts because of its oft-ignored linkage to toxic chemicals. We cannot craft a sustainable future for plastic until we address the toxic chemicals emitted during plastic manufacturing and used in plastic products and packaging.

The problem starts early in the production process. Among all refineries and petrochemical facilities in the United States, the top 10 most polluting are ones that make chemical feedstocks for the plastics industry. Industrial plants that manufacture polymers and chemical additives that go into plastics emit large volumes of hazardous air pollutants. The carcinogen ethylene oxide, the most hazardous of these pollutants, is used to make many products, including plastic water bottles, and is emitted into the atmosphere in large volumes during production.

The people who live on the fenceline of petrochemical facilities—often low-wealth communities and communities of color—bear the brunt of these emissions. In the Houston Ship Channel, for example, some residents live within five kilometers of more than 40 petrochemical plants. These neighborhoods suffer from disproportionately higher rates of asthma, other respiratory illnesses, and cancer.

The plastic products we use every day, from toys to shampoo bottles, contain toxic chemicals too. For example, phthalates, which are ubiquitous because they enhance the flexibility of many types of plastic, increase the risk of endocrine, reproductive, and developmental disorders. They have been detected in the bodies of over 95 percent of Americans, according to the Centers for Disease Control and Prevention’s National Health and Nutrition Examination Survey. Polybrominated diphenyl ethers were used for decades as flame retardants in many plastic-containing products like furniture and electronics. PBDEs are linked to health issues that include cancer and neurodevelopmental disorders.

Food is another source of exposure. Toxic chemicals in the packaging and handling equipment used at every stage of the food production system pose a risk to our health and the environment. For instance, phthalates migrate into dairy products not only from food packaging like plastic yogurt containers, but also from the tubing used when milking cows. According to EPA, contaminated food is responsible for more than 80 percent of our exposure to the endocrine disruptor perchlorate, in part because the chemical is used as an anti-static agent for dry-food plastic packaging in food facilities and stores. CDC studies indicate perchlorate is present in the body of every American.

Per- and polyfluoroalkyl substances, or PFAS, known as the “forever chemicals,” persist in the environment for decades and are associated with health issues such as cancer, immune disorders, and developmental disorders. Virtually all Americans have PFAS in their bodies. Though often associated with paper packaging, these chemicals are a major part of plastic packaging production, present in hundreds of millions of polyethylene and polypropylene containers, including those used for food.

When we recycle plastic conventionally, these toxic chemicals remain. Chemical recycling will also not solve the problem, as the process returns plastic materials to a virgin-like state—before the typical downstream conversion and manufacturing steps when chemical additives would be introduced.

So where do we go from here? Within their jurisdictions, EPA and the Food and Drug Administration need to move faster at re-evaluating the safety of existing chemicals in commerce. New data exists on several ubiquitous chemicals in plastic, demonstrating their serious environmental and chronic health effects. It’s time to follow the data and restrict or ban use of the most harmful chemicals. We also need our safety agencies to implement stronger mechanisms that ensure new chemicals don’t produce similar or worse impacts.

To reduce harm to fenceline communities, we need EPA to close dangerous loopholes in the Clean Air Act that allow polluting facilities to operate without proper emission controls. State agencies also need to enforce against violations with more rigor and conduct more thorough permitting reviews, particularly when new facilities or expansions are planned near communities that are already surrounded by industrial plants.

Companies along the entire plastics supply chain—including oil companies, chemical manufacturers, converters and producers, brand manufacturers, and retailers—must acknowledge that toxic chemicals are part of their environmental impact. To improve, they must reduce their production, use, and emissions of toxic chemicals. The first step is identifying where toxic chemicals exist across their value chain—in their final products, upstream and final packaging, manufacturing processes, and, importantly, their supply chain.

If plastic is to play any role in a healthy and sustainable future, we can no longer ignore the damage of toxic chemicals used in products and emitted during manufacturing processes. We must determine, once and for all, if and how plastic can be decoupled from toxic chemicals.