Editor’s note: Stephanie Malin, an assistant professor of sociology at Colorado State University, wrote this piece for The Conversation in February 2018. Colorado State is a contributing institution to The Conversation, an independent collaboration between editors and academics that provides informed news analysis and commentary to the general public.
Uranium – the raw material for nuclear power and nuclear weapons – is having a moment in the spotlight.
Companies such as Energy Fuels, Inc. have played well-publicized roles in lobbying the Trump administration to reduce federal protection for public lands with uranium deposits. The Defense Department’s Nuclear Posture Review calls for new weapons production to expand the U.S. nuclear arsenal, which could spur new domestic uranium mining. And the Interior Department is advocating more domestic uranium production, along with other materials identified as “critical minerals.”
What would expanded uranium mining in the U.S. mean at the local level? I have studied the legacies of past uranium mining and milling in Western states for over a decade. My book examines dilemmas faced by uranium communities caught between harmful legacies of previous mining booms and the potential promise of new economic development.
These people and places are invisible to most Americans, but they helped make the United States an economic and military superpower. In my view, we owe it to them to learn from past mistakes and make more informed and sustainable decisions about possibly renewing uranium production than our nation made in the past.
Mining regulations have failed to protect public health
Today most of the uranium that powers U.S. nuclear reactors is imported. But many communities still suffer impacts of uranium mining and milling that occurred for decades to fuel the U.S.-Soviet nuclear arms race. These include environmental contamination, toxic spills, abandoned mines, and under-addressed clusters of cancers and other illnesses that residents link to uranium exposure, despite federal denials.
As World War II phased into the Cold War, U.S. officials rapidly increased uranium production from the 1940s to the 1960s. Regulations were minimal to nonexistent and largely unenforced, even though the U.S. Public Health Service knew that exposure to uranium had caused potentially fatal health effects in Europe, and was monitoring uranium miners and millers for health problems.
Today the industry is subject to regulations that address worker health and safety, environmental protection, treatment of contaminated sites and other considerations. But these regulations lack uniformity, and enforcement responsibilities are spread across multiple agencies.
This creates significant regulatory gaps, which are worsened by a federalist approach to regulation. In the 1970s the newly created Nuclear Regulatory Commission initiated an Agreement States program, under which states take over regulating many aspects of uranium and nuclear production and waste storage. To qualify, state programs must be “adequate to protect public health and safety and compatible with the NRC’s regulatory program.”
Today 37 states have joined this program and two more are applying. Many Agreement States struggle to enforce regulations because of inadequate budgets, staff shortages and anti-regulatory cultures. These problems can lead to piecemeal enforcement and reliance on corporate self-regulation.
For example, budget cuts in Colorado have forced the state to rely frequently on energy companies to monitor their own compliance with regulations. In Utah, the White Mesa Mill – our nation’s only currently operating uranium mill – has a record of persistent problems related to permitting, water contamination and environmental health, as well as tribal sacred lands and artifacts.
Neglected nuclear legacies
Uranium still affects the environment and human health in the West, but its impacts remain woefully under-addressed. Some of the poorest, most isolated and ethnically marginalized communities in the nation are bearing the brunt of these legacies.
There are approximately 4,000 abandoned uranium mines in Western states. At least 500 are located on land controlled by the Navajo Nation. Diné (Navajo) people have suffered some of the worst consequences of U.S. uranium production, including cancer clusters and water contamination.
A 2015 study found that about 85 percent of Diné homes are still contaminated with uranium, and that tribe members living near uranium mines have more uranium in their bones than 95 percent of the U.S. population. Unsurprisingly, President Donald Trump’s decision to reduce the Bears Ears National Monument has reinvigorated discussion over ongoing impacts of uranium contamination across tribal and public land.
Despite legislation such as the Radiation Exposure Compensation Act of 1990, people who lived near uranium production or contamination sites often became forgotten casualties of the Cold War. For instance, Monticello, Utah, hosted a federally owned uranium mill from 1942 to 1960. Portions of the town were even built from tailings left over from uranium milling, which we now know were radioactive. This created two Superfund sites that were not fully remediated until the late 1990s.
Monticello residents have dealt with cancer clusters, increased rates of birth defects and other health abnormalities for decades. Although the community has sought federal recognition and compensation since 1993, its requests have been largely ignored.
Today tensions over water access and its use for uranium mining are creating conflict between regional tribes and corporate water users around the North Rim of the Grand Canyon. Native residents, such as the Havasupai, have had to defend their water rights and fear losing access to this vital resource.
Uranium production is a boom-and-bust industry
Like any economic activity based on commodities, uranium production is volatile and unstable. The industry has a history of boom-bust cycles. Communities that depend on it can be whipsawed by rapid growth followed by destabilizing population losses.
The first U.S. uranium boom occurred during the early Cold War and ended in the 1960s when oversupply triggered a bust. A second boom began later in the decade when the federal government authorized private commercial investment in nuclear power. But the Three Mile Island (1979) and Chernobyl (1986) disasters ended this second boom.
Uranium prices soared once again from 2007 to 2010. But the 2011 tsunami and meltdown at Japan’s Fukushima Dai-ichi nuclear plant sent prices plummeting once again as nations looked for alternatives to nuclear power.
Companies like Energy Fuels maintain – especially in public meetings with uranium communities – that new production will lead to sustained economic growth. This message is powerful stuff. It boosts support, sometimes in the very communities that have suffered most from past practices.
But I have interviewed Westerners who worry that as production methods become more technologically advanced and mechanized, energy companies may increasingly rely on bringing in out-of-town workers with technical and engineering degrees rather than hiring locals – as has happened in the coal industry. And the core tensions of boom-bust economic volatility and instability persist.
Uranium production advocates contend that new “environmentally friendly” mills and current federal regulations will adequately protect public health and the environment. Yet they offer little evidence to counter White Mesa Mill’s poor record.
In my view, there is little evidence that new uranium production would be more reliably regulated or economically stable today than in the past. Instead, I expect that the industry will continue to privatize profits as the public absorbs and subsidizes its risks.