Chernobyl's 20,000-30,000 Fatalities

Where does The New York Times get the idea the human impact was minimal?


Since at least the mid-1990s, the standard estimate of the long-term human impact of the Chernobyl catastrophe has been that there would be 20,000 to 30,000 premature deaths from leukemia and other cancers, almost entirely in the greater European region. So how did The New York Times's editorial writers get the idea that Chernobyl's impact was minimal?

Writing earlier this week, they said: "The latest evaluation — a United Nations committee in 2008 — concludes that emergency workers who struggled to bring the plant under control suffered great harm but the wider public was barely affected. In the three countries hit with the most fallout — Belarus, Ukraine and parts of the Russian Federation — the committee found that the only significant harm was several thousand cases of highly curable thyroid cancer among people who were exposed as children, mostly by drinking contaminated milk. Only a handful have died."

I don't know which UN evaluation the Times is referring to, but it seems obvious that its writers have misinterpreted something. The explanation almost certainly has to do with a dilemma discussed here in Spectrum several weeks ago, on the 25th anniversary of the Chernobyl catastrophe. Not only are the premature fatalities from Chernobyl undetectable individually, so that there is no way of knowing which particular people died as a result of the accident; the excess deaths are undetectable even as a group, because they are buried in the statistical "noise" associated with the hundreds of millions of cancer deaths expected in the relevant population over the relevant time period.
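To put rough numbers on that dilemma, here is a back-of-the-envelope sketch (in Python) of why an excess of roughly 25,000 deaths vanishes into the background. The specific figures are illustrative assumptions of mine, not numbers taken from any UN evaluation.

```python
import math

# Illustrative assumptions, not figures from any UN evaluation:
excess_deaths = 25_000        # midpoint of the 20,000-30,000 estimate cited above
background_deaths = 300e6     # "hundreds of millions" of cancer deaths in the exposed
                              # population over the relevant decades (assumed: 300 million)

# Even under idealized Poisson counting statistics, the random fluctuation
# in the background count is on the order of its square root.
poisson_noise = math.sqrt(background_deaths)

print(f"predicted excess deaths : {excess_deaths:>12,.0f}")
print(f"background cancer deaths: {background_deaths:>12,.0f}")
print(f"Poisson noise alone     : {poisson_noise:>12,.0f}")
print(f"excess / noise          : {excess_deaths / poisson_noise:>12.1f} sigma")

# The ratio comes out around 1.4 sigma, below any conventional detection
# threshold, and real epidemiological uncertainty (smoking, diet, screening
# practices, reporting differences) is far larger than pure counting noise.
```

The point of the sketch is only that an effect of this size could never be teased out of cancer statistics, no matter how real it is.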

So the problem, in a nutshell, is that not seeing is not believing. Because the deaths can't be identified or even measured, the Times's editorialists are treating them as if they're not occurring. They really ought to know better. At a time when countries are routinely led by Harvard or Yale law review editors (like Obama and the Clintons), by chemists or physicists (like Thatcher and Merkel), or by the long string of "enarchs" who almost always run France,* we can safely assume that policy is made with an awareness of basic statistics. Increasingly, for better or worse, the top journalists are educated at the same schools. So they ought to appreciate that if dose-response models predict a certain level of fatalities, those fatalities must be assumed to be occurring, even if they cannot be seen.
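For what it's worth, the arithmetic behind such dose-response estimates is simple enough to sketch. The lines below assume a linear no-threshold model with round numbers of my own choosing (a Europe-wide collective dose of about 500,000 person-sieverts and a nominal fatal-cancer risk of about 5 percent per sievert); they illustrate the method, not any particular study's result.

```python
# Linear no-threshold (LNT) back-of-the-envelope, with assumed inputs:
collective_dose_person_sv = 500_000   # assumed Europe-wide collective dose, person-sieverts
fatal_cancer_risk_per_sv = 0.05       # nominal risk coefficient, ~5% per sievert (ICRP-style)

predicted_premature_deaths = collective_dose_person_sv * fatal_cancer_risk_per_sv
print(f"predicted premature deaths: {predicted_premature_deaths:,.0f}")  # -> 25,000

# Under LNT the prediction holds even though no individual death can ever be
# attributed to the accident, which is precisely the point made above.
```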

To be sure, like the fatalities and illnesses known to result each year from exposure to coal emissions, the death and morbidity from nuclear accidents are not easy to factor into policymaking. Nobody wants to weigh deaths from nuclear accidents against deaths from pollution and make choices on the basis of which technology causes the most premature fatalities. In the final analysis, as in air traffic regulation, policy will inevitably aim to eliminate deaths entirely. That has been the thrust of the EPA's coal regulation in recent decades, and it's bound to be the thrust of nuclear regulation as well.

In the long run, if nuclear energy is to remain viable, there can be no more Chernobyls or Fukushimas.

___________________________________________________

* The enarchs (French: énarques) are graduates of France's ultra-prestigious École Nationale d'Administration (ENA).


Smokey the AI

Smart image analysis algorithms, fed by cameras carried by drones and ground vehicles, can help power companies prevent forest fires


The 2021 Dixie Fire in northern California is suspected of being caused by Pacific Gas & Electric's equipment. The fire is the second-largest in California history.

Robyn Beck/AFP/Getty Images

The 2020 fire season in the United States was the worst in at least 70 years, with some 4 million hectares burned on the West Coast alone. These West Coast fires killed at least 37 people, destroyed hundreds of structures, caused nearly US $20 billion in damage, and filled the air with smoke that threatened the health of millions of people. And this was on top of a 2018 fire season that burned more than 700,000 hectares of land in California, and a 2019-to-2020 wildfire season in Australia that torched nearly 18 million hectares.

While some of these fires started from human carelessness—or arson—far too many were sparked and spread by the electrical power infrastructure and power lines. The California Department of Forestry and Fire Protection (Cal Fire) calculates that nearly 100,000 burned hectares of those 2018 California fires were the fault of the electric power infrastructure, including the devastating Camp Fire, which wiped out most of the town of Paradise. And in July of this year, Pacific Gas & Electric indicated that blown fuses on one of its utility poles may have sparked the Dixie Fire, which burned nearly 400,000 hectares.

Until these recent disasters, most people, even those living in vulnerable areas, didn't give much thought to the fire risk from the electrical infrastructure. Power companies trim trees and inspect lines on a regular—if not particularly frequent—basis.

However, the frequency of these inspections has changed little over the years, even as climate change brings the drier and hotter weather conditions that lead to more intense wildfires. In addition, many key electrical components are past their design lifetimes, including insulators, transformers, arrestors, and splices that are more than 40 years old. Many transmission towers, most built for a 40-year lifespan, are entering the final decade of that span.
