The precautionary principle ("better safe than sorry") is a maxim embraced by government planners and regulators the world over, from GMOs to pharmaceuticals to the environment. The argument is that it's better to act on fears preemptively than it is to "do nothing" and wait until there's a problem.
But often, overreaction can be more costly than the original problem. Precaution becomes panic, and moderating risk devolves into the blind urge to "just do something."
An article by George Johnson in the New York Times explains the deadly consequences of the Japanese government's panicky stampede after the 2011 Fukushima nuclear accident.
No one has been killed or sickened by the radiation — a point confirmed last month by the International Atomic Energy Agency. Even among Fukushima workers, the number of additional cancer cases in coming years is expected to be so low as to be undetectable, a blip impossible to discern against the statistical background noise.
But about 1,600 people died from the stress of the evacuation — one that some scientists believe was not justified by the relatively moderate radiation levels at the Japanese nuclear plant. ...
“The government basically panicked,” said Dr. Mohan Doss, a medical physicist who spoke at the Tokyo meeting, when I called him at his office at Fox Chase Cancer Center in Philadelphia. “When you evacuate a hospital intensive care unit, you cannot take patients to a high school and expect them to survive.”
Among other victims were residents of nursing homes. And there were the suicides. “It was the fear of radiation that ended up killing people,” he said.
Doss estimates that in the hot spots, with the highest levels of radioactivity, residents would have gotten 70 millisieverts of radiation over four years (a dose equal to one full body scan a year).
But those hot spots were anomalies.
By Dr. Doss’s calculations, most residents would have received much less, about 4 millisieverts a year. The average annual exposure from the natural background radiation of the earth is 2.4 millisieverts. ...
A full sievert of radiation is believed to eventually cause fatal cancers in about 5 percent of the people exposed. Under the linear no-threshold model, a millisievert would impose one one-thousandth of the risk: 0.005 percent, or five deadly cancers in a population of 100,000.
About twice that many people were evacuated from a 20-kilometer area near the Fukushima reactors. By avoiding what would have been an average cumulative exposure of 16 millisieverts, the number of cancer deaths prevented was perhaps 160, or 10 percent of the total who died in the evacuation itself.
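The arithmetic behind those figures can be checked in a few lines. This is a back-of-envelope sketch using only the numbers quoted above (the 5-percent-per-sievert risk factor is the standard linear no-threshold assumption cited in the article):

```python
# Back-of-envelope check of the linear no-threshold (LNT) arithmetic above.
# All figures come from the quoted article.

fatal_risk_per_sievert = 0.05   # ~5% lifetime fatal-cancer risk per full sievert
avoided_dose_msv = 16           # average cumulative exposure avoided (mSv)
evacuees = 200_000              # "about twice" the 100,000 in the example
evacuation_deaths = 1_600       # deaths attributed to the evacuation itself

# Under LNT, risk scales linearly with dose: convert mSv to Sv and multiply.
risk_per_person = fatal_risk_per_sievert * (avoided_dose_msv / 1000)
cancers_prevented = risk_per_person * evacuees

print(cancers_prevented)                       # 160.0
print(cancers_prevented / evacuation_deaths)   # 0.1, i.e. 10 percent
```

Even taking LNT at face value, the evacuation prevented roughly a tenth as many deaths as it caused.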
That would be bad enough, but it's not clear that the "linear model" of radiation exposure is even accurate. The linear no-threshold model assumes that every increment of radiation is equally harmful and that there is no safe level of exposure, so your risk is in exact proportion to your dose.
But Doss and others think low levels of radiation are not proportionally as bad for you as high levels, meaning that half the radiation is less than half as dangerous. In other words, a millisievert is not a thousandth as deadly as a full sievert — it's much less than that, and it might not be bad for you at all.
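To make the contrast concrete, here is a toy comparison of the two views. The sublinear curve below is a quadratic chosen purely for illustration (it is not Dr. Doss's actual model); both curves are anchored to the same 5 percent risk at a full sievert:

```python
# Illustrative only: contrast LNT's proportional risk with a hypothetical
# sublinear (quadratic) dose-response. The quadratic form is an assumption
# for illustration, not any specific model from the article.

def risk_lnt(dose_sv):
    """Linear no-threshold: risk strictly proportional to dose."""
    return 0.05 * dose_sv

def risk_sublinear(dose_sv):
    """Hypothetical: risk grows with the square of dose, so low doses
    carry disproportionately little risk (same 5% anchor at 1 Sv)."""
    return 0.05 * dose_sv ** 2

# Half the dose: LNT gives half the risk; the sublinear curve gives a quarter.
print(risk_lnt(0.5), risk_sublinear(0.5))        # 0.025 0.0125

# One millisievert: LNT gives a thousandth of the full-sievert risk;
# the sublinear curve gives a millionth.
print(risk_lnt(0.001), risk_sublinear(0.001))
```

Under any curve of this shape, "half the radiation" is less than half as dangerous, and a millisievert is far less than a thousandth as deadly as a sievert.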
"Better safe than sorry" sounds reasonable, but it can't answer the question better for whom? When we're talking about government policy, we have to remember that politicians and officials are not robots, blindly calculating the public good: they are people, with their own interests and incentives.
When there is a crisis (real or imagined), officials need to appear to do something. To keep us safe. To protect us from scary things, like radiation, toxins, and terrorists. That incentive is not identical to (and often not even compatible with) a rational cost-benefit analysis.
The evacuation of Fukushima was better for the officials in charge — they were doing something — but it wasn't safer for the people who died after being forcibly displaced.
There's no perfect solution here. But it would have been much better to give people the freedom to move at their own pace and to make informed choices about the risks, instead of a rushed, terrifying evacuation from an inflated threat.
Johnson concludes, "We’re bad at balancing risks, we humans, and we live in a world of continual uncertainty. Trying to avoid the horrors we imagine, we risk creating ones that are real."
I might add that when we mix them with politics, we risk inflicting them on everyone.