The timeline of decisions by operators of the Fukushima Dai-ichi nuclear power plant reveals the inherent weakness of the most critical software in these crises — the human brain. As with decision-making during the Chernobyl and Three Mile Island nuclear plant crises, bad assumptions and fear of taking irreversible steps may have hampered officials at the Japanese plant.
The plant was not designed to survive the 40-foot wave that hit it, but a review of what followed shows that workers also made some ill-fated choices.
National Public Radio reports that when the backup generators failed and pumps no longer could move water around the core to keep it cool, workers ran to a nearby parking lot littered with cars wrecked by the wave and began removing batteries. The batteries were wired to plant instruments and to pumps, but it was not enough.
That response showed creativity and decisiveness, but the workers may have been hampered by cultural forces. Japanese decision-making generally relies on group, rather than individual, conclusions. The country’s initial investigation revealed disagreement and confusion over who was in charge. A more hierarchical command structure might have been better suited to managing the crisis.
At one point, engineers could have flooded the reactors with seawater to keep them cool. They knew that doing so would ruin the reactors, and so hesitated, probably for too long, before ultimately doing so.
“It’s quite likely that if the injection of seawater had been initiated earlier, the damage of fuel could have been limited greatly or even prevented,” Per Peterson, chairman of nuclear engineering at the University of California Berkeley, told NPR.
Marvin Fertel, president of the Nuclear Energy Institute, said the Japanese did not prepare staff adequately for disasters. The typical day for operators, he said, is marked by boredom: repetitive tasks, monitoring gauges and the like. What the Japanese needed, Mr. Fertel told NPR, were simulators with which staff could face disasterlike scenarios.
Similarly, the 1986 disaster at the Chernobyl nuclear power plant, the worst ever, was blamed largely on a poorly built plant, but also on poorly trained staff.
The Three Mile Island plant in Pennsylvania, which experienced a partial meltdown in 1979, also was ripe for analysis of the human element. For reasons never determined, pumps stopped working, which in turn prevented the steam generators from removing heat. As pressure began to build, a relief valve opened.
“The valve should have closed when the pressure decreased,” according to a Nuclear Regulatory Commission summary, “but it did not.” Workers mistakenly believed it had closed, because a pressure gauge near the valve showed water was present, but in fact the gauge was measuring water leaving the core. Staff “took a series of actions that made conditions worse,” due to confusing information generated by instruments, the NRC concluded.
It wasn’t until the next shift of workers arrived and suggested that the valve could be stuck open that the problem was addressed.
Whether nuclear power is an option to address our future energy needs remains an open question. But it is clear that removing the human element is impossible. And because thinking is critical, training that prepares operators for the worst is essential.