Gulf oil spill and Fukushima nuclear plant: A challenge to humans and technology
In one year, energy disasters in the Gulf and at Fukushima point to the challenge of human control over complex technology.
In just the past year, the world has had to contend with two major disasters in the energy sector. While the BP Gulf oil spill and Fukushima Daiichi nuclear plant crises differ in type, they both point to the challenge of human control over big, complex technologies that also deal with powerful forces in nature.
In the march of human progress, it's remarkable how few calamities on the planet can be chalked up to human error with technology. Planes fly safely all over the globe (except when, for instance, they're endangered by humans asleep in an air-traffic control tower). Natural disasters such as earthquakes and tsunamis can cause instant and massive loss of human life.
But the world is entering a phase where technology that is becoming ever more complex could endanger livelihoods on a large scale – global warming, for instance, or breaches of cybersecurity. How does humanity control such outcomes, instead of being controlled by them?
First, it's important to investigate when a disaster on the scale of the BP spill or Fukushima could have been avoided. Sometimes that means hitting the "pause" button, as President Obama did with his deep-water drilling moratorium and as China and Germany are doing with nuclear power. That allows time to find out what went wrong and how to fix the problem.
The fix might involve an engineering change, as it did with the cold-sensitive O-ring that caused the space shuttle Challenger to blow up during launch in 1986. In the case of BP's oil well, which started spewing crude on April 20, 2010, it turns out that the fail-safe "blowout preventer" had a design flaw. That needs to be corrected.
But the required change is often more than just technical. After all, humans create technology, and when their designs lack foresight, even for the supposedly unforeseeable, they pay a steep price.
The coastal Fukushima plant, for instance, was not designed to withstand a tsunami that would top the plant's sea walls. Even though Japan lies in a geological zone that produces Earth's largest earthquakes, no one imagined that the ocean would wipe out the backup generating power needed to cool the fuel rods. Such an assumption will have to change.
Because neither the BP blowout-preventer failure nor the tsunami's size was imagined, disaster recovery measures weren't either. It took months to shut down the BP well, and it may well take months to contain Fukushima's dangers.
It could be argued that some technology has gotten so complex that it's beyond control. It's hard to imagine rolling back to a simpler time before, say, the Internet, although a segment of the population actively strives to lead simpler lives.
In the energy sector, simpler might mean localized – more "neighborhood" sources of power or even individual sources, such as homes built with photovoltaic roofs or geothermal heating.
Ultimately, more complexity has to mean more careful oversight and transparency. As a result of the BP oil spill, the part of the federal government that oversees drilling was separated from the office that handed out licenses to drill. That reform should have been done long ago.
At its most basic, oversight is a matter of staying alert. That's admittedly hard to do when things seem to be running smoothly, whether that's in the run-up to the 2008 mortgage debacle or the few years before the BP disaster, when wells in deep waters produced 80 percent of the Gulf of Mexico's oil.
In this age, one tends to separate technology from human input. Workers on a rig aren't thinking about the engineering that goes into a blowout preventer. Japanese who turn on the lights aren't thinking about backup generators at the nuclear plant miles away.
Until technology fails. Then people realize that humans design technology, and that they must do a better job of imagining and planning for what can go wrong with their inventions.