"For a long time we've been very fortunate," says John Maltabes, a visiting scholar at Hewlett-Packard with 30 years of experience in chip manufacturing. "What's happening now is economics are catching up."
Moore's law is slowing down, and it affects every computer, from the smallest to the largest. Scientists have relied, for example, on supercomputers to study particle physics and the brain – not to mention ensuring the safety of aging nuclear weapons. But grander scientific questions bring bigger price tags. Tianhe-1A, the world's fastest supercomputer, at China's National Supercomputer Center in Tianjin, wolfs down 4 million watts of electricity. The next-generation machine planned by the US Department of Energy will guzzle 25 million watts (the original design called for 130 million watts, about $130 million of electricity per year, before it was scaled back to keep the machine's utility bills affordable).
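As a rough sanity check on that figure, a back-of-envelope calculation lands in the right neighborhood. The electricity rate below is an assumption (roughly 11 cents per kilowatt-hour, a plausible US rate); the article does not state one.

```python
# Back-of-envelope check: what does a 130-million-watt machine cost to
# power for a year? The utility rate is an assumed value, not from the text.

HOURS_PER_YEAR = 24 * 365        # 8,760 hours
POWER_WATTS = 130e6              # original blueprint: 130 million watts
RATE_PER_KWH = 0.11              # assumed electricity rate, dollars per kWh

# Convert watt-hours to kilowatt-hours, then multiply by the rate.
energy_kwh = POWER_WATTS * HOURS_PER_YEAR / 1000
annual_cost = energy_kwh * RATE_PER_KWH

print(f"about {annual_cost / 1e6:.0f} million dollars per year")
# prints "about 125 million dollars per year"
```

At an assumed 11 cents per kilowatt-hour the bill comes to roughly $125 million a year, consistent with the article's round figure of $130 million.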
"Computers are pretty much energy-limited now," concludes Furber.
The almost-intelligent software that allows Watson to win at "Jeopardy!" remains an impressive feat, and it will be used where the cost can be justified – like sifting through reams of financial information to suggest stock purchases at investment banks; searching stacks of legal documents to find the pearl of evidence that will win a court case; or tweaking the coming and going of planes, trains, and buses so that travelers don't miss connections. But before something as smart as Watson comes to ordinary people's laptops and smartphones, engineers must build more-efficient computers that circumvent current energy limits. Whichever technology succeeds will push the world to a different place.