For every kilowatt-hour of power that Internet-linked computers use, they save at least 10 times that amount, a recent study finds.
The rate at which the United States is becoming more energy-efficient has soared since 1995, when the computer-based Internet and communications revolution began soaking into US society.
That conclusion – from a groundbreaking study by the American Council for an Energy-Efficient Economy (ACEEE) last week – stands in sharp contrast to recent concerns that the computer backbone of the Internet was gobbling up huge amounts of energy.
Indeed, all America's servers – the computers that direct traffic on the Internet – and the systems that cool them use about 1.2 percent of the nation's electricity, according to a study last year. That's still a lot of power, comparable to the energy used by color TVs in the US.
But it turns out that for every kilowatt-hour of electricity used by information and communications technologies, the US saves at least 10 times that amount, the new ACEEE report found.
"Acceleration of information and computer technology across the US landscape post 1995 is driving much of the nation's energy-productivity gain," says John Laitner of the ACEEE, a coauthor of the study. "Had we continued at the historic rate of prior years, we would today be using the energy equivalent of 1 billion barrels of oil more [per year] than we were" in the early 1990s.
After the oil embargoes of the 1970s, America quickly became more efficient: its "energy intensity" – the amount of energy required to produce a dollar of economic output – fell sharply. But those efficiency gains slowed to less than 1 percent per year between 1986 and 1996.