We know that our computers use energy, and if we think about it, we recognize that all those servers that carry our e-mails to South America or Japan require plenty of power, too. But what we probably haven't envisioned is how much energy it takes to power the data centers that keep the Internet running: an estimated 152 billion kilowatt-hours yearly, according to an article in New Scientist magazine by Duncan Graham-Rowe.
That translates into approximately 2 percent of human-caused CO2 emissions, roughly the same amount produced by the aviation industry. And the figure is likely to escalate in coming years as Internet traffic grows; one estimate puts the increase in emissions at 280 percent by 2020.
According to a study conducted by Rich Brown, an energy analyst at the Lawrence Berkeley National Laboratory in California, and commissioned by the US Environmental Protection Agency, "US data centers used 61 billion kilowatt hours in 2006, which is 1.5% of the entire electricity usage in the US," reports redOrbit.
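As a quick sanity check on those figures, the two numbers in the quote imply a total for US electricity consumption. A minimal back-of-envelope calculation (using only the 61 billion kWh and 1.5% figures quoted above):

```python
# Figures quoted from the Brown/EPA study via redOrbit
data_center_kwh = 61e9        # US data-center electricity use in 2006, in kWh
share_of_us_total = 0.015     # stated as 1.5% of all US electricity usage

# Implied total US electricity consumption in 2006
us_total_kwh = data_center_kwh / share_of_us_total
print(f"Implied US total: {us_total_kwh / 1e12:.2f} trillion kWh")
```

The implied total, about 4.07 trillion kWh, is in line with reported US electricity consumption for 2006, so the two quoted figures are internally consistent.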
This has led to calls for restraint, under headlines such as "Web providers must limit Internet's carbon footprint, say experts." "In an energy-constrained world, we cannot continue to grow the footprint of the internet … we need to rein in the energy consumption," said Subodh Bapat, vice-president at Sun Microsystems.