The knack for corralling electrons into an orderly dance of information has improved dramatically since the transistor was invented in 1947. The number of transistors on a chip has doubled roughly every two years – a trend known as Moore's law. Those shrinking transistors cost less to manufacture and consumed less energy, making them cheaper to use. Today's transistors are about 150 times smaller than a red blood cell, and each calculation consumes roughly 50 billion times less energy than it did on the vacuum tubes of World War II-era computers.
Shrinking energy dissipation made it possible to use electronics in ever-expanding ways: hearing aids that wouldn't cook Grandma's ear; radios that could run for weeks on batteries; and the 1971 Busicom desktop calculator that let computers compete cost-effectively with paper and pencil.
"All of this consumer electronics has only become possible because we've made these spectacular strides in energy efficiency," says Stephen Furber, a chip designer at the University of Manchester in England. Professor Furber's own contribution, the ARM processor chip that he helped design at Acorn Computers in the 1980s, reduced power consumption by a factor of 10, allowing millions of people to carry cellphones that don't incinerate their pocket lint or require hourly charging. In this day of on-the-run tweets, status updates, and cellphone photos, it could be argued that these phones and their low-power chips are responsible for making Facebook and Twitter the forces that they are today in pop culture, politics, and Arab revolutions.