As computer chips get faster, normal materials may not keep up – physics won’t allow it.
The heart of every computer made today is an integrated circuit (or “chip”) largely made of silicon. This common element, which makes up about a quarter of the earth’s crust, can be found in such mundane items as beach sand and window glass. But in computer chips, silicon has had its brightest hour, powering a technological revolution that changed the world as much as the steam engine or the assembly line.
Using silicon, engineers have been able to pack more punch onto the same-size chip, doubling the number of components on a given piece of silicon roughly every two years.
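That doubling trend, widely known as Moore’s law, compounds quickly. A rough back-of-the-envelope sketch in Python (the starting count and time span below are arbitrary, chosen only for illustration):

```python
# Illustration of the doubling described above: component count
# doubles roughly every two years. Figures here are hypothetical.

def transistor_count(start_count, years, doubling_period=2):
    """Project a component count forward, doubling every `doubling_period` years."""
    return start_count * 2 ** (years // doubling_period)

# Starting from a hypothetical 1 million transistors, 20 years of
# doubling yields 2**10 = 1,024 times the original count.
print(transistor_count(1_000_000, 20))  # 1024000000
```

Ten doublings in twenty years turn one million components into more than a billion, which is why the trend reshaped the industry so fast.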
But soon the industry will hit a wall, scientists say. Silicon chips can only be stretched so thin. And as the individual components on a chip get smaller, engineers are reaching the bounds of what’s physically possible. Could silicon’s reign in the computer industry be drawing to a close?
“The real magic of integrated circuit technology has been that we can increase the density while reducing the cost,” explains Craig Sander, corporate vice president of technology development for Advanced Micro Devices (AMD), a chipmaker in Sunnyvale, Calif.
Chipmakers have pulled this off by figuring out new ways to cram more and smaller transistors onto a single chip. Transistors are basically tiny electrical switches and are the reason that computers use binary code – the 1s mean “on” and 0s mean “off.”
“By setting them up in different arrangements, engineers create a circuit that can store a value (for example, inside a memory chip) or perform a calculation (that could be used in a microprocessor),” says Mr. Sander. “The result is you get more for less, because we can so efficiently increase the density of transistors on a chip.”
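The idea in Sander’s quote, that switches arranged in different patterns can compute, can be sketched with boolean values standing in for transistors. The half adder below is a standard textbook circuit, not anything specific to AMD’s designs:

```python
# Model transistors as on/off switches: True = 1 = "on", False = 0 = "off".
# Arranged one way, switches perform a calculation; a half adder adds two bits.

def half_adder(a: bool, b: bool) -> tuple:
    """Return (sum_bit, carry_bit) for two input bits."""
    sum_bit = a != b   # XOR: on when exactly one input is on
    carry = a and b    # AND: on only when both inputs are on
    return sum_bit, carry

# 1 + 1 = binary 10: the sum bit is off, the carry bit is on.
print(half_adder(True, True))  # (False, True)
```

Chain enough of these simple switch arrangements together and you get the adders, memory cells, and processors Sander describes.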