
Why the age of quantum computing is nearer than you think

New research out of the Max Planck Institute of Quantum Optics is one of the best examples of quantum computing beginning to flirt with practical technology.

Dell recently introduced three new XPS high-performance systems, including this XPS 700. Through no fault of Dell's, none of these models would compare in the slightest to even the most rudimentary quantum computer.



Tech buffs, investors, IT industrialists, and boffins alike eagerly await the day when the science of quantum computing yields practical technology. Physicists at the Max Planck Institute of Quantum Optics (MPQ) recently published research that, they believe, has brought that pivotal day closer.

For many years, physicists have sought to create an information network far superior to today's by exploiting quantum phenomena. The team of German researchers has constructed the first vital component of such a network: a link between two atomic nodes over which information can be sent, received, and stored using a single photon. Successful exchanges of information recently took place in Garching, Germany, between two MPQ labs connected by a 60-meter fiber-optic cable. Though only a prototype, this rudimentary network could be scaled up into larger and more complex quantum networks. The team reports its research in Nature.

The idea of quantum computing was introduced by the physicist Richard Feynman in 1982. The essential unit of classical computing, the bit, is binary. Like a light switch, it's either on or off, 1 or 0. The quantum bit, by contrast, can be 1, 0, or a mix of both states – this last state being like a flipped coin that's still spinning in the air.
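The coin analogy can be made concrete with a little arithmetic. A common way to describe a qubit (an assumption here, not something from the article) is as a pair of amplitudes for the 0 and 1 states whose squared magnitudes give the odds of each measurement outcome:

```python
import math

# A qubit's state can be written as two amplitudes (a, b) for the basis
# states 0 and 1, with a^2 + b^2 = 1. Measuring the qubit yields 0 with
# probability a^2 and 1 with probability b^2.
zero = (1.0, 0.0)     # definitely 0 -- the light switch is off
one = (0.0, 1.0)      # definitely 1 -- the light switch is on
spinning = (1 / math.sqrt(2), 1 / math.sqrt(2))  # the coin still in the air

def measure_probabilities(state):
    """Return the probabilities of reading 0 and 1 from a qubit state."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

print(measure_probabilities(spinning))  # roughly (0.5, 0.5): a 50/50 coin
```

The "spinning coin" state is an equal mix: until it is measured, each outcome is equally likely, which is exactly the third possibility a classical bit lacks.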

The usefulness of this extra dimension seems, at first pass, more confusing than anything else, but it actually creates a new opportunity to represent data. Whereas a trillion classical bytes can hold about 2^43 discrete on/off values, a mere 200 quantum bits, or qubits, could represent at the very least 2^200 discrete values. This new capacity would allow future computers to do involved calculations at nearly unthinkable speeds, and solve problems that are currently unsolvable. The technological implications are too many to list, which suggests why there's such excitement surrounding the field. [Editor's note: An earlier version got bits mixed up with bytes.]
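The scale of those numbers is easy to check. Each additional qubit doubles the number of basis states a register can span, so the count grows exponentially; the sketch below (illustrative only, with a hypothetical `state_space` helper) shows how quickly 2^n outruns any classical memory:

```python
# Each additional qubit doubles the number of basis states a quantum
# register can hold in superposition, so n qubits span 2**n values.
def state_space(n_qubits):
    """Number of distinct basis states n qubits can represent at once."""
    return 2 ** n_qubits

# A trillion classical bytes is roughly 2**43 individual on/off bits,
# while 200 qubits span 2**200 states -- about 1.6 * 10**60.
print(state_space(200))
print(state_space(200) > 10 ** 60)  # prints True
```

The point of the comparison: adding one more bit to a classical register adds one more on/off value, but adding one more qubit doubles the whole state space.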


