Four New Ways to Chill Computer Chips

Things are getting a bit too hot in the microprocessor world. Again.

Moore’s Law has always come with the caveat that more transistors, switched at higher frequencies, mean more heat. Over the years, chipmakers have used tricks like throttling back clock speeds and putting multiple processor cores on a chip to spread out the heat.

But heat continues to stifle chip performance. Hot spots on today’s processors can reach power densities of 1 kilowatt per square centimeter, much higher than the heat flux inside a rocket nozzle. A growing fraction of the transistors on advanced microprocessors must sit idle at any one time because operating them would generate too much heat, says Avram Bar-Cohen, a program manager in DARPA’s Microsystems Technology Office. “As we put more and more transistors on them, this ‘dark silicon’ fraction has gone from 10 to 20 percent, in some cases more,” he says.
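
To see how such a figure arises, the arithmetic is simple division of power by area; the hot-spot wattage and area in the Python sketch below are illustrative assumptions, not measurements reported here.

```python
# Back-of-the-envelope hot-spot power density.
# The numbers are illustrative assumptions, not data from the article.
hot_spot_power_w = 100.0   # assumed power dissipated in one hot spot, in watts
hot_spot_area_cm2 = 0.1    # assumed hot-spot area, in square centimeters

power_density = hot_spot_power_w / hot_spot_area_cm2
print(f"Hot-spot power density: {power_density:.0f} W/cm^2")  # -> 1000 W/cm^2, i.e. 1 kW/cm^2
```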

There’s only so much that processor designers can do to keep chips from generating too much heat, and it’s time for some new ways to get that heat out.

The conventional approach to dissipating the heat is to attach a finned metal heat sink to the chip and blow air across it.

Spy Agency Bets on IBM for Universal Quantum Computing

A real-life U.S. version of “Q Branch” from the James Bond films has greater ambitions than creating personal spy gadgets such as exploding watches or weaponized Aston Martins. It’s betting on an IBM team to develop the first logical qubits as crucial building blocks for universal quantum computers capable of outperforming today’s classical computers.

Most quantum computing efforts have focused on building ever-larger arrays of quantum bits, called qubits, made from physical components such as superconducting loops of metal or charged atoms trapped within magnetic fields. Qubits can harness the weird power of quantum physics to exist in two states simultaneously and influence distant qubits through quantum entanglement, but the challenge comes from maintaining fragile quantum states long enough to perform computer calculations. As a next step, the U.S. Intelligence Advanced Research Projects Activity (IARPA) has given IBM a five-year research grant to assemble physical qubits into a single logical qubit that lasts long enough to perform complex computer operations.
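
To illustrate the encoding idea in the simplest terms, the Python sketch below simulates a classical three-bit repetition code with majority-vote decoding. It is only an analogy under assumed error rates; IBM’s actual logical qubits rely on quantum error-correcting codes whose details the excerpt does not describe.

```python
import random

# Toy analogy for a logical qubit: a classical 3-bit repetition code.
# Redundant encoding lets the encoded bit survive longer than any single
# physical bit, which is the same intuition behind logical qubits.

def encode(logical_bit):
    """Encode one logical bit into three physical bits."""
    return [logical_bit] * 3

def apply_noise(physical_bits, flip_prob):
    """Independently flip each physical bit with probability flip_prob."""
    return [b ^ 1 if random.random() < flip_prob else b for b in physical_bits]

def decode(physical_bits):
    """Majority vote recovers the logical bit if at most one bit flipped."""
    return 1 if sum(physical_bits) >= 2 else 0

flip_prob = 0.05     # assumed physical error rate
trials = 100_000
logical_errors = sum(
    decode(apply_noise(encode(0), flip_prob)) != 0 for _ in range(trials)
)
print(f"physical error rate: {flip_prob}")
print(f"logical error rate:  {logical_errors / trials:.4f}")  # roughly 3*p^2, about 0.007
```

With an assumed 5 percent physical error rate, the majority vote pushes the encoded error rate down to roughly 0.7 percent; making an encoded unit outlive its physical parts is exactly what a logical qubit is meant to do.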

“The idea is that the encoded logical qubit would last longer than individual physical qubits, so it could be part of a computation in a larger universal quantum computer,” says Jerry Chow, manager of experimental quantum computing at IBM’s Thomas J. Watson Research Center.

Australians Invent Architecture for a Full-Scale Silicon Quantum Computer

It’s looking more and more like future super powerful quantum computers will be made of the same stuff as today’s classical computers: silicon. A new study lays out the architecture for how silicon quantum computers could scale up in size and enable error correction—crucial steps toward making practical quantum computing a reality.

All quantum computing efforts rely on “spooky” quantum physics that allows the spin of an electron or an atom’s nucleus to exist in more than one state at the same time. That means each quantum bit (qubit) can represent information as both a 1 and a 0 simultaneously. (A classical bit can exist only as a 1 or a 0, never a mix of both.) Previously, Australian scientists demonstrated single qubits based on both the spin of electrons and the nuclear spin of phosphorus atoms embedded in silicon. Their latest work establishes a quantum computing architecture that paves the way for building and controlling arrays of hundreds or thousands of qubits.
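
A minimal sketch of what “both a 1 and a 0 simultaneously” means is shown below, representing a single spin qubit as a two-component state vector in NumPy; the amplitudes and spin labels are illustrative and are not tied to the Australian group’s specific devices.

```python
import numpy as np

# A qubit's state is a normalized two-component complex vector.
ket0 = np.array([1, 0], dtype=complex)   # |0>, e.g. spin down
ket1 = np.array([0, 1], dtype=complex)   # |1>, e.g. spin up

# Equal superposition: the qubit carries amplitude for 0 and 1 at once.
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities come from the squared amplitudes (Born rule).
probs = np.abs(psi) ** 2
print(probs)   # [0.5 0.5] -- a 50/50 chance of reading out 0 or 1

# A classical bit, by contrast, holds exactly one definite value.
classical_bit = 0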

“As you start to scale the qubit architectures up, you have to move away from operating individual qubits