Mar 6 2014
With cutting-edge technology, sometimes the first challenge scientists face is simply making sure it actually works as intended.
The USC Viterbi School of Engineering is home to the USC-Lockheed Martin Quantum Computing Center (QCC), a super-cooled, magnetically shielded facility specially built to house the first commercially available quantum computing processors – devices so advanced that there are only two in use outside the Canadian lab where they were built: The first one went to USC and Lockheed Martin, and the second to NASA and Google.
Since USC's facility opened in October 2011, a key task for researchers has been to determine whether D-Wave processors operate as hoped – using the special laws of quantum mechanics to offer potentially higher-speed processing, instead of operating in a classical, traditional way.
An international collaboration of scientists has now published several papers rejecting classical models of the first-generation D-Wave One processor housed at USC, including one describing an elaborate test of all 108 of the chip's functional quantum bits ("qubits"). The test demonstrates that the D-Wave One behaved in a way consistent with a model called "quantum Monte Carlo," while disagreeing with two candidate classical models that could have described the processor in the absence of quantum effects.
The research was published Feb. 28 in Nature Physics.
"The challenge is that the tests we can perform on the USC-based D-Wave processor can't directly 'prove' that the D-Wave processor is quantum – we can only disprove candidate classical models one at a time," said QCC Director Prof. Daniel Lidar. "But so far we find that the D-Wave processor is always consistent with our quantum models. Our tests continually get more rigorous and complex."
Add this to recent work by USC Information Sciences Institute researcher Federico Spedalieri demonstrating entanglement in a chip at D-Wave's headquarters in Burnaby, B.C., as well as previous testing of a smaller group of qubits by Spedalieri, Lidar and their collaborators, and the evidence is mounting that quantum effects are at play in the D-Wave processors.
Quantum processors encode data in qubits, which can represent the digits one and zero at the same time – as opposed to traditional bits, which encode either a one or a zero but not both. This property, called superposition, along with the ability of quantum states to "interfere" (cancel or reinforce each other like waves in a pond) and "tunnel" through energy barriers, is what may one day allow quantum processors to perform optimization calculations much faster than traditional processors.
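As a rough illustration of superposition and interference (a minimal sketch in the standard state-vector formalism, not a model of D-Wave's hardware), a single qubit can be written as two complex amplitudes; a gate that splits the state into an equal superposition, applied twice, makes the two paths cancel and reinforce:

```python
import numpy as np

# A qubit's state is a pair of complex amplitudes, one for "0" and one for "1".
zero = np.array([1.0, 0.0], dtype=complex)          # definitely 0

# The Hadamard gate turns a definite 0 into an equal superposition of 0 and 1.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ zero
print(np.abs(superposed) ** 2)   # [0.5 0.5] -> equal odds of reading 0 or 1

# Applying the same gate again lets the two paths interfere: the "1" amplitudes
# cancel while the "0" amplitudes reinforce, so the qubit returns to 0.
recombined = H @ superposed
print(np.abs(recombined) ** 2)   # [1. 0.] -> interference restores certainty
```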
Optimization problems can take many forms, and quantum processors have been theorized to be useful for a variety of big data problems such as stock portfolio optimization, image recognition and classification, and anomaly detection, such as rooting out bugs in complex software.
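For a sense of what such a problem looks like, D-Wave's processors are designed to seek low-energy configurations of an Ising-type cost function over binary variables. The toy brute-force search below (with made-up couplings, purely for illustration) enumerates every assignment – exactly the exponential cost that annealing approaches hope to sidestep:

```python
import itertools

# Toy Ising-style optimization: choose spins s[i] in {-1, +1} to minimize
# E(s) = sum over pairs J[i,j]*s[i]*s[j]  +  sum over sites h[i]*s[i].
# The couplings J and fields h here are made up purely for illustration.
J = {(0, 1): 1.0, (1, 2): -0.5, (0, 2): 0.75}
h = [0.1, -0.2, 0.3]

def energy(spins):
    e = sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    e += sum(hi * si for hi, si in zip(h, spins))
    return e

# Brute force checks all 2^n spin assignments; an annealer aims to reach a
# low-energy assignment without enumerating every configuration.
best = min(itertools.product([-1, 1], repeat=3), key=energy)
print(best, energy(best))
```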
The first quantum chip housed at the QCC was a 128-qubit D-Wave One, which was replaced about a year ago with the 512-qubit D-Wave Two. Though every chip is unique, the repeated validation of the older chip bodes well for its successor, which shares the same architecture.
"Our work is part of a large scale effort by the research community aimed at validating the potential of quantum information processing, which we all hope might one day surpass its classical counterparts," Lidar said.