In 1981 the physicist Richard Feynman speculated about the possibility of “tiny computers obeying quantum mechanical laws.” He suggested that such a quantum computer might be the best way to simulate real-world quantum systems, a challenge that today is largely beyond the calculating power of even the fastest supercomputers. Since then there has been sporadic progress in building this kind of computer. The experiments to date, however, have largely yielded only systems that seek to demonstrate that the principle is sound. They offer a tantalizing peek at the possibility of future supercomputing power, but only the slimmest results.
Recent progress, however, has renewed enthusiasm for finding avenues to build significantly more powerful quantum computers. Laboratory efforts in the United States and in Europe are under way using a number of technologies. Significantly, I.B.M. has reconstituted what had recently been a relatively low-level research effort in quantum computing. I.B.M. is responding to advances made in the past year at Yale University and the University of California, Santa Barbara, that suggest the possibility of quantum computing based on standard microelectronics manufacturing technologies. Both groups layer a superconducting material, either rhenium or niobium, on a semiconductor surface, which when cooled to near absolute zero exhibits quantum behavior.
The company has assembled a large research group at its Thomas J. Watson Research Center in Yorktown Heights, N.Y., that includes alumni from the Santa Barbara and Yale laboratories and has now begun a five-year research project.
“I.B.M. is quite interested in taking up the physics which these other groups have been pioneering,” said David DiVincenzo, an I.B.M. physicist and research manager.
Researchers at Santa Barbara and Yale also said that they expect to make further incremental progress in 2011 and in the next several years. At the most basic level, quantum computers are composed of quantum bits, or qubits, rather than the traditional bits that are the basic unit of digital computers. Classical computers are built with transistors that can be in either an “on” or an “off” state, representing either a 1 or a 0. A qubit, which can be constructed in different ways, can represent 1 and 0 states simultaneously. This quality is called superposition.
The potential power of quantum computing comes from the possibility of performing a mathematical operation on both states simultaneously. In a two-qubit system it would be possible to compute on four values at once, in a three-qubit system on eight at once, in a four-qubit system on 16, and so on. As the number of qubits increases, potential processing power increases exponentially.
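The doubling described above is easy to see in a short classical simulation (a sketch using NumPy; `register` is an illustrative helper, not a real quantum device):

```python
import numpy as np

# A single qubit in equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
q = H @ np.array([1.0, 0.0])                  # start in the |0> state

def register(n):
    """An n-qubit register is the tensor product of single qubits,
    so its state vector holds 2**n amplitudes."""
    state = q
    for _ in range(n - 1):
        state = np.kron(state, q)
    return state

for n in (1, 2, 3, 4):
    # The register doubles in size with each added qubit.
    print(n, "qubits ->", len(register(n)), "amplitudes")
```

Each of those amplitudes is a value the machine is, in effect, working on at once, which is where the exponential growth in potential processing power comes from.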
There is, of course, a catch. The mere act of measuring or observing a qubit can strip it of its computing potential. So researchers have used quantum entanglement — in which particles are linked so that measuring a property of one instantly reveals information about the other, no matter how far apart the two particles are — to extract information. But creating and maintaining qubits in entangled states has been tremendously challenging.
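The perfect correlations that entanglement provides can be illustrated with a small classical simulation of a two-qubit Bell state (an assumed textbook example, not the laboratories' actual apparatus):

```python
import numpy as np

rng = np.random.default_rng(0)

# The Bell state (|00> + |11>)/sqrt(2): amplitudes over the basis 00, 01, 10, 11.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2   # Born rule: probability of each joint outcome

# Sample 1,000 joint measurements; index 0 means "00", index 3 means "11".
outcomes = rng.choice(4, size=1000, p=probs)

# Every sample is perfectly correlated: the two qubits always agree,
# even though each individual result is random.
assert set(outcomes) <= {0, 3}
```

Measuring one qubit of the pair thus reveals the other's value immediately, which is what lets researchers extract information without directly disturbing every qubit.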
“We’re at the stage of trying to develop these qubits in a way that would be like the integrated circuit that would allow you to make many of them at once,” said Rob Schoelkopf, a physicist who is leader of the Yale group. “In the next few years you’ll see operations on more qubits, but only a handful.”
The good news, he said, is that while the number of qubits is increasing only slowly, the precision with which the researchers are able to control quantum interactions has increased a thousandfold.
The Santa Barbara researchers said they believe they will essentially double the computational power of their quantum computers next year.
John Martinis, a physicist who is a member of the team, said, “We are currently designing a device with four qubits, and five resonators,” the standard microelectronic components that are used to force quantum entanglement. “If all goes well, we hope to increase this to eight qubits and nine resonators in a year or so.”
Two competing technological approaches are also being pursued. One approach involves building qubits from ions, or charged atomic particles, trapped in electromagnetic fields. Lasers are used to entangle the ions. To date, systems as large as eight qubits have been created using this method, and researchers believe that they have design ideas that will make much larger systems possible. Currently more than 20 university and corporate research laboratories are pursuing this design.
In June, researchers at Toshiba Research Europe and Cambridge University reported in Nature that they had fabricated light-emitting diodes coupled with a custom-formed quantum dot, which functioned as a light source for entangled photons. The researchers are now building more complex systems and say they can see a path to useful quantum computers.
A fourth technology has been developed by D-Wave Systems, a Canadian computer maker. D-Wave has built a system with more than 50 quantum bits, but it has been greeted skeptically by many researchers who believe that it has not proved true entanglement. Nevertheless, Hartmut Neven, an artificial-intelligence researcher at Google, said the company had received a proposal from D-Wave and NASA’s Jet Propulsion Laboratory to develop a quantum computing facility for Google next year based on the D-Wave technology.
Among the important problems of quantum photonics is that of building an interface between atoms and photons. A joint team of physicists from Russia and the United States is studying how single photons interact with quantum objects and has already built a prototype device for transferring information from an atom to a photon. The device is an integrated chip containing an artificial atom, with a fibre port. The simplest application of the future device is a memory cell; more ambitiously, it could become a single-photon transistor operating at the level of individual quanta, a component for building complex logic systems.
The group's main aim is to build an interface between light and atoms or artificial atoms. An interface is a way of transferring information efficiently from one object to another. Building one means learning to create a given state in a system (such a state is called a superposition), to read it without destroying it, and to transfer it to another object. This given state, a superposition of two or more of an atom's energy levels that can each be registered with a certain probability, is what researchers mean by quantum information. The information is transferred to light, with an individual photon serving as the carrier, and the photon in turn can be reliably detected: the information it carries can be “read.” In other words, there is a channel through which quantum information travels from one memory cell to another, or to an output port.
A system that works with quantum information can conveniently be built on atoms, which make universal memory cells for superpositions: an atom interacts only weakly with its environment and can store information for some period of time. The team has already created an interface with artificial atoms, namely quantum dots and colour centres in diamond. The latter are defects in diamond's crystal lattice in which a carbon atom is replaced by a nitrogen atom. Their energy levels are positioned much as an atom's are, and they can host a superposition state. Artificial atoms, especially colour centres in diamond, store information for a relatively long time, with a nuclear-spin lifetime of about a second. That is nothing like a hard drive, of course, but it is enough for working memory, since each operation takes only microseconds to perform.
The device developed by the Russian and American researchers is a silicon chip. The chip hosts an artificial atom: a diamond nanocrystal, about 50 by 50 nanometres, containing a colour centre. The crystal sits on a silver wire 100 nm in diameter that is coupled to a light-conducting dielectric waveguide. All of the work is done at room temperature, with a specially designed confocal microscope used for observation. One channel of the microscope images the sample so that a desired object, and an interesting spot on it, can be chosen; laser light is focused on that spot, after which the colour centre emits the individual photons registered in the experiment. Another channel scans the sample's surroundings and collects light from any glowing spot, whether the end of the waveguide or the end of the wire. The excitation beam can be moved across the sample to collect radiation from different colour centres.
The researchers have successfully created an interface between a photon and a quantum object: they have a technique for making stable working chips, and they can register individual photons and compute correlation functions. So far, however, they can register only 60 percent of the photons, a figure no one has yet surpassed. The authors say they know how to fix the problem, and further work is aimed at turning the wire into a resonator, which would raise the chances of the atom interacting with the light and, correspondingly, of the radiation coupling into the wire.
As for practical applications of the research, there are plenty, from memory cells to transistors and more complex elements of single-photon logic. New technologies for communication and quantum computing are closer than we think.
CAMBRIDGE, Mass. -- In an important first for a promising new technology, scientists have used a quantum computer to calculate the precise energy of molecular hydrogen. This groundbreaking approach to molecular simulations could have profound implications not just for quantum chemistry, but also for a range of fields from cryptography to materials science.
"One of the most important problems for many theoretical chemists is how to execute exact simulations of chemical systems," says author Alán Aspuru-Guzik, assistant professor of chemistry and chemical biology at Harvard University. "This is the first time that a quantum computer has been built to provide these precise calculations."
The work, described this week in Nature Chemistry, comes from a partnership between Aspuru-Guzik's team of theoretical chemists at Harvard and a group of experimental physicists led by Andrew White at the University of Queensland in Brisbane, Australia. Aspuru-Guzik's team coordinated experimental design and performed key calculations, while his partners in Australia assembled the physical "computer" and ran the experiments.
"We were the software guys," says Aspuru-Guzik, "and they were the hardware guys."
While modern supercomputers can perform approximate simulations of simple molecular systems, increasing the size of the system results in an exponential increase in computation time. Quantum computing has been heralded for its potential to solve certain types of problems that are impossible for conventional computers to crack.
Rather than using binary bits labeled as "zero" and "one" to encode data, as in a conventional computer, quantum computing stores information in qubits, which can represent both "zero" and "one" simultaneously. When a quantum computer is put to work on a problem, it considers all possible answers by simultaneously arranging its qubits into every combination of "zeroes" and "ones."
Since one sequence of qubits can represent many different numbers, a quantum computer would make far fewer computations than a conventional one in solving some problems. After the computer's work is done, a measurement of its qubits provides the answer.
"Because classical computers don't scale efficiently, if you simulate anything larger than four or five atoms -- for example, a chemical reaction, or even a moderately complex molecule -- it becomes an intractable problem very quickly," says author James Whitfield, research assistant in chemistry and chemical biology at Harvard. "Approximate computations of such systems are usually the best chemists can do."
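Whitfield's point can be made concrete with a little arithmetic: an exact classical simulation must store one complex amplitude per basis state, so the memory needed doubles with every qubit added (the 16-bytes-per-amplitude figure assumes double-precision complex numbers, and the qubit counts are illustrative):

```python
def state_vector_bytes(n_qubits):
    """Memory for an exact state vector: 2**n amplitudes,
    each a complex number of two 64-bit floats (16 bytes)."""
    return 16 * 2 ** n_qubits

print(state_vector_bytes(10))  # about 16 KB: trivial
print(state_vector_bytes(30))  # about 17 GB: a large workstation
print(state_vector_bytes(50))  # about 18 PB: beyond any supercomputer
```

A molecule whose simulation requires a few dozen qubits' worth of state is therefore already out of reach for exact classical methods.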
Aspuru-Guzik and his colleagues confronted this problem with a conceptually elegant idea.
"If it is computationally too complex to simulate a quantum system using a classical computer," he says, "why not simulate quantum systems with another quantum system?"
Such an approach could, in theory, result in highly precise calculations while using a fraction of the resources of conventional computing.
While a number of other physical systems could serve as a computer framework, Aspuru-Guzik's colleagues in Australia used the information encoded in two entangled photons to conduct their hydrogen molecule simulations. Each calculated energy level was the result of 20 such quantum measurements, resulting in a highly precise measurement of each geometric state of molecular hydrogen.
"This approach to computation represents an entirely new way of providing exact solutions to a range of problems for which the conventional wisdom is that approximation is the only possibility," says Aspuru-Guzik.
Ultimately, the same quantum computer that could transform Internet cryptography could also calculate the lowest energy conformations of molecules as complex as cholesterol.
Aspuru-Guzik and Whitfield's Harvard co-authors on the Nature Chemistry paper are Ivan Kassal, Jacob D. Biamonte, and Masoud Mohseni. Financial support was provided by the US Army Research Office and the Australian Research Council Federation Fellow and Centre of Excellence programs. Aspuru-Guzik recently received support from the DARPA Young Investigator Program, the Alfred P. Sloan Foundation, and the Camille and Henry Dreyfus Foundation to pursue research towards practical quantum simulators.
While work continues on developing the fundamentals for super-fast quantum computers, a group of researchers has shown that, at least for some sorts of problems, classical computing could match the eventual speed of a working quantum computer -- with the correct software algorithms in place.
"We're putting lots of money into building quantum computers, but we shouldn't underestimate the power of algorithms," said John Watrous, who works at the Institute for Quantum Computing at the University of Waterloo in Ontario, Canada.
As a by-product of studying the predicted performance of quantum computing, Watrous and other researchers have shown how an algorithm little used in today's software could provide a new level of problem-solving performance in traditional computers, one that could match, in theory anyway, speeds obtained by quantum computers.
Their work was published in the latest edition of the Communications of the ACM, the flagship publication of the Association for Computing Machinery.
"One striking implication of this characterization is that it implies quantum computing provides no increase in computational power whatsoever over classical computing in the context of interactive proof systems," the paper notes.
In June, an earlier version of this paper won the Best Paper Award at the esteemed Symposium on Theory of Computing for 2010. The award shows that the work has major implications for the field of computer science, especially given that STOC judges rarely award quantum computing work, noted Scott Aaronson, an associate professor of electrical engineering and computer science at Massachusetts Institute of Technology, who was not involved in the work.
Quantum computing is often touted as the next stage of computer technology, one that could offer large-scale performance improvements after Moore's Law has been exhausted.
Taking advantage of the properties of quantum mechanics, a quantum computer could conceivably offer "exponential parallelism" in the aid of solving problems, Aaronson points out in a commentary accompanying Watrous' paper.
No practical quantum computers have been built yet, though companies such as IBM are beginning to develop the basic building blocks that could one day make such a computer possible.
The work of Watrous and his colleagues seemingly settles a debate over whether one class of mathematical problems, called quantum interactive proof systems, is more or less difficult to solve than another class, called classical interactive proof systems.
They are not, the paper asserts. But because these problem classes are theoretical constructs, the finding itself says little about quantum computing beyond its ability to solve such abstract problems, Watrous acknowledged.
In order to set up the study, however, the researchers used an algorithm to evaluate potential speed in classical computation. Called the matrix multiplicative weights update method, it was developed from research in two mathematical fields of study, combinatorial optimization and learning theory.
The algorithm provided a way to solve a problem using parallel processes, the kind easily executable with today's multi-core processors and computer clusters. It provided a way to match the efficiency of quantum computing, for this set of problems.
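The core of the method can be sketched in a few lines: maintain a density matrix proportional to the matrix exponential of the accumulated losses, and use it as the current strategy each round. This is a serial toy version under the assumption of symmetric loss matrices; the parallelized variant used in the actual proof is far more involved, and `matrix_mwu` and the random losses here are purely illustrative.

```python
import numpy as np

def matrix_mwu(loss_matrices, eta=0.1):
    """Matrix multiplicative weights update: keep a density matrix
    rho proportional to exp(-eta * (sum of past loss matrices))."""
    d = loss_matrices[0].shape[0]
    total = np.zeros((d, d))
    losses = []
    for M in loss_matrices:
        # Matrix exponential of a symmetric matrix via eigendecomposition.
        w, V = np.linalg.eigh(-eta * total)
        W = (V * np.exp(w)) @ V.T
        rho = W / np.trace(W)                     # a valid density matrix
        losses.append(float(np.trace(rho @ M)))   # loss suffered this round
        total = total + M
    return losses, rho

# Toy run: 20 random symmetric 3x3 loss matrices.
rng = np.random.default_rng(1)
Ms = [(A + A.T) / 2 for A in (rng.random((3, 3)) for _ in range(20))]
losses, rho = matrix_mwu(Ms)
```

Because each round needs only matrix multiplications and an eigendecomposition, the per-round work parallelizes well, which is the property the researchers exploited.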
Surprisingly, this matrix-based method hasn't been applied to parallel computing before, Watrous said.
"It has never been considered to my knowledge in a parallel setting," he said of the method. "We had to show that this method could be parallelized, and we couldn't find any reference to anyone doing that."
Watrous, while stating that he does not work in the commercial field of computer science, speculates that more work could be done in finding and adopting other mathematical algorithms that could speed the computational performance of hardware available today.
"We could try to build quantum computers to solve problems but we could also just design new algorithms to solve problems," he said.
Aaronson said that the algorithm could be used in commercial fields of computing, particularly in the field of semi-definite programming, which looks at ways of solving optimization problems. "These are very common in industrial optimization," he said.
The researchers showed that "for a certain class of semi-definite programs you can get not the exact answer but a very good approximate answer, using a very small amount of memory," he said.
London: Scientists have reported a major achievement in the field of quantum computers – they have succeeded in controlling the building blocks of a future super-fast quantum computer.
Scientists from the Kavli Institute of Nanoscience at Delft University of Technology and Eindhoven University of Technology are now able to manipulate these building blocks (qubits) with electrical rather than magnetic fields.
A qubit is the building block of a possible future quantum computer, which would far outstrip current computers in terms of speed. It can adopt the states ''0'' and ''1'' by using the spin of an electron, an intrinsic angular momentum often pictured as the electron rotating on its axis.
Until now, the spin of an electron has been controlled with magnetic fields; it can now be controlled with an electric field instead.
"These spin-orbit qubits combine the best of both worlds. They employ the advantages of both electronic control and information storage in the electron spin,” said Leo Kouwenhoven at the Kavli Institute of Nanoscience at TU Delft.