Qubits and oracles: The strange world of quantum computing

Editor’s note: This article was initially published in The Daily Gazette, Swarthmore’s online daily newspaper, founded in Fall 1996. As of Fall 2018, the DG has merged with The Phoenix. See the about page to read more about the DG.

The second Sigma Xi lecture of the year, entitled “Strange Bits: Quantum Computing and the Search for New Quantum Algorithms,” was presented yesterday. Swarthmore Professor Lisa Meeden introduced Lee Spector, a Professor of Computer Science in the School of Cognitive Science at Hampshire College. The talk described quantum computing and what makes it promising, as well as Spector’s work using genetic algorithms to discover quantum programs.

Spector began by defining quantum computing as information processing at a scale where atomic-level dynamics must be taken into account; in fact, they become advantageous. The familiar bits of classical computing are replaced with unusual entities called qubits (“quantum bits”). Via superposition, a qubit can carry far more information than a single classical bit. Though physicists still have a long way to go before quantum computers are a reality, Spector cited Moore’s law to show the immediacy of this technology: the storage capacity of silicon chips has doubled approximately every eighteen months. At that rate, we will reach the quantum level soon after 2010.
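To make superposition concrete, here is a minimal sketch (an illustration of the standard textbook model, not anything presented in the talk) of how a qubit is usually represented: a pair of complex amplitudes whose squared magnitudes give the measurement probabilities.

```python
import math

# A single qubit is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1.  Measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
alpha, beta = 1 / math.sqrt(2), 1 / math.sqrt(2)  # an equal superposition

print(f"P(0) = {abs(alpha)**2:.2f}, P(1) = {abs(beta)**2:.2f}")

# A register of n qubits carries 2**n amplitudes at once, which is
# where the extra information-carrying capacity of superposition
# comes from.
```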

The promise of quantum computing arises from its potential processing speed. Spector presented the hypothetical task of factoring a 5,000-digit number: classical computing would need approximately 5 trillion years, while a quantum computer would finish in roughly two minutes. This speedup comes from the notoriously unusual behavior of particles in quantum physics. As Spector repeatedly stated, “In quantum computing, possibilities matter.” The possible paths of a particle can interfere and cancel one another out, even though only a single particle passes through the apparatus. Various setups can harness this effect; Spector used beam splitters as the prime example in his talk. These counterintuitive properties can only be exploited when no information about the particle’s path escapes; that is to say, when no one is observing it.
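As an illustration of “possibilities matter,” here is a small simulation (my sketch, not Spector’s demonstration) of a single photon passing through two 50/50 beam splitters in sequence. The amplitudes for the two paths interfere, and one detector never fires, even though each splitter alone would send the photon either way with equal probability.

```python
import math

# A 50/50 beam splitter mixes the amplitudes of the two paths; the
# convention used here adds a 90-degree phase shift (factor of i) on
# reflection.
def beam_splitter(a0, a1):
    s = 1 / math.sqrt(2)
    return s * (a0 + 1j * a1), s * (1j * a0 + a1)

# The photon enters on path 0 with certainty.
a0, a1 = 1.0, 0.0

a0, a1 = beam_splitter(a0, a1)  # after the first splitter: 50/50
a0, a1 = beam_splitter(a0, a1)  # after the second: the paths interfere

print(f"P(detector 0) = {abs(a0)**2:.2f}")  # 0.00 -- amplitudes cancel
print(f"P(detector 1) = {abs(a1)**2:.2f}")  # 1.00 -- photon always here
```

Observing which path the photon took would destroy this cancellation, which is why the effect can only be harnessed when no path information escapes.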

Because the entangled nature of quantum computing is so difficult to understand, Spector, a non-physicist, decided that the best way to program a quantum computer would be to let computers figure it out. He designed a program that creates candidate quantum algorithms, tests them, and randomly modifies them, comparing the candidates to see which comes closer to the desired function. Eventually an appropriate program evolves, hence the term genetic algorithms.
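To show the shape of that evolutionary loop, here is a toy version (my sketch; Spector’s actual system evolves quantum gate arrays, not bitstrings) of the create/test/mutate/select cycle the article describes.

```python
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0]  # stand-in for the desired behavior

def fitness(candidate):
    # Higher is better: how closely the candidate matches the target.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=0.1):
    # The "randomly modify" step: flip bits with small probability.
    return [1 - b if random.random() < rate else b for b in candidate]

# Start from a random population of candidate "programs".
population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]

for generation in range(100):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break  # an appropriate program has evolved
    # Keep the best half, refill with mutated copies of the survivors.
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

best = max(population, key=fitness)
print(f"best candidate after {generation + 1} generations:", best)
```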

Spector has already duplicated the work of physicists and even improved on known results for certain problems. In one example, an evolved program could tell which of four positions held a hypothetical block. The part of the gate array that the algorithm queries, using quantum effects to gather information, is called an oracle. Classically, this could take as many as three tries before the block was discerned to be in the last position; by taking into account all the possible paths of the electron in superposition, the evolved algorithm needed only one try (see the sketch below).

During the question period it emerged that the field is significantly limited by the necessity of simulating quantum computation on classical computers, and that the genetic algorithm approach often creates overcomplicated programs that are hard to simplify by, say, deleting unnecessary parts.

Spector’s lecture shed light on the extremely murky world of quantum computing and showed that it isn’t necessary to completely understand quantum physics to begin programming. As he quoted from Niels Bohr, “Anyone who is not shocked by quantum theory has not understood it.”
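The four-position example behaves like Grover-style search over four items, which locates a marked item with certainty after a single oracle query. Here is a minimal classical simulation of that one-query behavior (my illustrative sketch, assuming the standard Grover construction; it is not Spector’s evolved circuit).

```python
def find_block(marked):
    # Uniform superposition over the four possible positions.
    amps = [0.5, 0.5, 0.5, 0.5]

    # One oracle query: flip the sign of the marked position's amplitude.
    amps[marked] = -amps[marked]

    # Diffusion step: reflect every amplitude about the average.
    avg = sum(amps) / 4
    amps = [2 * avg - a for a in amps]

    # Measurement probabilities after a single query.
    return [abs(a) ** 2 for a in amps]

for pos in range(4):
    print(f"block at {pos}: P = {[round(p, 2) for p in find_block(pos)]}")
# Each case prints probability 1.0 at the marked position:
# one query suffices, versus up to three classical checks.
```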
