Wednesday, October 21, 2009

Limit on rate of quantum computing

Levitin and Toffoli recently published a paper in Physical Review Letters titled "The fundamental limit on the rate of quantum dynamics: the unified bound is tight" [1]. The title sounds pretty academic, but the final sentence of the abstract sums it up perfectly: "These results establish the fundamental quantum limit on the rate of operation of any information-processing system." Even if Moore's law keeps up, however, hitting this limit will take decades. There is a great summary of the paper online, and you can find the full paper on arXiv (arXiv:0905.3417) [1].

It is impressive that this limit has been established, but there is still a long way to go before we reach it, and we should be able to do impressive things well before then. Turing began establishing what is computable in the 1930s [2], yet that work has had little noticeable effect on what the average person sees computing technology accomplish. Consider also all the possible improvements in algorithms: while there may be a limit on the raw speed of computation, better algorithms may offer shortcuts that render that limit a moot point. In quantum algorithms especially we have quite a ways to go, with only a handful of practical ones at present. Even so, these existing quantum algorithms have shown that previously hard problems can be made practical via quantum computation.
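To make the algorithmic point concrete, here is a minimal sketch in Python with NumPy (my own illustration; it is not taken from the paper or the references) that simulates Grover's search, one of those handful of practical quantum algorithms, on a three-qubit register. A classical search over N = 8 items needs about N/2 probes on average, while Grover's algorithm needs only about (pi/4)*sqrt(N) oracle calls:

    # Minimal state-vector simulation of Grover's search on 3 qubits.
    import numpy as np

    n = 3                              # number of qubits
    N = 2 ** n                         # size of the search space
    marked = 0b101                     # index of the marked item
    state = np.ones(N) / np.sqrt(N)    # uniform superposition over all items

    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))  # = 2 for N = 8
    for _ in range(iterations):
        state[marked] *= -1.0               # oracle: flip the marked amplitude
        state = 2 * state.mean() - state    # diffusion: inversion about the mean

    print("P(measure marked item) =", state[marked] ** 2)  # ~0.945

Two iterations concentrate about 94.5% of the probability on the marked item, versus the 1/8 chance a single classical guess would give.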

The name Toffoli will ring a bell for those familiar with quantum computing. The Toffoli gate, or controlled-controlled-NOT (CCNOT), is one of the fundamental gates. Furthermore, a reversible equivalent of any classical computation can be constructed using Toffoli gates [3].
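As a quick illustration (a sketch of the standard construction, not code from [3]), the following shows the Toffoli gate's classical action: it flips the target bit only when both controls are 1, it undoes itself when applied twice, and fixing the target to 1 yields NAND, which suffices for classical universality:

    # Toffoli (CCNOT): flips target c iff both controls a and b are 1.
    def toffoli(a, b, c):
        return a, b, c ^ (a & b)

    # NAND from Toffoli: fix the target bit to 1, so the output
    # target is 1 ^ (a & b) == NOT (a AND b).
    def nand(a, b):
        return toffoli(a, b, 1)[2]

    for a in (0, 1):
        for b in (0, 1):
            out = toffoli(a, b, 0)
            assert toffoli(*out) == (a, b, 0)   # applying twice undoes it
            print(a, b, "-> NAND:", nand(a, b))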

References
[1] L. B. Levitin and T. Toffoli, "The fundamental limit on the rate of quantum dynamics: the unified bound is tight," Physical Review Letters, vol. 103, no. 16, Oct. 2009. arXiv:0905.3417.

[2] A. Turing, "On Computable Numbers, with an Application to the Entscheidungsproblem," Proceedings of the London Mathematical Society, vol. 42, pp. 230-265, 1936.

[3] M. A. Nielsen and I. L. Chuang, Quantum Computation and Quantum Information, 1st ed. Cambridge, UK: Cambridge University Press, 2000.

2 comments:

  1. What Profs. Lev B. Levitin and Tommaso Toffoli's paper "Fundamental Limit on the Rate of Quantum Dynamics: The Unified Bound Is Tight" (Physical Review Letters, Vol. 103, Issue 16 [October 2009]; also at arXiv:0905.3417) demonstrates is that processor speed can diverge to infinity if the energy of the system diverges to infinity.

    Under the Margolus-Levitin theorem, the bound was given as t >= h/(4*E), with t being the minimum operation cycle in seconds, h being Planck's constant, and E being energy in joules. Levitin and Toffoli's paper tightens this bound to t >= h/(2*E) and generalizes it to all cases.

    With this new bound, one obtains ~ 3.31303448*10^-34 seconds as the minimum operation cycle per joule of energy; or for the reciprocal, a maximum of ~ 3.0183809*10^33 operations per second per joule of energy.
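    For anyone who wants to verify the arithmetic, a few lines of Python (using the CODATA 2006 value of Planck's constant, which is where the digits above come from) reproduce these figures:

        # Check of the numbers quoted above for the bound t >= h/(2*E).
        h = 6.62606896e-34      # Planck's constant, J*s (CODATA 2006)
        E = 1.0                 # system energy, joules

        t_ml = h / (4 * E)      # Margolus-Levitin minimum cycle time
        t_new = h / (2 * E)     # bound quoted from Levitin and Toffoli
        print(t_new)            # ~3.31303448e-34 s per operation
        print(1 / t_new)        # ~3.0183809e33 operations per second per joule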

    So notice here that processor speed can increase without limit if the energy of the system is increased without limit. When the authors of the paper speak of a fundamental speed limit of computation, they are referring to a limit per unit of energy.

    In the article "Computers Faster Only for 75 More Years: Physicists determine nature's limit to making faster processors" (Lauren Schenkman, Inside Science News Service, October 13, 2009), paper co-author Levitin is quoted as saying, "If we believe in Moore's law ... then it would take about 75 to 80 years to achieve this quantum limit." What Levitin is referring to here is that, given the energy-density of present ordinary matter, processors cannot be made with greater processing-density after around that time, i.e., one won't be able to fit more processing power within the same amount of space given the energy-density of common matter. But even at the same energy-density, one can still increase processing speed by increasing the size or number of processors, though they would then take up more space. As well, one can increase the processing-density without limit if one increases the energy-density without limit.

    In the same Inside Science article, Scott Aaronson, an assistant professor of electrical engineering and computer science at the Massachusetts Institute of Technology in Cambridge, is quoted as saying that this bound means "we can't build infinitely fast computers," which misstates what the bound says. The bound actually states that one can build infinitely fast computers if one has an infinite amount of energy.

    For the cosmological limits to computation, see physicist and mathematician Prof. Frank J. Tipler's below paper, which demonstrates that the known laws of physics (i.e., the Second Law of Thermodynamics, general relativity, quantum mechanics, and the Standard Model of particle physics) require that the universe end in the Omega Point (the final cosmological singularity and state of infinite informational capacity identified as being God), and it also demonstrates that we now have the quantum gravity Theory of Everything (TOE):

    F. J. Tipler, "The structure of the world from pure numbers," Reports on Progress in Physics, Vol. 68, No. 4 (April 2005), pp. 897-964. http://math.tulane.edu/~tipler/theoryofeverything.pdf Also released as "Feynman-Weinberg Quantum Gravity and the Extended Standard Model as a Theory of Everything," arXiv:0704.3276, April 24, 2007. http://arxiv.org/abs/0704.3276

    See also the below resource:

    "Omega Point (Tipler)," Wikipedia, October 30, 2009. http://en.wikipedia.org/w/index.php?title=Omega_Point_%28Tipler%29&oldid=322843275

  2. Pardon me, I incorrectly stated that the Levitin and Toffoli paper tightens the Margolus-Levitin bound. Rather, it generalizes it to all cases.
