Quantum Computing
Introduction:
As science and technology developed and civilization advanced, new ways were found to
exploit various physical resources such as materials, forces and energies. The history of
computing is the culmination of years of technological advancement, beginning with the early
ideas of Charles Babbage and the eventual creation of the first computer by the German
engineer Konrad Zuse in 1941. The whole process involved a sequence of changes from one
type of physical realization to another, from gears to relays to valves to transistors to
integrated circuits to chips and so on. Surprisingly, however, the high-speed modern computer
is fundamentally no different from its gargantuan 30-ton ancestors, which were equipped with
some 18,000 vacuum tubes and 500 miles of wiring.
Although computers have become more compact and considerably faster in performing their
task, the task remains the same: to manipulate and interpret an encoding of binary bits into a
useful computational result. The number of atoms needed to represent a bit of memory has
been decreasing exponentially since 1950. An observation by Gordon Moore in 1965 laid the
foundations for what came to be known as “Moore’s Law” – that computer processing power
doubles every eighteen months. If Moore's Law is extrapolated naively into the future, it
follows that sooner or later each bit of information will have to be encoded by a physical
system of subatomic size. This point is substantiated by a survey made by Keyes in 1988,
shown in fig. 1, which plots the number of electrons required to store a single bit of
information. An extrapolation of the plot suggests that we might be within reach of
atomic-scale computation within a decade or so.
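As a rough illustration of what such an extrapolation implies, the following back-of-the-envelope sketch (an illustrative calculation in Python, not data taken from the Keyes survey) assumes that the number of electrons needed to store one bit halves every eighteen months, in line with the doubling period quoted above, and asks how long it would take to reach a one-electron bit from an assumed starting point of roughly a billion electrons per bit.

    import math

    # Illustrative assumptions only (not figures from the Keyes survey):
    # suppose a single bit needs about 1e9 electrons today, and that this
    # number halves every 18 months, mirroring the doubling period of
    # Moore's Law quoted above.
    electrons_per_bit = 1e9        # assumed starting value
    halving_period_years = 1.5     # 18 months

    # Number of halvings needed before a single electron stores the bit.
    halvings = math.log2(electrons_per_bit)

    years_to_single_electron = halvings * halving_period_years
    print(f"~{years_to_single_electron:.0f} years until a one-electron bit")

Under these assumed numbers the calculation gives a few decades, which conveys the general point: exponential shrinkage reaches the atomic scale on a historically short timescale.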
Matter obeys the rules of quantum mechanics, which are quite different from the classical
rules that determine the properties of conventional logic gates. So if computers are to become
smaller in the future, new quantum technology must replace or supplement what we have now.
Notwithstanding, quantum technology can offer much more than merely cramming more and
more bits onto silicon and multiplying the clock speed of microprocessors. It can support an
entirely new kind of computation, with quantitatively as well as qualitatively new
algorithms based on the principles of quantum mechanics. With the size of components in
classical computers shrinking to the point where their behaviour is dominated by quantum
theory rather than classical theory, researchers have begun investigating the potential of
these quantum behaviours for computation. Surprisingly, it seems that a computer whose
components all function in a quantum way is more powerful than any classical
computer can be. It is the physical limitations of the classical computer and the possibilities
for the quantum computer to perform certain useful tasks more rapidly than any classical
computer, that drive the study of quantum computing. A quantum computer is a computer
whose memory is exponentially larger than its apparent physical size; a computer that can
manipulate an exponential set of inputs simultaneously, a whole new concept in parallelism;
a computer that computes in the twilight (space-like) zone of Hilbert space (or possibly a
higher space, such as Grassmann space). Relatively few and simple concepts from quantum
mechanics are needed to make quantum computers a possibility; the subtlety has been in
learning to manipulate these concepts. Whether such a computer is an inevitability, or will
prove too difficult to build, is the million-dollar question.
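To make the claim about exponentially large memory concrete, the short sketch below (an illustrative count, not a description of any particular quantum algorithm) uses the standard fact that the joint state of n two-level quantum systems, or qubits, lives in a 2^n-dimensional Hilbert space, so a full classical description requires 2^n complex amplitudes.

    # The joint state of n two-level quantum systems (qubits) is a vector
    # in a 2**n-dimensional Hilbert space, so writing it down classically
    # requires 2**n complex amplitudes.
    for n in (1, 10, 30, 50):
        amplitudes = 2 ** n
        print(f"{n:2d} qubits -> {amplitudes:.3e} complex amplitudes")

Even fifty such systems already correspond to about 10^15 amplitudes, while the physical size of the register grows only linearly with n; this is the sense in which a quantum memory is exponentially larger than its apparent physical size.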