Quantum computing could bring about the biggest revolution
in computing since the invention of the modern digital computer.
For some challenging problems which would take the fastest
supercomputers years to solve, large-scale quantum computers would
theoretically be able to find solutions in days, or even hours. It could have a
tremendous impact on human society, helping accelerate cancer research or
addressing complex global challenges like climate change.
But what is quantum computing?
At the subatomic level, the laws of classical physics no
longer apply. Particles can exist in more than one state at a time and
phenomena such as entanglement and superposition emerge. Quantum computing utilises these
quantum-mechanical phenomena to perform operations on data.
Quantum computing derives its power from being able to take
advantage of wavelike interference of a very large number of states. Whereas a
classical bit can be in one of two states, 0 or 1, a single qubit or quantum
bit can represent a 1, a 0 or any quantum superposition of those two qubit
states. When we measure to find out what state the qubits are in at any given
time, the qubits "collapse" into one of the possible states, giving
the answer to the problem.
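The idea of a superposition collapsing on measurement can be sketched with a few lines of classical simulation. This is an illustrative toy, not how a real quantum device works: a single qubit is modelled as two complex amplitudes, and "measurement" samples an outcome with the corresponding probabilities.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# A qubit state is a unit vector of two complex amplitudes (alpha, beta);
# |alpha|^2 and |beta|^2 give the probabilities of measuring 0 or 1.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# An equal superposition: (|0> + |1>) / sqrt(2)
psi = (zero + one) / np.sqrt(2)

def measure(state):
    """Sample a computational-basis measurement: the state 'collapses' to 0 or 1."""
    p0 = abs(state[0]) ** 2
    return 0 if rng.random() < p0 else 1

counts = [measure(psi) for _ in range(10_000)]
print(sum(counts) / len(counts))  # close to 0.5: both outcomes equally likely
```

Repeating the measurement many times recovers the underlying probabilities, which is exactly how results are read out of real quantum hardware.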
At EmTech Asia 2018,
OpenGov sat down with Dr Joseph Fitzsimons, Assistant Professor, Engineering
Product Development at the Singapore University of Technology and Design (SUTD)
and Principal Investigator at the Centre for Quantum Technologies (CQT) to learn more.
Dr Fitzsimons is a theoretical physicist with interests in
all areas of quantum mechanics and quantum information theory. He talked about
the importance of high quality qubits and applications of quantum computing
that might be seen in the near future.
Not just quantity, but quality
One of the obstacles to the development of functional, large-scale
quantum computers is errors. (This
article from Quanta Magazine presents an overview of the problem with errors in quantum computing.)
Quantum information is fragile and highly sensitive to
unavoidable noise. Random fluctuations can occasionally flip or randomise the
state of a qubit, potentially derailing a calculation. Even the fact
that the quantum computer has to interact with the outside world, so that a user
can run programs on it and read the output, introduces errors into the
computation and leads to loss of information. Moreover, superpositions collapse
to a definite value once they are measured. So, how do we even find out if a
qubit has an error? This is a challenging problem that scientists are trying to solve.
Dr Fitzsimons explained, “It’s been very clear in the
community for a very long time that we don’t just need a lot of qubits, we need
good quality qubits, so that the error rate is sufficiently low, and we can
correct errors on the fly, within the device.”
To build functional quantum computers, the errors have to be
within a certain threshold. Every operation needs to have an error rate less
than about 1% for error correction to be possible. For correction to be
efficient, it needs to be significantly lower than this threshold.
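Why being well below the threshold matters can be seen with a rough rule-of-thumb model often quoted for surface-code error correction: the logical error rate falls off as a power of the ratio between the physical error rate and the threshold. The constant and threshold values below are assumed for illustration only.

```python
# Illustrative rule-of-thumb model for error-corrected logical qubits:
#   p_logical ≈ A * (p / p_th) ** ((d + 1) / 2)
# where p is the physical error rate per operation, p_th ≈ 1% is the
# assumed threshold, d is the code distance, and A = 0.1 is an assumed constant.
def logical_error_rate(p, d, p_th=0.01, A=0.1):
    return A * (p / p_th) ** ((d + 1) / 2)

# Above threshold, adding more qubits to the code makes things worse;
# well below it, each increase in distance suppresses errors sharply.
print(logical_error_rate(0.02, d=5))   # p > p_th: 0.8, worse than the physical rate
print(logical_error_rate(0.001, d=5))  # p << p_th: 1e-4, strongly suppressed
```

The crossover behaviour is the point: below threshold, spending more qubits on correction pays off exponentially; above it, it does not.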
Hence, for a long time, the focus was not so much on
increasing the number of qubits, but on getting to better qubits.
Dr Fitzsimons used the analogy of a faulty pen. “If the pen
is running out of ink halfway through characters, the nib just isn’t working
properly, some of your writing is only coming out as scratches on the paper,
instead of ink marks, then that’s not really a useful pen. You don’t need a larger
notebook. You need a better pen,” he said.
Sometime around 2012, the precision with which people could
manipulate qubits improved and the levels of noise in them decreased to a point
comparable to the threshold level. In view of this development, there was
increased focus on development of larger systems.
Consequently, during the last 18 months, there has been
significant growth in the number of qubits people are putting into their devices.
Dr Fitzsimons also highlighted that different technologies
are being pursued in the area.
“Most of the growth recently has been in superconducting
qubits. Ion traps are a more mature technology. But they hit a scaling barrier
at around 10 qubits or 15 qubits. It becomes harder to control them and you
need to change the way you build the device. So, they are trying to overcome
this barrier and move to larger and larger systems. But they have really good
control of their qubits.”
“With superconducting qubits, the control has improved
dramatically. At this point, it seems they have a clear route to scale up to
maybe hundreds or thousands of qubits, maybe not millions, before they hit a
barrier. But we are in a regime now where there might well be a lot of interesting
things we can do in the range before we hit the next barrier to be overcome.”
Where quantum computing can provide an advantage
If a number of quantum computers are networked together so
that they pass quantum information between each other and are sharing quantum
states, then they can solve certain distributed computing problems with less
communication than any classical approach would need.
Quantum computing also offers advantages in terms of
security. One application is Quantum Key Distribution (QKD) which utilises
quantum entanglement to produce a shared secret key which can then be used to
encrypt and decrypt messages, ensuring that they can be deciphered only by
authorised individuals or entities. This is a mature technology and commercial
QKD systems are already available.
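The core of QKD, basis sifting, can be sketched classically. The toy below follows the prepare-and-measure BB84 scheme rather than the entanglement-based variant described above, and it ignores noise and eavesdropping; it only illustrates how a shared secret key emerges from random bases.

```python
import random

random.seed(0)

# Toy BB84-style sketch (classical simulation, no noise or eavesdropper).
n = 64
alice_bits = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]  # '+' rectilinear, 'x' diagonal
bob_bases = [random.choice("+x") for _ in range(n)]

# If Bob measures in the same basis Alice used, he recovers her bit;
# otherwise his result is random and that position is later discarded.
bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# "Sifting": keep only the positions where the bases happened to match.
key_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
key_bob = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
print(len(key_alice), "shared key bits from", n, "transmissions")
```

On average half the positions survive sifting; in a real protocol, a further subset would be sacrificed to check for eavesdropping before the rest is used as a key.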
But security applications aren’t limited to QKD. Part of Dr
Fitzsimons’s research focuses on secure computing.
“If you are accessing a remote computer and you are running
code on it, if the remote computer’s a quantum computer you can keep your
computation completely hidden from it and you can check the results,” he said.
To do this, a device is required which can produce single
quantum states and send them. Dr Fitzsimons said that if you take a laser
pointer, place a bin-liner in front of it so that very little of the light
passes through, and then add a polarising filter from 3D movie-theatre glasses,
that is almost enough to serve as the device.
On the server side, random states are received at the start
of the computation to be used as an input.
“Quantum states have this interesting property that it’s not
possible to distinguish between certain kinds of states. If you are producing
these random states and sending them to the server, the server cannot really
tell what states it has received.”
Once these states are incorporated into the computation,
there’s a kind of back and forth process, where the server performs some
operations, takes a measurement and returns the result back to the user and the
user says what to do next.
From the point of view of the server, the instructions were
entirely random, but it is still able to process them because the random number
being communicated classically cancels out with part of the randomness in the
initial state. But because the server does not know what the initial state was
it cannot see how this cancellation is happening.
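The cancellation trick can be sketched with plain arithmetic. This is an assumed simplification of the angle-hiding idea in measurement-based blind quantum computing: the client wants a measurement at a secret angle phi, but the qubit it sent was pre-rotated by a random offset theta, so the instruction delta it announces looks uniformly random to the server while the offsets cancel in the effective measurement.

```python
import math
import random

random.seed(42)
TWO_PI = 2 * math.pi

phi = 0.7                            # the angle the client actually wants (secret)
theta = random.uniform(0, TWO_PI)    # randomness baked into the initial state
r = random.randint(0, 1)             # random bit masking the measurement outcome

# The client announces delta; to the server it is just a uniformly random angle.
delta = (phi + theta + r * math.pi) % TWO_PI

# The server measures at delta, but the qubit it holds was pre-rotated by
# -theta, so the effective angle is delta - theta = phi + r*pi (mod 2*pi):
effective = (delta - theta) % TWO_PI
print(effective, (phi + r * math.pi) % TWO_PI)  # the two values agree
```

Because theta never leaves the client, the server cannot separate delta into its secret and random parts, which is the sense in which the computation stays hidden.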
In addition to the above, a quantum computer, if we can
build one, would be much more efficient than a classical computer for solving
specific problems, such as breaking certain codes, or simulating chemistry.
However, a system with around 50-60 potentially noisy qubits (IBM announced
a 50 qubit computer last year and recently, Google released
a 72 qubit computer) has nowhere near enough memory to do most tasks.
So, what kinds of problems can be solved with quantum computers
available today or which might be available in the near future?
The problems that can be best tackled currently are the ones
that map most directly to the types of operations that the hardware implements.
“You are trying to get the most you can out of the couple of
qubits you have. So, you don’t want to have a big overhead from the encoding,”
Dr Fitzsimons said.
For instance, the dynamics of the quantum computer look a
bit like what’s being experienced by a molecule. So, it can be used for
chemistry-related problems. Optimisation problems are another area where we
might see implementation.
“But if you have a 50 qubit processor, then the kinds of
optimisation problems you might care about are going to have to only be
optimised over 50 bits. We will need larger processors before we can start to
encode more general problems,” he cautioned.
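The "one qubit per decision bit" constraint is easy to see with a toy optimisation problem. The MaxCut instance below is a hypothetical example: encoding it on quantum hardware would need roughly one qubit per vertex, so a 50-qubit processor caps you at around 50 binary variables, while classical brute force over 2**50 assignments is already out of reach.

```python
from itertools import product

# Hypothetical MaxCut instance: partition the vertices into two sets so that
# as many edges as possible cross the partition. One binary variable (and,
# on quantum hardware, roughly one qubit) per vertex.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4  # number of vertices

def cut_size(assignment):
    # Count the edges whose endpoints fall on opposite sides of the partition.
    return sum(assignment[u] != assignment[v] for u, v in edges)

best = max(product([0, 1], repeat=n), key=cut_size)
print(best, cut_size(best))  # brute force is fine at n=4; at n=50 it is not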
If the noise level is sufficiently low, there are potential
machine learning and linear algebra applications as well.
But machine learning applications start to become interesting
only once we have extra memory that can be accessed. For most machine learning
tasks, the system needs to train off a dataset and that data has to be stored
in memory at some point.
For most quantum algorithms to work, the quantum system
would have to be able to read from that memory. However, today we do not have
quantum memories like that.
Some machine learning applications may still be possible,
where the system is not learning from data but trying to learn how to do things
like approximate a function. However, those kinds of applications are also
highly likely to be intolerant to noise; error correction will be required
before they can be implemented.
Quantum computing today ≈ Digital computing in 1950
Dr Fitzsimons compared the current state of quantum
computing to the state of digital computing around 1950.
He explained, “We had the first devices which could do
things which a roomful of people couldn’t do. The equivalent to that is we are
starting to have devices that can probably do things that even a classical
supercomputer cannot do. Although that hasn’t yet been demonstrated, it should
be demonstrated probably within the next 12 months. So, they will outperform
classical computers for certain tasks. But we don’t yet know how many of those tasks
are going to be interesting from the point of view of real-world applications.”
An example would be simulations of random quantum circuits.
Like electrical circuits, there are quantum circuits that describe logic
operations on quantum computers. Researchers can make a random chain of those
and try to predict what the outcome should be. Quantum computers offer an
advantage, because there is very little structure in the problem and it’s hard
to come up with any algorithm that can solve it better than simulating it on a
classical computer.
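A back-of-the-envelope calculation shows why around 50 qubits marks the frontier for brute-force classical simulation: a full statevector holds 2**n complex amplitudes, so the memory needed doubles with every added qubit.

```python
# Memory needed to store the full quantum state of n qubits classically:
# 2**n complex amplitudes at 16 bytes each (double-precision real + imaginary).
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    return (2 ** n_qubits) * bytes_per_amplitude

print(statevector_bytes(30) / 2**30, "GiB")  # 16 GiB: fits on a workstation
print(statevector_bytes(50) / 2**50, "PiB")  # 16 PiB: far beyond any single machine
```

This is only the memory for the naive statevector approach; cleverer simulation methods trade memory for time, but the exponential wall remains, which is what makes ~50-qubit random-circuit sampling a plausible demonstration task.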
Some of the niche applications might become real soon. But
it is difficult to predict when quantum computing will be able to move on to
more mainstream applications or start solving business problems. It will depend
on improvements in the levels of noise and the number and quality of qubits.
Entanglement occurs when pairs or groups of particles are generated or interact in ways such
that the quantum state of each particle (such as the polarisation of a photon)
cannot be described independently of the others, even when the particles are
separated by a large distance. Superposition states that any two (or more)
quantum states can be added together and the result will be another valid
quantum state.