The Weird World of Quantum Computing: Part 2

In the second article in this series on quantum computing, we examine the practical challenges of building quantum computers and their implications for actuaries.


How does quantum computing speed up calculations?

In computing, calculations – irrespective of the hardware on which they are made – can be classified in terms of difficulty by the number of steps (or, equivalently, the time) required to complete them.

Suppose we have a calculation with input length n. If it can be completed in at most knᵖ steps for every value of n (for some fixed constants k and p), then it is said to be polynomial (complexity class P) and is regarded as tractable by classical computers. Calculations whose step count grows faster than any polynomial – for example, exponentially, like 2ⁿ – are regarded as intractable.
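
To get a feel for the difference, the short Python sketch below (purely illustrative; the constants k = 1 and p = 2 are arbitrary choices) compares a polynomial step count with an exponential one as the input length n grows.

```python
# Illustrative comparison of polynomial vs exponential step counts.
# k = 1 and p = 2 are arbitrary example constants.
for n in (10, 20, 50, 100):
    polynomial_steps = n ** 2       # k * n^p with k = 1, p = 2
    exponential_steps = 2 ** n      # grows explosively with n
    print(f"n = {n:3d}: n^2 = {polynomial_steps:,d}   2^n = {exponential_steps:,d}")
```

At n = 100 the polynomial count is only 10,000 steps, while the exponential count is roughly 10³⁰ – far beyond the reach of any classical computer.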

If we have n qubits all in the down state

 |ψₙ⟩ = |↓, ↓, …, ↓⟩

and apply the Hadamard gate (introduced in article 1) to each, then we create a new state that is a superposition of 2ⁿ combinations – i.e., a linear number of operations (one gate per qubit) produces a state with an exponential number of components.
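
As a rough illustration, the sketch below builds the state vector for n qubits that all start in the down state and applies a Hadamard gate to each (combining the qubits via a Kronecker product). Mapping |↓⟩ to the vector (1, 0) is an illustrative convention choice; the result is a vector of 2ⁿ equal amplitudes, one for every combination of up and down.

```python
import numpy as np

# Hadamard gate and the single-qubit "down" state |↓> = (1, 0)
# (an illustrative basis choice; conventions vary).
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
down = np.array([1.0, 0.0])

n = 3                                 # number of qubits
state = np.array([1.0])
for _ in range(n):
    state = np.kron(state, H @ down)  # apply H to each qubit and combine

print(len(state))                     # 2**n = 8 components
print(state)                          # every amplitude equals 1/sqrt(2**n)
```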

If we apply this concept to a calculation, each component of the superposition can represent a candidate solution to our problem – i.e., quantum superposition (and entanglement) enables quantum parallel processing. Of course, if we measure the state, we obtain just a single result, each outcome occurring with a certain probability, and the other components of the superposition are lost.
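
The measurement step can be illustrated with the same uniform superposition as above: the probability of each outcome is the squared amplitude (the Born rule), and only one of the 2ⁿ possible results is ever observed.

```python
import numpy as np

n = 3
amplitudes = np.full(2 ** n, 1 / np.sqrt(2 ** n))  # uniform superposition over 2^n states
probabilities = np.abs(amplitudes) ** 2            # Born rule: probability = |amplitude|^2

rng = np.random.default_rng()
outcome = rng.choice(2 ** n, p=probabilities)      # measurement returns a single result
print(f"Measured basis state: {int(outcome):0{n}b}")  # e.g. 101 – the other 7 are lost
```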

Actuarial calculations most likely to benefit from quantum parallelism are those that are computationally intensive and involve large amounts of data. Examples include modelling the effects of climate change on insurance premiums at an individual household level, modelling optimal investment portfolios based on years of historical return data, and artificial intelligence.

Cybersecurity

Also of interest to actuaries are the implications of quantum computing on cybersecurity. Currently, secure communications across the public internet, for instance, between the banking app on your phone and the bank, rely on keys which are the product of two large prime numbers. Breaking the key requires factoring it into the two prime numbers, but this is a hard problem for classical computers to solve – which, of course, is why semiprime keys are used for encryption.
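
To get a sense of why factoring is hard for a classical computer, the toy Python sketch below factors a small semiprime by trial division. The work grows roughly with the square root of the number being factored, so a real 2048-bit key (over 600 decimal digits) is far beyond reach of this brute-force approach.

```python
def factor_semiprime(n):
    """Classical trial division: try every candidate divisor up to sqrt(n).

    Fine for toy numbers; hopeless for the 600+ digit semiprimes used in
    real keys, where the search space is astronomically large.
    """
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None                      # n is prime (not a semiprime)

# Toy examples – real keys are thousands of bits long.
print(factor_semiprime(3 * 5))       # (3, 5)
print(factor_semiprime(101 * 113))   # (101, 113)
```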

However, in 1994, Peter Shor developed a quantum algorithm that performs the factorisation exponentially faster than the best-known classical methods.

If we could build a quantum computer with sufficient qubits, then we could break current keys in seconds, which is why national security agencies and financial services institutions are so interested in quantum computing.

We need to develop new encryption methods and other security measures – such as not exposing transactions on the public internet – before quantum computing becomes a practical reality.

Approaches and challenges to building a working quantum computer

Up to this point, everything we have discussed has been theory developed by mathematicians and theoretical physicists, such as Richard Feynman, who first proposed the idea of quantum computing.

Incidentally, Feynman – co-winner of the Nobel Prize for his work on quantum electrodynamics – was a member of the commission that investigated the Challenger Disaster, and his character appeared playing his trademark bongos in the recent movie Oppenheimer!

But how do we go about building a working quantum computer? The field is still very experimental, and a wide range of approaches are currently being explored. The two physical properties that are exploited – spin and polarisation – were discussed in article 1.

The best-known approaches in Australia are quantum electronics, pioneered by Professor Michelle Simmons and her group at UNSW, and that of the US photonics company PsiQuantum. Recently, the Commonwealth and Queensland governments invested A$940 million in PsiQuantum to build a quantum computer in Brisbane.

Photonic quantum computers are based on the polarisation of light (photons). Another approach uses the spin of trapped ions: atoms that are missing an electron or carry an extra one, and so can be confined using electromagnetic fields and manipulated using lasers.

Unfortunately, qubits don’t just interact with each other; they also interact with the wider environment – a process known in quantum computing as decoherence – which introduces errors into calculations.

Decoherence increases with temperature and time, both of which pose substantial challenges. Most current quantum computers operate at a temperature close to absolute zero (−273°C) to reduce thermal noise and maintain the coherence of the qubits. The practical implication is that, whilst qubits themselves may be small, the equipment currently needed to set up and manipulate them is large and expensive.

Another implication of decoherence is that error correction will be important for quantum computers. The no-cloning theorem discussed in our first article means that many classical error-correction techniques cannot be used, but, fortunately, reversibility and entanglement can.
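
As a toy illustration of the idea, the sketch below simulates the simplest quantum error-correcting scheme, the three-qubit bit-flip (repetition) code: the logical qubit is entangled with two others, a single bit-flip error is introduced, and parity checks reveal which qubit flipped without reading out the encoded amplitudes. The representation (a dictionary of bit-strings to amplitudes) and the error model are deliberate simplifications; a real device measures the syndrome with ancilla qubits.

```python
# Toy simulation of the three-qubit bit-flip (repetition) code.
# States are {bitstring: amplitude} dictionaries – a simplification that is
# enough to show the encode / error / detect / correct cycle.

def encode(alpha, beta):
    # Logical a|0> + b|1> becomes a|000> + b|111> (two CNOTs entangle the qubits).
    return {"000": alpha, "111": beta}

def flip(state, qubit):
    # An unwanted bit-flip (X) error on one qubit – our stand-in for decoherence.
    new_state = {}
    for bits, amp in state.items():
        b = list(bits)
        b[qubit] = "1" if b[qubit] == "0" else "0"
        new_state["".join(b)] = amp
    return new_state

def syndrome(state):
    # Parity checks on qubit pairs (0,1) and (1,2). Both branches of the
    # superposition share the same parities, so checking one branch suffices here.
    bits = next(iter(state))
    return (int(bits[0]) ^ int(bits[1]), int(bits[1]) ^ int(bits[2]))

def correct(state, s):
    # Map the syndrome to the qubit that flipped (if any) and undo the error.
    flipped_qubit = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[s]
    return state if flipped_qubit is None else flip(state, flipped_qubit)

noisy = flip(encode(0.6, 0.8), qubit=1)    # error on the middle qubit
print(correct(noisy, syndrome(noisy)))     # {'000': 0.6, '111': 0.8} – logical state recovered
```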

Currently, the most advanced quantum computers have tens of qubits and are large and expensive. To be of practical use, they would need many times this number – perhaps millions of qubits.

There is a lot of investment going into research and development of quantum computers, but it is difficult to predict when a key breakthrough may occur. However, it will likely be a while before we have quPhones in our pockets.

In the meantime, as well as seeking more advanced encryption methods, financial institutions, government bodies and other entities that hold sensitive customer data will need to continue to improve cybersecurity to restrict unauthorised access.

CPD: Actuaries Institute Members can claim two CPD points for every hour of reading articles on Actuaries Digital.