The arithmetic logic units (ALUs), where the work takes place in today’s computers, are limited to doing one thing at a time, so the more complex the problem, the longer it takes. A calculation so difficult that it demands more power and time than a computer can practically provide is called an ‘intractable’ problem. This is the kind of challenge quantum computers are expected to solve.
Existing digital computers encode data as bits, each of which can be stored in one of two states: 0 or 1. Quantum computers use quantum bits, known as ‘qubits’, which can be both 0 and 1 at the same time – and everything in between. Confused? Don’t worry: many quantum physicists find it troubling too, but they accept it all the same. This counter-intuitive fact, together with the ability of qubits to share quantum information through ‘entanglement’, allows quantum computers to perform many calculations simultaneously.
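The idea that a qubit holds 0 and 1 simultaneously can be made concrete in a few lines of code. The sketch below (illustrative only, using plain Python rather than any real quantum framework) represents a qubit as a pair of complex amplitudes and shows how a Hadamard gate turns a definite 0 into an equal superposition:

```python
import math

# A qubit's state is a pair of complex amplitudes (a, b) for the states
# |0> and |1>, with |a|^2 + |b|^2 = 1. On measurement, the qubit gives
# 0 with probability |a|^2 and 1 with probability |b|^2.
def probabilities(state):
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

# The Hadamard gate, a basic quantum-logic operation, maps a definite
# |0> into an equal mix of |0> and |1>.
def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1 + 0j, 0 + 0j)           # classical-like state: definitely 0
superposed = hadamard(zero)       # now "both 0 and 1 at the same time"
print(probabilities(superposed))  # roughly (0.5, 0.5)
```

The two equal probabilities are the simplest case of the continuum the article mentions: other amplitude pairs give any weighting between certainly-0 and certainly-1.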
The core elements necessary for quantum computers include: a source of qubits, a memory for storing the qubits, quantum logic circuits for executing operations on them, and a means to read the computer’s outputs. The major challenge is maintaining the quality of the inherently delicate qubits for long enough to perform quantum computations on them. Qubits quickly lose their special quantum properties – within 100 microseconds for superconducting qubits – due in part to ambient vibrations, temperature fluctuations and electromagnetic waves. Isolation from these disturbances is essential.
A popular choice has been to encode qubits in micron-sized superconducting loops that exhibit quantum mechanical effects – the so-called superconducting qubits. These show the required ability to store quantum states for a sufficient time and to perform quantum-logic operations. Indeed, D-Wave Systems, a quantum technology company based in Canada, has built and sold a number of computing machines based on this idea. Its new Pegasus chip, released in February 2019, has more than 5,000 qubits. While not a truly general-purpose quantum computer – it doesn’t use quantum logic gates – the D-Wave machine is designed to exploit quantum effects to solve optimisation problems with a speed and efficiency that opens up new possibilities.
IBM’s Q System One machine, announced in January this year, is a 20-qubit model and is described by the company as the world’s first computer using quantum logic gates. Access is only available via the IBM Cloud because of the computer’s special operational requirements and the company predicts that this will remain the dominant delivery mechanism for the foreseeable future. Customers include ExxonMobil and Cern, which intends to apply quantum machine learning techniques to analyse data from the Large Hadron Collider.
Apart from superconducting loops, another promising idea for qubit arrays is the use of cold, trapped ions or neutral atoms, which offer advantages in terms of qubit stability. IonQ, a US university spinout, aims to encode qubits in single ions held in vacuum traps by electric and magnetic fields, while ColdQuanta, based in the US and the UK, claims to be leading the field in cold-atom quantum technology and has its sights set on neutral-atom qubits.
Microsoft is taking a different approach. It proposes encoding qubits using exotic states of matter called ‘anyons’. These qubits would, in theory, be more resistant to external noise and so make error correction easier. Although anyon qubits have yet to be created, Microsoft is investing heavily in expertise based at Delft University of Technology in the Netherlands to research the possibilities.
Exotic particles aside, Professor Robert Young, co-founder of Quantum Base, a Lancaster University spinout specialising in quantum security technology, predicts: “A truly disruptive qubit technology is likely to be based on silicon”.
Silicon is an attractive platform for qubits because of its compatibility with existing microelectronics circuits and chip manufacturing techniques. As with all qubit proposals, the problem is how the delicate quantum states can be kept alive for long enough to perform quantum operations. In Australia, researchers at the Centre for Quantum Computation and Communication Technology recently produced single-atom qubits embedded in a silicon lattice and controlled using electric fields. These have proven to be relatively robust to electrical noise, allowing them to hold their essential quantum states. By fabricating the chip with a dedicated control layer placed on top of a lower layer of qubits, the 3D architecture aligns its qubits to error-correction control lines – essentially very narrow wires – that constantly correct for errors in many qubits in parallel. Another advantage of this design is that the qubits are very small: literally atom-sized. This could allow many qubits to be placed on a small chip, potentially up to two hundred times more than is possible with micron-sized superconducting qubits. Even those in the field, however, admit that silicon-based qubit technologies are playing catch-up with their superconducting qubit competitors.
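The idea of correcting many qubits in parallel has a loose classical analogy in the repetition code. Real quantum error correction is far subtler – it must detect errors without directly reading out the encoded state – but this illustrative sketch shows the basic principle of redundancy plus a parallel correction step:

```python
# Classical analogy only: protect a logical bit by storing three copies,
# then repair any single flipped copy by majority vote. Quantum codes
# achieve something analogous with parity checks run continuously and
# in parallel across many physical qubits.
def encode(bit):
    return [bit, bit, bit]

def correct(copies):
    # Majority vote: tolerates one error among the three copies.
    return 1 if sum(copies) >= 2 else 0

codeword = encode(1)
codeword[0] ^= 1          # noise flips one copy
print(correct(codeword))  # recovers 1
```

The “control lines” described above play a role akin to the voting step here, except that they act on fragile quantum states rather than copied bits.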
Quantum computing is now a reality. Many have suggested the biggest remaining challenge is now not the science of quantum computing, but the engineering.
(Fearnside, A. (2019). ‘Quantum Computing’, Cambridge Wireless Journal, Vol. 2, Issue 4, pp. 26–29.)