
In the foreseeable future, some standard encryption methods could become obsolete thanks to a brand-new technology. **Quantum computing** takes place on the atomic and sub-atomic scale and is still at the experimental stage. It aims to take advantage of some frankly mind-blowing properties of the particles that form the building blocks of matter and light, and it works in a completely different way from the classical, electronics-based computing with which we are all familiar.

A bit in a classical computer is set to either 0 or 1 at any given moment in time. A classical computer program might be written to add two whole numbers, each of which has to be between zero and fifteen. Storing a number between zero and fifteen requires four bits (two to the power of four is sixteen), so storing both numbers would require eight bits. The values of each of the eight bits at the point in time when the program carried out the addition would determine the result.
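The arithmetic above can be sketched in a few lines of Python; the two sample numbers and the variable names are purely illustrative:

```python
# Two whole numbers between zero and fifteen, each needing four bits.
a, b = 9, 6
assert 0 <= a <= 15 and 0 <= b <= 15

# Pack both numbers into the eight bits the example describes: aaaa bbbb.
bits = (a << 4) | b
print(f"{bits:08b}")            # -> 10010110

# The values of those eight bits determine the result of the addition.
result = (bits >> 4) + (bits & 0b1111)
print(result)                   # -> 15
```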

Contrary to everything common sense tells us, a bit in a quantum computer, which is called a **qubit**, can have both values – 0 and 1 – simultaneously. A quantum program might take eight qubits as its input. Because each qubit has two values at once, eight qubits together have 256 concurrent values (two to the power of eight). The quantum program would effectively perform 256 calculations at the same time.
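A rough classical sketch of what "eight qubits, 256 concurrent values" means: a simulator represents the machine's state as a list of 256 amplitudes, one for every possible eight-bit value at once. This is an illustration run on a classical computer, not how a real quantum device is programmed:

```python
import math

n_qubits = 8
n_states = 2 ** n_qubits        # 256 basis states

# Putting every qubit into an equal mix of 0 and 1 gives all 256
# eight-bit values the same amplitude at the same time.
amplitude = 1 / math.sqrt(n_states)
state = [amplitude] * n_states

# The chance of observing any one particular value is the amplitude squared.
print(n_states)                          # -> 256
print(abs(state[0]) ** 2 * n_states)     # -> 1.0 (probabilities sum to one)
```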

Unfortunately, the fact that each qubit in a quantum computation has both possible values at once does not mean that the results of a huge number of mathematical calculations can all be obtained using a single quantum computation. Rather than saying that each qubit has *both* values, it would perhaps be more accurate to say that each qubit has *either* value with a given probability. As long as a computation is taking place, each qubit really is set to 0 and to 1 at the same time. However, as soon as each of the qubits that makes up the result of the computation is read, the act of observing the qubit makes it stick in one of these two values. Retrieving the result of a quantum computation yields the result of only one of the many calculations that were performed. None of the other results remains accessible.
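The collapse on observation can also be mimicked classically: sampling once from the squared amplitudes yields a single eight-bit answer and discards every other branch. Again, this is just an illustrative simulation:

```python
import math
import random

n_states = 2 ** 8
state = [1 / math.sqrt(n_states)] * n_states    # uniform superposition

# Reading the qubits selects exactly one basis state, with probability
# equal to its squared amplitude; all other outcomes become inaccessible.
probs = [abs(a) ** 2 for a in state]
outcome = random.choices(range(n_states), weights=probs)[0]
print(f"measured {outcome:08b}")    # one eight-bit value, chosen at random
```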

You may well ask what the use of the answer to a mathematical calculation is if there is no way of choosing the question. The crucial point is that some of the operations used in quantum computing can skew the probability with which each qubit has one or the other value. Such operations can be cleverly combined to make it likely that the result retrieved from a computation will be the answer to a specific question chosen by the programmer.
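One concrete example of such probability-skewing is the amplitude-amplification step at the heart of Grover's search algorithm. The classical simulation below (the marked value 42 is an arbitrary stand-in for "the answer the programmer wants") repeatedly flips the sign of that answer's amplitude and reflects all amplitudes about their mean, making it overwhelmingly likely to be the value read out:

```python
import math

n = 2 ** 8                 # 256 possible answers
marked = 42                # the answer we want the computation to favour

state = [1 / math.sqrt(n)] * n   # start with every answer equally likely

# Roughly (pi/4) * sqrt(n) rounds of: flip the marked amplitude's sign,
# then reflect every amplitude about the average.
for _ in range(int(math.pi / 4 * math.sqrt(n))):
    state[marked] = -state[marked]
    mean = sum(state) / n
    state = [2 * mean - a for a in state]

print(round(abs(state[marked]) ** 2, 3))    # -> 1.0 (almost certain)
```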

It may be tempting to compare the eight-qubit quantum computer with a classical counterpart that adds, in parallel, each whole number between zero and fifteen to each member of a second, identical range of numbers. However, this analogy misrepresents the way quantum computing works.

In quantum computing, it is not just the storage of information that is revolutionary. The simplest building blocks that a classical computer uses when it runs a program are based on interactions between flows of electric current. A quantum computer, on the other hand, makes individual physical particles interact with one another in ways that are themselves unlike anything in our everyday experience. While it is certainly possible to write a quantum program to add two numbers, the steps that would be used to do so are completely different from the ones somebody programming a classical computer would have at their disposal.

In short, a quantum program is not just lots of classical programs operating in parallel. Because quantum computing and classical computing operate in totally dissimilar fashions, they tend to be good at different things. A quantum computer would not be an appropriate tool to solve the simple arithmetic at which classical computers excel, while the new mechanisms it offers can be exploited to achieve quick fixes for some mathematical problems that classical computers can only solve using brute-force methods. In many cases, these are the very mathematical problems on which today’s encryption standards are based.

It turns out that quantum computing would make cracking contemporary symmetric encryption methods easier, but that the advantage could be counterbalanced by doubling the number of bits used in each key to increase the number of possible values. For the asymmetric methods that use private/public keys, on the other hand, quantum computing would pose a much more serious problem. It would provide easy solutions to the mathematical problems on which all the current asymmetric standards are based. In the right circumstances, a quantum computer could allow its owner to find out other people’s private keys. The private/public encryption system would no longer serve its purpose.
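The arithmetic behind the key-doubling remedy can be made explicit. Quantum search (Grover's algorithm) finds one of N possibilities in roughly the square root of N steps, so a key of n bits effectively resists only about 2 to the power of n/2 quantum steps; doubling n restores the original strength. The bit sizes below are purely illustrative:

```python
def effective_strength_bits(key_bits: int) -> float:
    # A quantum brute-force search of 2**key_bits keys takes roughly
    # sqrt(2**key_bits) == 2**(key_bits / 2) steps.
    return key_bits / 2

print(effective_strength_bits(128))   # -> 64.0  (halved by quantum search)
print(effective_strength_bits(256))   # -> 128.0 (doubled key restores it)
```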

Although practical research has certainly confirmed the theory behind quantum computing, none of the experimental quantum computers built so far have been able to use more than a very small number of qubits, nor have they worked well enough to be able to solve any mathematical problems more rapidly than the fastest classical computers. Nonetheless, it is probably only a matter of time until the remaining engineering problems are satisfactorily solved and the technology becomes mature enough for practical use.

New methods of encryption and decryption will probably emerge that can only be carried out using quantum technology. For the time being, however, the race is on to develop and standardise **quantum-resistant** asymmetric encryption techniques. These will be performed on classical computers just like the methods that are in use today. At the same time, they will rely on mathematical problems that a quantum computer would not be able to solve in a trivial fashion, giving assurance that the messages they encode will not be open to analysis by quantum computers at some point in the future.

