iMessage is getting a major makeover that makes it one of the two messaging apps best prepared to withstand the coming advent of quantum computing, putting it largely at parity with Signal, or arguably incrementally ahead.
On Wednesday, Apple said messages sent through iMessage will now be protected by two forms of end-to-end encryption (E2EE), whereas before, it had only one. The encryption being added, known as PQ3, is an implementation of a new algorithm called Kyber that, unlike the algorithms iMessage has used until now, can’t be broken with quantum computing. Apple isn’t replacing the older quantum-vulnerable algorithm with PQ3—it’s augmenting it. That means, for the encryption to be broken, an attacker will have to crack both.
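The "crack both" property comes from how hybrid schemes derive their keys: the classical and post-quantum shared secrets are mixed together through a key-derivation function, so an attacker who recovers only one of them learns nothing about the final key. The sketch below is a simplified illustration of that idea using HKDF (RFC 5869), not Apple's actual PQ3 construction; the function names and the salt/info labels are hypothetical.

```python
import hashlib
import hmac
import os

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract (RFC 5869): condense input keying material into a pseudorandom key."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    """HKDF-Expand (RFC 5869): derive `length` bytes of output keying material."""
    okm, block = b"", b""
    for counter in range(1, -(-length // 32) + 1):
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
    return okm[:length]

def hybrid_session_key(classical_secret: bytes, pq_secret: bytes) -> bytes:
    """Mix both shared secrets; recovering the session key requires breaking both."""
    prk = hkdf_extract(b"hybrid-demo-salt", classical_secret + pq_secret)
    return hkdf_expand(prk, b"hypothetical-hybrid-session", 32)

# Stand-ins for the two independently negotiated shared secrets
classical = os.urandom(32)     # e.g., from an elliptic-curve key agreement
post_quantum = os.urandom(32)  # e.g., from a Kyber encapsulation
key = hybrid_session_key(classical, post_quantum)
```

Because both secrets feed the same derivation, flipping a single byte of either input produces an unrelated session key, which is the whole point of the belt-and-suspenders design.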
Making E2EE future-safe
The iMessage changes come five months after the Signal Foundation, maker of the Signal Protocol that encrypts messages sent by more than a billion people, updated the open standard so that it, too, uses post-quantum cryptography (PQC). Just like Apple, Signal added Kyber to X3DH, the algorithm it was using previously. Together, they’re known as PQXDH.
iMessage and Signal provide end-to-end encryption, a protection that makes it impossible for anyone other than the sender and recipient of a message to read it in decrypted form. iMessage began offering E2EE with its rollout in 2011. Signal became available in 2014.
One of the biggest looming threats to many forms of encryption is quantum computing. The strength of the algorithms used in virtually all messaging apps relies on mathematical problems that are easy to solve in one direction and extremely hard to solve in the other. Unlike a traditional computer, a quantum computer with sufficient resources can solve these problems in considerably less time.
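That "easy one direction, hard the other" asymmetry can be made concrete with the discrete logarithm, one of the problems underpinning classical key exchange. The toy below uses deliberately tiny numbers so the hard direction is still feasible by brute force; real deployments use numbers hundreds of digits long, and it's that brute-force search that a sufficiently large quantum computer would shortcut.

```python
# Easy one way, hard the other: modular exponentiation vs. the discrete log.
# Toy parameters only; real systems use far larger numbers.
p, g = 2_147_483_647, 7     # a small prime modulus and base, for illustration
secret = 1_234_567          # the private exponent
public = pow(g, secret, p)  # forward direction: effectively instant at any size

def brute_force_dlog(target: int, g: int, p: int) -> int:
    """Invert pow(g, x, p) by exhaustive search -- feasible only at toy sizes."""
    value = 1
    for x in range(p):
        if value == target:
            return x
        value = value * g % p
    raise ValueError("no solution")

x = brute_force_dlog(public, g, p)
assert pow(g, x, p) == public  # the recovered exponent reproduces the public value
```

Scaling the modulus up makes the forward direction barely slower while making the search astronomically harder, which is exactly the gap quantum algorithms like Shor's threaten to close.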
No one knows how soon that day will come. One common estimate is that a quantum computer with 20 million qubits (the basic unit of quantum information) will be able to crack a single 2,048-bit RSA key in about eight hours. The biggest known quantum computer to date has 433 qubits.
Cryptography engineers know that future is inevitable, whenever it arrives. They also know that some adversaries will likely collect and stockpile as much encrypted data as possible now and decrypt it once quantum advances allow. The moves by both Apple and Signal aim to defend against that eventuality using Kyber, one of several PQC algorithms currently endorsed by the National Institute of Standards and Technology. Since Kyber is still relatively new, both iMessage and Signal will continue using the more thoroughly tested algorithms alongside it for the time being.
There’s a general consensus that performing any sort of complex algorithm on quantum hardware will have to wait for the arrival of error-corrected qubits. Individual qubits are too error-prone to be trusted for complex calculations, so quantum information will need to be distributed across multiple qubits, allowing monitoring for errors and intervention when they occur.
But most ways of making these “logical qubits” needed for error correction require anywhere from dozens to over a hundred individual hardware qubits. This means we’ll need anywhere from tens of thousands to millions of hardware qubits to do calculations. Existing hardware has only cleared the 1,000-qubit mark within the last month, so that future appears to be several years off at best.
But on Thursday, a company called Nord Quantique announced that it had demonstrated error correction using a single qubit with a distinct hardware design. While this has the potential to greatly reduce the number of hardware qubits needed for useful error correction, the demonstration involved a single qubit—the company doesn’t even expect to demonstrate operations on pairs of qubits until later this year.
Meet the bosonic qubit
The devices underlying this work are termed bosonic qubits, and they’re nothing new; an optical instrument company even has a product listing for them that notes their potential for use in error correction. But while the concepts behind using them in this manner were well established, demonstrations were lagging. Nord Quantique has now posted a paper on the arXiv that details a demonstration of them actually lowering error rates.
The devices are structured much like a transmon, the form of qubit favored by tech heavyweights like IBM and Google. There, the quantum information is stored in a loop of superconducting wire and is controlled by what’s called a microwave resonator—a small bit of material where microwave photons will reflect back and forth for a while before being lost.
A bosonic qubit turns that situation on its head. In this hardware, the quantum information is held in the photons, while the superconducting wire and resonator control the system. These are both hooked up to a coaxial cavity (think of a structure that, while microscopic, looks a bit like the end of a cable connector).
Massively simplified, the quantum information is stored in the manner in which the photons in the cavity interact. The state of the photons can be monitored by the linked resonator/superconducting wire. If something appears to be off, the resonator/superconducting wire allows interventions to be made to restore the original state. Additional qubits are not needed. “A very simple and basic idea behind quantum error correction is redundancy,” co-founder and CTO Julien Camirand Lemyre told Ars. “One thing about resonators and oscillators in superconducting circuits is that you can put a lot of photons inside the resonators. And for us, the redundancy comes from there.”
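The redundancy idea has a simple classical analogy: a repetition code, which stores one logical bit as many physical copies and corrects errors by majority vote. The sketch below is that classical analogy only, not Nord Quantique's scheme (where the redundant copies are, loosely speaking, the many photons sharing one resonator), and real quantum codes must also handle phase errors, which no classical code sees.

```python
import random

def encode(bit: int, copies: int = 9) -> list[int]:
    """Redundancy: store one logical bit as many physical copies."""
    return [bit] * copies

def noisy(codeword: list[int], flip_prob: float = 0.1) -> list[int]:
    """Each physical copy independently suffers a bit flip with some probability."""
    return [b ^ (random.random() < flip_prob) for b in codeword]

def decode(codeword: list[int]) -> int:
    """Majority vote: correct as long as fewer than half the copies flipped."""
    return int(sum(codeword) > len(codeword) / 2)

random.seed(0)
trials = 10_000
raw_errors = sum(noisy([1])[0] != 1 for _ in range(trials))
corrected_errors = sum(decode(noisy(encode(1))) != 1 for _ in range(trials))
print(raw_errors / trials, corrected_errors / trials)
```

Running this shows the corrected error rate falling well below the raw per-copy rate: with nine copies, the logical bit is lost only when five or more of them flip at once, which is far rarer than a single flip.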
This process doesn’t correct all possible errors, so it doesn’t eliminate the need for logical qubits made from multiple underlying hardware qubits. In theory, though, you can catch the two most common forms of errors that qubits are prone to (bit flips and changes in phase).
In the arXiv preprint, the team at Nord Quantique demonstrated that the system works. Using a single qubit and simply measuring whether it holds onto its original state, the error correction system can reduce problems by 14 percent. Unfortunately, overall fidelity is still low, starting at about 85 percent, which is significantly below what’s seen in other systems that have been through years of development work. Some qubits have been demonstrated with a fidelity of over 99 percent.
Getting competitive
So there’s no question that Nord Quantique is well behind a number of the leaders in quantum computing that can perform (error-prone) calculations with dozens of qubits and have far lower error rates. Again, Nord Quantique’s work was done using a single qubit—and without doing any of the operations needed to perform a calculation.
Lemyre told Ars that while the company is small, it benefits from being a spin-out of the Institut Quantique at Sherbrooke University, one of Canada’s leading quantum research centers. In addition to having access to the expertise there, Nord Quantique uses a fabrication facility at Sherbrooke to make its hardware.
Over the next year, the company expects to demonstrate that the error correction scheme can function while pairs of qubits are used to perform gate operations, the fundamental units of calculations. Another high priority is to combine this hardware-based error correction with more traditional logical qubit schemes, which would allow additional types of errors to be caught and corrected. This would involve operations with a dozen or more of these bosonic qubits at a time.
But the real challenge will be in the longer term. The company is counting on its hardware’s ability to handle error correction to reduce the number of qubits needed for useful calculations. But if its competitors can scale up the number of qubits fast enough while maintaining the control and error rates needed, that may not ultimately matter. Put differently, if Nord Quantique is still in the hundreds-of-qubits range by the time other companies are in the hundreds of thousands, its technology might not succeed even if it has some inherent advantages.
But that’s the fun part about the field as things stand: We don’t really know. A handful of very different technologies are already well into development and show some promise. And there are others still early in the development process that are thought to have a smoother path to scaling to useful numbers of qubits. All of them will have to scale to a minimum of tens of thousands of qubits while performing quantum manipulations that were cutting-edge science just a few decades ago.
Looming in the background is the simple fact that we’ve never tried to scale anything like this to the extent that will be needed. Unforeseen technical hurdles might limit progress at some point in the future.
Despite all this, there are people backing each of these technologies who know far more about quantum mechanics than I ever will. It’s a fun time.