
Quantum computing progress: Higher temps, better error correction

There’s a strong consensus that tackling most useful problems with a quantum computer will require hardware capable of error correction. There is absolutely no consensus, however, about which technology will get us there. A large number of companies, including major players like Microsoft, Intel, Amazon, and IBM, have each committed to a different approach, while a collection of startups is exploring an even wider range of potential solutions.

We probably won’t have a clearer picture of what’s likely to work for a few years. But there’s going to be lots of interesting research and development work between now and then, some of which may ultimately represent key milestones in the development of quantum computing. To give you a sense of that work, we’re going to look at three papers that were published within the last couple of weeks, each of which tackles a different aspect of quantum computing technology.

Hot stuff

Error correction will require connecting multiple hardware qubits to act as a single unit termed a logical qubit. This spreads a single bit of quantum information across multiple hardware qubits, making it more robust. Additional qubits are used to monitor the behavior of the ones holding the data and perform corrections as needed. Some error correction schemes require over a hundred hardware qubits for each logical qubit, meaning we’d need tens of thousands of hardware qubits before we could do anything practical.
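The core idea here—spreading one bit of information across several physical carriers so that a single failure can be outvoted—has a simple classical analog. The sketch below is my own illustration of that redundancy principle, not an actual quantum code (quantum states can't be copied outright, so real schemes are subtler), but the majority-vote logic is the same intuition:

```python
# Classical analogy for a logical qubit: one data bit is spread across
# three physical bits, and a majority vote corrects any single flip.

def encode(bit):
    """Spread one logical bit across three physical bits."""
    return [bit, bit, bit]

def correct(bits):
    """Majority vote: recover the logical value despite one error."""
    return 1 if sum(bits) >= 2 else 0

encoded = encode(1)
encoded[0] ^= 1                 # a noise event flips one physical bit
assert correct(encoded) == 1    # the logical value survives
```

Real quantum codes replace copying with entanglement and replace the direct vote with measurements of auxiliary "monitor" qubits, which is why the per-logical-qubit overhead climbs into the dozens or hundreds.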

A number of companies have looked at that problem and decided we already know how to create hardware on that scale—just look at any silicon chip. So, if we could etch useful qubits through the same processes we use to make current processors, then scaling wouldn’t be an issue. Typically, this has meant fabricating quantum dots on the surface of silicon chips and using these to store single electrons that can hold a qubit in their spin. The rest of the chip holds more traditional circuitry that performs the initiation, control, and readout of the qubit.

This creates a notable problem. Like many other qubit technologies, quantum dots need to be kept below one Kelvin in order to keep the environment from interfering with the qubit. And, as anyone who’s ever owned an x86-based laptop knows, all the other circuitry on the silicon generates heat. So, there’s the very real prospect that trying to control the qubits will raise the temperature to the point that the qubits can’t hold onto their state.

That might not be the problem we thought it was, according to some work published in Wednesday’s Nature. A large international team that includes researchers from the startup Diraq has shown that a silicon quantum dot processor can work well at the relatively toasty temperature of 1 Kelvin, up from the millikelvin range where these processors normally operate.

The work was done on a two-qubit prototype made with materials that were specifically chosen to improve noise tolerance; the experimental procedure was also optimized to limit errors. The team then performed normal operations starting at 0.1 K and gradually ramped up the temperature to 1.5 K, checking performance as they did so. They found that a major source of errors, state preparation and measurement (SPAM), didn’t change dramatically in this temperature range: “SPAM around 1 K is comparable to that at millikelvin temperatures and remains workable at least until 1.4 K.”

The error rates they did see depended on the state being prepared. One particular state (both spin-up) had a fidelity of over 99 percent, while the rest were less constrained, at somewhere above 95 percent. States had a lifetime of over a millisecond, which qualifies as long-lived in the quantum world.

All of which is pretty good, and suggests that the chips can tolerate reasonable operating temperatures, meaning on-chip control circuitry can be used without causing problems. The error rates of the hardware qubits are still well above those that would be needed for error correction to work. However, the researchers suggest that they’ve identified error processes that can potentially be compensated for. They expect that the ability to do industrial-scale manufacturing will ultimately lead to working hardware.

Alternate qubit design does error correction in hardware

We can fix that

Early-stage technology has the potential to cut qubits needed for useful computers.


There’s a general consensus that performing any sort of complex algorithm on quantum hardware will have to wait for the arrival of error-corrected qubits. Individual qubits are too error-prone to be trusted for complex calculations, so quantum information will need to be distributed across multiple qubits, allowing monitoring for errors and intervention when they occur.

But most ways of making these “logical qubits” needed for error correction require anywhere from dozens to over a hundred individual hardware qubits. This means we’ll need anywhere from tens of thousands to millions of hardware qubits to do calculations. Existing hardware has only cleared the 1,000-qubit mark within the last month, so that future appears to be several years off at best.

But on Thursday, a company called Nord Quantique announced that it had demonstrated error correction using a single qubit with a distinct hardware design. While this has the potential to greatly reduce the number of hardware qubits needed for useful error correction, the demonstration involved a single qubit—the company doesn’t even expect to demonstrate operations on pairs of qubits until later this year.

Meet the bosonic qubit

The technology underlying this work is termed a bosonic qubit, and it’s nothing new; an optical instrument company even has a product listing for bosonic qubits that notes their potential for use in error correction. But while the concepts behind using them this way were well established, demonstrations were lagging. Nord Quantique has now posted a paper on the arXiv detailing a demonstration of them actually lowering error rates.

The devices are structured much like a transmon, the form of qubit favored by tech heavyweights like IBM and Google. There, the quantum information is stored in a loop of superconducting wire and is controlled by what’s called a microwave resonator—a small bit of material where microwave photons will reflect back and forth for a while before being lost.

A bosonic qubit turns that situation on its head. In this hardware, the quantum information is held in the photons, while the superconducting wire and resonator control the system. These are both hooked up to a coaxial cavity (think of a structure that, while microscopic, looks a bit like the end of a cable connector).

Massively simplified, the quantum information is stored in the manner in which the photons in the cavity interact. The state of the photons can be monitored by the linked resonator/superconducting wire. If something appears to be off, the resonator/superconducting wire allows interventions to be made to restore the original state. Additional qubits are not needed. “A very simple and basic idea behind quantum error correction is redundancy,” co-founder and CTO Julien Camirand Lemyre told Ars. “One thing about resonators and oscillators in superconducting circuits is that you can put a lot of photons inside the resonators. And for us, the redundancy comes from there.”

This process doesn’t correct all possible errors, so it doesn’t eliminate the need for logical qubits made from multiple underlying hardware qubits. In theory, though, you can catch the two most common forms of errors that qubits are prone to (bit flips and changes in phase).
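The two error types mentioned above can be seen directly on a single-qubit state vector. This is a minimal NumPy sketch of my own (an illustration of the error types, not Nord Quantique’s scheme): a bit flip swaps the amplitudes of |0⟩ and |1⟩, while a phase flip negates the |1⟩ amplitude—an error that is invisible in the |0⟩/|1⟩ basis but flips the |+⟩ state to |−⟩:

```python
import numpy as np

# Single-qubit states as 2-component amplitude vectors over {|0>, |1>}.
plus = np.array([1.0, 1.0]) / np.sqrt(2)    # |+> = (|0> + |1>)/sqrt(2)
minus = np.array([1.0, -1.0]) / np.sqrt(2)  # |-> = (|0> - |1>)/sqrt(2)

X = np.array([[0.0, 1.0], [1.0, 0.0]])   # bit flip: swaps |0> and |1>
Z = np.array([[1.0, 0.0], [0.0, -1.0]])  # phase flip: negates |1>'s amplitude

# A bit flip leaves |+> unchanged, but a phase flip turns |+> into |->.
print(np.allclose(X @ plus, plus))   # True
print(np.allclose(Z @ plus, minus))  # True
```

Because the two error types act in different bases, a code that catches both (as Nord Quantique claims for its bosonic qubits) must monitor the state in more than one way, which is what usually drives up the physical-qubit count.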

In the arXiv preprint, the team at Nord Quantique demonstrated that the system works. Using a single qubit and simply measuring whether it holds onto its original state, the error correction system can reduce problems by 14 percent. Unfortunately, overall fidelity is also low, starting at about 85 percent, which is significantly below what’s seen in other systems that have been through years of development work. Some qubits have been demonstrated with a fidelity of over 99 percent.

Getting competitive

So there’s no question that Nord Quantique is well behind a number of the leaders in quantum computing that can perform (error-prone) calculations with dozens of qubits and have far lower error rates. Again, Nord Quantique’s work was done using a single qubit—and without doing any of the operations needed to perform a calculation.

Lemyre told Ars that while the company is small, it benefits from being a spin-out of the Institut Quantique at Sherbrooke University, one of Canada’s leading quantum research centers. In addition to having access to the expertise there, Nord Quantique uses a fabrication facility at Sherbrooke to make its hardware.

Over the next year, the company expects to demonstrate that the error correction scheme can function while pairs of qubits are used to perform gate operations, the fundamental units of calculations. Another high priority is to combine this hardware-based error correction with more traditional logical qubit schemes, which would allow additional types of errors to be caught and corrected. This would involve operations with a dozen or more of these bosonic qubits at a time.

But the real challenge will be in the longer term. The company is counting on its hardware’s ability to handle error correction to reduce the number of qubits needed for useful calculations. But if its competitors can scale up the number of qubits fast enough while maintaining the control and error rates needed, that may not ultimately matter. Put differently, if Nord Quantique is still in the hundreds of qubit range by the time other companies are in the hundreds of thousands, its technology might not succeed even if it has some inherent advantages.

But that’s the fun part about the field as things stand: We don’t really know. A handful of very different technologies are already well into development and show some promise. And there are others that are still early in the development process but are thought to have a smoother path to scaling to useful numbers of qubits. All of them will have to scale to a minimum of tens of thousands of qubits while enabling quantum manipulations that were cutting-edge science just a few decades ago.

Looming in the background is the simple fact that we’ve never tried to scale anything like this to the extent that will be needed. Unforeseen technical hurdles might limit progress at some point in the future.

Despite all this, there are people backing each of these technologies who know far more about quantum mechanics than I ever will. It’s a fun time.
