Big Problems in Quantum Computing: Encoding Quantum States

Over the last two decades, quantum computing has made headlines for several reasons. From promised speedups over classical computers to ever-larger quantum processors, there has been a continuous buzz around the field.

In truth, quantum computers run into several big roadblocks, the most commonly cited of which are hardware noise and decoherence. A much less discussed problem, however, lies in encoding data into a quantum computer's memory.

Quantum State Encoding

A quantum computer, by definition, uses the quantum mechanical properties of physical systems to represent states and perform computations. Its fundamental unit of information is the quantum bit, or qubit. Quantum algorithms operate on these qubits, and almost every algorithm expects its input data to be loaded into a quantum state using a particular encoding.
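To make "data encoding" concrete, here is a minimal sketch, written with plain NumPy rather than any particular quantum framework, of two common schemes: basis encoding, which stores an integer as a single computational basis state, and amplitude encoding, which packs a classical vector of 2^n values into the amplitudes of an n-qubit state.

```python
# Illustrative sketch of two common data encodings, using plain NumPy
# state vectors (no quantum hardware or framework assumed).
import numpy as np

def basis_encode(value: int, num_qubits: int) -> np.ndarray:
    """Basis encoding: represent the integer `value` as the computational
    basis state |value> on `num_qubits` qubits."""
    state = np.zeros(2**num_qubits, dtype=complex)
    state[value] = 1.0
    return state

def amplitude_encode(data: np.ndarray) -> np.ndarray:
    """Amplitude encoding: store a classical vector in the amplitudes of a
    quantum state. 2^n values fit on only n qubits, but the vector must be
    normalized to unit length."""
    return data.astype(complex) / np.linalg.norm(data)

print(basis_encode(3, num_qubits=2))                      # |11> -> [0, 0, 0, 1]
print(amplitude_encode(np.array([1.0, 1.0, 1.0, 1.0])))   # uniform 2-qubit state
```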

While algorithms such as Shor’s algorithm promise a significant speedup in complexity, they assume that the encoding step is cheap, at most linear or even logarithmic in cost. In truth, general-purpose encoding routines scale exponentially with the number of qubits. As the input data size, and with it the number of qubits, grows, the time it takes to encode the starting state can cancel out the advantage gained by the algorithm.
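A rough back-of-the-envelope comparison illustrates the mismatch. The gate counts below are illustrative assumptions (a generic state-preparation circuit of order 2^n gates versus a hypothetical polynomial-cost algorithm), not measurements of any specific implementation.

```python
# Illustrative scaling comparison: worst-case generic amplitude encoding
# (assumed ~2^n gates) vs. a hypothetical polynomial-cost algorithm (~n^3).
def worst_case_encoding_gates(n: int) -> int:
    return 2**n            # generic state preparation grows exponentially

def hypothetical_algorithm_gates(n: int) -> int:
    return n**3            # stand-in for a polynomial-depth algorithm

for n in (5, 10, 20, 30):
    enc = worst_case_encoding_gates(n)
    alg = hypothetical_algorithm_gates(n)
    print(f"{n:>2} qubits: ~{enc:>14,} encoding gates vs ~{alg:>8,} algorithm gates")
```

Even at a few dozen qubits, the assumed encoding cost dwarfs the algorithm itself, which is exactly the concern raised above.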

The issue becomes even clearer when considering quantum algorithms that must repeatedly encode and construct quantum states. For example, quantum algorithms have been used to simulate fluid flow in computational fluid dynamics. These fluid simulations compute the state of the fluid at successive points in time.

The no-cloning theorem of quantum mechanics forbids copying an unknown quantum state, and the act of measurement destroys the state being measured. As such, the quantum algorithm cannot simply be run once on a single circuit that is then measured repeatedly. Instead, an entirely new quantum circuit must be generated and run for every timestep. If encoding is a costly process, repeating it for every iteration of algorithms such as these makes the use of quantum computers impractical.
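The control flow below sketches this pattern. The helper functions are simplified NumPy stand-ins rather than a real quantum backend, but they show how the costly encoding step sits inside every iteration of the loop.

```python
# Schematic, runnable sketch of a time-stepped simulation. The three helpers
# are hypothetical placeholders for state preparation, evolution, and
# measurement; only the control flow matters here.
import numpy as np

def encode_state(field: np.ndarray) -> np.ndarray:
    # Placeholder for costly state preparation: normalize into amplitudes.
    return field.astype(complex) / np.linalg.norm(field)

def apply_timestep(state: np.ndarray) -> np.ndarray:
    # Placeholder for one step of the quantum algorithm (a unitary update).
    phases = np.exp(2j * np.pi * np.arange(state.size) / state.size)
    return phases * state

def measure(state: np.ndarray) -> np.ndarray:
    # Measurement destroys the quantum state; only classical data remains.
    return np.abs(state) ** 2

def simulate(initial_field: np.ndarray, num_steps: int) -> np.ndarray:
    field = initial_field
    for _ in range(num_steps):
        state = encode_state(field)    # re-encode from scratch every timestep
        state = apply_timestep(state)
        field = measure(state)         # no copy survives for the next step
    return field

print(simulate(np.array([1.0, 2.0, 3.0, 4.0]), num_steps=3))
```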

Current research in the area works to optimize the encoding process. Physica Scripta published an accepted manuscript, “Quantum superposing algorithm for quantum encoding,” in late September of this year. The manuscript describes the work of Kim et al. in reducing the number of quantum gates needed to encode a state. By doing so, they may be able to gradually cut down the complexity of encoding. Other work has explored encoding at the hardware level, ranging from ion-doped crystals to the degrees of freedom of light to ensembles of neutral atoms (as described in the introduction of this paper).

Conclusion

Research on quantum computing, and on all the processes needed for a fully functioning machine, is very much ongoing. While some groups seek to reduce gate counts, ultimately a faster way of loading data into a quantum computer is crucial to achieving a practical advantage over classical computers. In short: the cost of encoding cannot be ignored.
