Tue, 07 Apr 2020 08:00

Quantum-hype during COVID-19

Qu&Co comments on this publication:

At Qu&Co we have always refrained from reacting to exaggerated claims about the short-term potential of quantum-computing. Rather, we have focused on our scientifically rigorous work to advance the field of quantum, as we strongly believe in its long-term potential. However, we draw the line at quantum being pushed as a short-term solution for researchers working on COVID, like this WSJ article in which a quantum hardware manufacturer offers free hardware access to researchers studying COVID, stating ‘we have a fairly unique system that could add value’. Although this offer could be a misplaced April-fools joke, we want to stress that, although quantum has strong long-term potential, there is zero chance it will provide any short-term value for COVID research. Therefore, no serious researchers working on the current pandemic should be distracted by this offer. If you are determined to use novel methods to solve today’s combinatorial optimisation problems, perhaps try simulated annealing on a purpose-built classical processor. And of course, if your time horizon is >2 years and you want to work on collaborative quantum-algorithm R&D, without distracting scarce COVID R&D staff, we are here to help. Stay safe and focused!
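For readers curious what such a classical alternative looks like: below is a minimal simulated-annealing sketch (a generic illustration of the technique, not any vendor's purpose-built annealer), applied to a toy max-cut instance. The cooling schedule and the toy graph are our own choices for illustration.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, t_min=1e-3,
                        alpha=0.95, steps_per_t=100, seed=0):
    """Minimize `cost` by simulated annealing with geometric cooling."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best_x, best_c = x, c
    t = t0
    while t > t_min:
        for _ in range(steps_per_t):
            y = neighbor(x, rng)
            cy = cost(y)
            # Always accept downhill moves; accept uphill moves
            # with Boltzmann probability exp(-(cy - c) / t).
            if cy <= c or rng.random() < math.exp((c - cy) / t):
                x, c = y, cy
                if c < best_c:
                    best_x, best_c = x, c
        t *= alpha
    return best_x, best_c

# Toy combinatorial problem: max-cut on a 4-node graph with a chord,
# phrased as minimizing the negative cut value.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]

def cut_cost(bits):
    return -sum(bits[i] != bits[j] for i, j in edges)

def flip_one(bits, rng):
    i = rng.randrange(len(bits))
    flipped = list(bits)
    flipped[i] ^= 1
    return tuple(flipped)

best, val = simulated_annealing(cut_cost, flip_one, (0, 0, 0, 0))
```

On this 4-node instance the maximum cut has value 4 (the chord 0-2 forces a triangle, so not all five edges can be cut), which the annealer finds easily.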

Qu&Co comments on this publication:

Quantum computers are envisioned to have significantly higher computational capabilities than their classical counterparts, especially for optimization and machine learning problems that involve a large classical data set. However, existing quantum algorithms rely on naively turning large classical datasets into either quantum oracles or quantum states, a step so expensive that it negates any possible quantum advantage. Such quantum algorithms focus on problems in which classical runtime scales rapidly with the input size, perhaps exponentially. To be able to achieve quantum speedup with algorithms like Grover search, a “quantum RAM” has been proposed: a large classical memory that can be queried in superposition. Although quantum RAMs do not yet exist, and building one might encounter the same challenges that quantum-computer hardware faces, they have the potential to provide significant speedups to applications like k-means clustering, logistic regression, zero-sum games and boosting.

This paper introduces hybrid classical-quantum algorithms for problems involving a large classical data set X and a space of models Y, such that a quantum computer has superposition access to Y but not to X. A data-reduction technique is then used to construct a weighted subset of X, called a coreset, that yields approximately the same loss for each model. The coreset can be constructed either by a classical computer alone or jointly by a classical and a quantum computer using quantum measurements.

The main message of this work is that, to avoid losing quantum speedup in ‘big-data’ applications, techniques such as data reduction are required, so that the time spent loading and storing the data set is limited. Likewise, non-fault-tolerant quantum algorithms should be designed so as not to require an excessive number of gates, so that the algorithm can run before the qubits lose their coherence and invalidate the result. The goal of the paper is to draw attention to the problems that arise from this approach, such as testing for quantum advantage when data reduction is used, to explore general data-reduction techniques, and to investigate new hybrid classical-quantum algorithms.
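As an illustration of the data-reduction idea (our own simplified sketch, not the authors' exact construction), here is a minimal importance-sampling coreset for the 1-means loss: points far from the mean are sampled with higher probability and re-weighted so that the weighted loss is an unbiased estimate of the full loss for any candidate model. The 50/50 mixing of uniform and distance-based probabilities is an assumption chosen for simplicity.

```python
import random

def coreset_1means(points, m, seed=0):
    """Importance-sampling coreset for the 1-means loss sum_i (x_i - c)^2."""
    rng = random.Random(seed)
    n = len(points)
    mu = sum(points) / n
    dists = [(x - mu) ** 2 for x in points]
    total = sum(dists) or 1.0
    # Sensitivity-style probabilities: a uniform term plus a distance term,
    # so outliers are likely to be kept but no point is ever ignored.
    probs = [0.5 / n + 0.5 * d / total for d in dists]
    idx = rng.choices(range(n), weights=probs, k=m)
    # Weight each sampled point by 1/(m * p_i): weighted sums are then
    # unbiased estimates of the corresponding sums over the full set.
    return [(points[i], 1.0 / (m * probs[i])) for i in idx]

def loss(weighted_points, c):
    return sum(w * (x - c) ** 2 for x, w in weighted_points)

data = [float(i) for i in range(100)]
full = [(x, 1.0) for x in data]
core = coreset_1means(data, m=30)
# loss(core, c) approximates loss(full, c) for any candidate center c,
# while using less than a third of the data.
```

The same pattern (sample by sensitivity, re-weight by inverse probability) underlies coresets for k-means and many other loss functions.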

Qu&Co comments on this publication:

Quantum computers were initially proposed to efficiently simulate quantum mechanical systems with an exponential speedup compared to classical computers. We are currently in the noisy intermediate-scale quantum (NISQ) era, which means quantum chips still have a small number of qubits. This prohibits straightforward quantum simulations of realistic molecules and materials, whose description requires hundreds of atoms and thousands to millions of degrees of freedom to represent the electronic wavefunctions. One research direction which attempts to bypass this restriction is the development of hybrid quantum-classical methods where the quantum computation is restricted to a small portion of the system.

In this paper, a quantum embedding theory is proposed for the calculation of strongly-correlated electronic states of active regions, while the rest of the system is described with density functional theory (DFT). DFT (and its various approximations) has been extremely successful in predicting numerous properties of solids, liquids and molecules, and in providing key interpretations to a variety of experimental results, but is often inadequate to describe strongly-correlated electronic states. The novel theory proposed in this paper is built on DFT and does not require the explicit evaluation of virtual electronic states, thus making the method scalable to materials with thousands of electrons. Also, it includes the effect of exchange-correlation interactions of the environment on active regions, thus going beyond commonly adopted approximations in conventional DFT.

The proposed quantum embedding theory utilizes a classical and a quantum algorithm to solve the Hamiltonian that describes the problem and yields results in good agreement with existing experimental measurements and still-tractable computations on classical computing architectures. The theory is tested in various solid-state quantum information technologies, which exhibit strongly-correlated electronic states. In this way, the authors show how the quantum-classical hybrid approach incorporating DFT enables the study of large-scale material systems while adding the strongly-correlated dynamics analysis which the quantum simulation algorithm can provide.

Qu&Co comments on this publication:

In this arXiv submission by Qu & Co and Covestro, a well-known approximation in classical computational methods for quantum chemistry is applied to a quantum computing scheme for simulating molecular chemistry efficiently on near-term quantum devices. The restricted mapping allows for a polynomial reduction in both the quantum circuit depth and the total number of measurements required, as compared to conventional variational approaches to near-term quantum simulation of molecular chemistry, such as UCCSD. This enables faster runtime convergence of the variational algorithm to a potentially higher accuracy, by using the larger basis set the restricted mapping allows. The latter is shown via an example simulation of the dissociation curve of lithium hydride. These results open up a new direction for efficient near-term quantum chemistry simulation, as well as decreasing the effective quantum resource requirements for future fault-tolerant quantum computing schemes.

Qu&Co comments on this publication:

Recently some contributors to a paper describing a quantum-supremacy experiment inadvertently posted an older version of this paper online, which was quickly picked up by the popular press, resulting in a flurry of (in many cases) unfounded claims about the progress of quantum-computing. We believe it is important for people interested in this topic to inform themselves by reading a balanced opinion from an expert in this field. Therefore we kindly refer to Scott Aaronson's excellent blogpost on this matter.

Qu&Co comments on this publication:

In this article by McKinsey & Co, a strategy consulting firm, Florian Budde and Daniel Volz argue that chemical companies must act now to capture the benefits of quantum computing. Of course we at Qu & Co are a bit biased on this topic, but we do agree with the authors that the chemical sector is likely to be an early beneficiary of the vastly expanded modeling and computational capabilities that quantum computing promises to unlock.

Qu&Co comments on this publication:

In this market outlook, The Boston Consulting Group assesses how and where quantum computing will create business value, the likely progression, and what steps executives should take now to put their firms in the best position to capture that value. The report is based on interviews and workshops involving more than 100 experts, a review of some 150 peer-reviewed publications, and analysis of more than 35 potential use cases.

Qu&Co comments on this publication:

Currently, the latest state-of-the-art quantum computers are so-called NISQ (noisy intermediate-scale quantum) devices: they have a number of qubits approaching the limit of what classical simulation of such systems can match, yet they are noisy and no fault-tolerance can be achieved yet. The question is: are there methods which can sufficiently compensate for their noisy nature, enabling the emergence of quantum advantage on these devices? In recent years, many error-correction and error-mitigation schemes have been developed: from Richardson extrapolation techniques that extrapolate results to ‘zero noise’, to parity-check measurements and more. Typically, however, those techniques require additional complicated circuitry, ancillary qubits, pulse modifications, or calibration/tuning steps. In this paper, an alternative strategy based on the general principle of a class of methods called Quantum Subspace Expansion (QSE) is proposed. In this strategy, one performs clever post-processing of classical measurement data, with or without additional measurements, using (at most) simple additional operations in the circuit and no (scaling) ancillary qubits. This paper generalizes the application of QSE error mitigation to any quantum computation, not restricting itself to problem specifics like chemistry. Another interesting idea presented here is to use NISQ devices to experimentally study small quantum codes for later use in larger-scale quantum computers implementing error-correcting codes, such as in future FTQC (fault-tolerant quantum computing).

Qu&Co comments on this publication:

Thus far, quantum algorithms for quantum chemistry have been experimentally demonstrated only on gate-based quantum computers. Efforts have been made to map the fermionic Hamiltonian of the chemistry problem to an Ising Hamiltonian, so that it can be solved on a quantum annealer. However, the number of qubits required still scales exponentially with the problem size (the number of orbitals considered in the electronic-structure problem). As an alternative, this paper presents a different approach that exploits the efficiency with which quantum annealers solve discrete optimization problems, by mapping a qubit coupled cluster method to this form. The authors simulate their method on an ideal Ising machine and on a D-Wave 2000Q system, and find promising success rates for smaller molecules. However, further investigation would be necessary to assess the usability for larger or more complex systems, as the scaling of their folding technique with the number of local minima is unknown. In addition, it is unclear from the experimental data whether the limitations of the D-Wave system, as compared to a perfect Ising machine, could hinder the expected performance gains for more complex systems.
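To make the Ising form concrete (our own minimal illustration, not the paper's qubit coupled cluster mapping): an Ising problem assigns each spin a value ±1 and seeks the configuration minimizing E(s) = Σᵢ hᵢsᵢ + Σᵢ₍ⱼ Jᵢⱼsᵢsⱼ. For a handful of spins this can be solved by exhaustive search, which is what an "ideal Ising machine" would return; the frustrated antiferromagnetic triangle below is a standard textbook instance.

```python
from itertools import product

def ising_energy(spins, h, J):
    """E(s) = sum_i h_i*s_i + sum_{i<j} J_ij*s_i*s_j, with s_i in {-1, +1}."""
    e = sum(h[i] * s for i, s in enumerate(spins))
    e += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e

def brute_force_ground_state(n, h, J):
    """Exhaustively search all 2^n spin configurations (small n only)."""
    best = min(product((-1, 1), repeat=n),
               key=lambda s: ising_energy(s, h, J))
    return best, ising_energy(best, h, J)

# Antiferromagnetic triangle: all three couplings want neighbors to
# disagree, but on a triangle they cannot all be satisfied (frustration),
# so the ground energy is -1 rather than the naive -3.
h = [0.0, 0.0, 0.0]
J = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}
state, energy = brute_force_ground_state(3, h, J)
```

A quantum annealer tackles the same energy function heuristically for thousands of spins, where exhaustive search is hopeless; the paper's contribution is the chemistry-to-Ising mapping feeding such a solver.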

Qu&Co comments on this publication:

This report by The Boston Consulting Group, a strategy consulting firm, targets business executives and other people looking for a broader market overview on quantum computing. The authors (Philipp Gerbert et al.) provide some insight into where the technology currently stands, who is who in the emerging ecosystem, and the potentially interesting applications. The report also analyzes some of the leading indicators of investments, patents, and publications, which countries and entities are most active, and the status and prospects for the main categories of quantum hardware technologies. Additionally, the report aims to provide a simple framework for understanding quantum algorithms and assessing their applicability and potential. Finally, the authors provide their view of what can be expected in the next five to ten years, and what corporates should be doing, or getting ready for, in response.

Mon, 27 Aug 2018 00:00

Variational quantum factoring

Qu&Co comments on this publication:

Ever since the publication of Shor’s algorithm in 1994, efficient integer factorization has been a key application area envisioned for quantum-computers, with important implications for the security of some of the most widely used cryptosystems. Because Shor’s algorithm requires a large-scale fault-tolerant quantum-processor, RSA-3072 encryption was so far believed to remain safe until 2030. However, in recent years hybrid (classical-quantum) alternatives have been developed for many important quantum-algorithms. Such hybrid algorithms can be run on current-day noisy and small-scale quantum-processors. In this paper Eric Anschuetz et al. describe such a hybrid alternative for Shor’s algorithm, which they call variational quantum factoring (VQF). With some pre-processing applied, VQF scales with O(n), n being the number of bits of the integer being factored. If VQF can be optimized to scale well up to 3000+ qubits, which is very challenging but not completely unthinkable, and if we assume the number of physical qubits in quantum-processors doubles every year, quantum-processors could have a sufficiently high qubit count to break RSA-3072 as early as 2025. However, as VQF relies on a quantum-optimization algorithm (QAOA), it seems unlikely that its speed-up could be more than quadratic, which means that the runtime for breaking RSA-3072 could very well be prohibitively long, and that moving to RSA-6144 (double the key length) would again be as safe as RSA-3072 is currently.
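The back-of-the-envelope qubit projection above can be made explicit. Both inputs are assumptions, not predictions: a 53-physical-qubit starting point in 2019 and a sustained annual doubling of qubit counts.

```python
def year_reaching(target_qubits, start_year=2019, start_qubits=53, growth=2.0):
    """Year in which a processor reaches `target_qubits`, assuming the
    qubit count grows by `growth` x per year (an assumption, not a forecast)."""
    year, qubits = start_year, float(start_qubits)
    while qubits < target_qubits:
        qubits *= growth
        year += 1
    return year

# 53 -> 106 -> 212 -> 424 -> 848 -> 1696 -> 3392 qubits: the 3072-qubit
# mark needed for RSA-3072 under VQF's O(n) scaling is crossed in 2025.
breakeven = year_reaching(3072)
```

This is exactly the arithmetic behind the "as early as 2025" figure; any slowdown in the doubling assumption pushes the date out correspondingly.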

Wed, 11 Jul 2018 00:00

Quantum computing for finance

Qu&Co comments on this publication:

Many financial-services players are experimenting with quantum-computing so that they can be the first to exploit its benefits in speed-up and tractability. Algorithms have been developed for a wide range of finance-related topics, e.g. Monte Carlo simulation, portfolio optimization, anomaly (fraud) detection, market forecasting and reduction of slippage. In this paper Orus et al. provide a nice overview of most of these applications. Although the paper puts much emphasis on what has been done with quantum-annealers, applying the Quantum Approximate Optimization Algorithm (QAOA) lets us map all of them to universal-gate devices, which ensures that these applications stay relevant even if annealers become obsolete.
