The practical prospects of quantum computing for enterprises – Bits&Chips

Jordi Mellema and Roy Voetman are BSc graduates from Hanze University of Applied Sciences.
The theoretical promises of quantum computing have been glorified by pop-science articles in recent years, but practical, feasible applications are still scarce. This raises the question of what companies can do with the technology right now. At CGI, a proof of concept was developed to see how quantum computers could be integrated into existing enterprise software architectures.
There are already several possibilities for performing tasks on present-day quantum computers – more commonly known as noisy intermediate-scale quantum (NISQ) computers. Possible use cases mainly focus on optimizations in, for example, combinatorics or machine learning, which can exploit the noise of these experimental devices. However, although such optimizations have been thoroughly researched in theory, few of these studies have validated their benchmarks in practice. With emerging quantum cloud providers – parties that make quantum computers available via the cloud – the practical validation of theoretical results is becoming more accessible.
When looking at commercial cloud providers, familiar names such as Azure, AWS and Google Cloud emerge. One might think that these companies are building their own quantum computers. However, the actual hardware development is done by companies like IonQ, Honeywell and D-Wave. The cloud providers act as intermediaries between these hardware companies (quantum providers) and potential customers since, due to their market share, they can reach a larger audience. The odd one out in this equation is IBM, which is developing its own quantum computers and, adjacently, its own cloud infrastructure called IBM Cloud.
At CGI, they were curious to know whether there are potential market opportunities within the current possibilities offered by quantum computing. Since the emergence of quantum cloud providers is a fairly recent development, many aspects of the use of these platforms are still unclear. Therefore, CGI formulated a graduation assignment to gain insight into the current usability of tooling, the conceptual knowledge required to use this tooling, the associated costs and the benefits quantum computing can yield versus traditional algorithms.
We started this project in February 2022. Concretely, a proof of concept (PoC) was created allowing traditional and quantum algorithms to be executed by users through a web interface. These algorithms focused on machine learning (ML) activities such as clustering or classification since quantum computing can potentially find more complex patterns by exploiting the noise of current-day quantum computers.
Our PoC system utilizes IonQ’s 11-qubit quantum computer. Without diving too deep into the technology, suffice it to say that one qubit can encode one floating-point number. An 11-qubit computer can thus cluster a dataset of at most 11 features, for example. The execution of tasks itself is quite fast. Before execution, however, each task is appended to a queue, which can have a waiting time of 6-12 hours. This significantly impacts the overall processing time.
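One common realization of this one-feature-per-qubit rule of thumb is so-called angle encoding, where each feature becomes a rotation angle on its own qubit. The sketch below simulates it with plain NumPy as our own illustration; it isn’t necessarily the encoding used on IonQ’s hardware.

```python
import numpy as np

def angle_encode(features):
    """Encode each feature as a rotation angle on its own qubit:
    x -> cos(x/2)|0> + sin(x/2)|1>. One qubit per feature, which is
    why an 11-qubit machine caps a dataset at 11 features."""
    state = np.array([1.0])
    for x in features:
        qubit = np.array([np.cos(x / 2), np.sin(x / 2)])
        state = np.kron(state, qubit)  # tensor-product the qubits together
    return state

# Three features yield a 2**3 = 8-amplitude state vector.
state = angle_encode([0.1, 0.5, 1.2])
print(state.shape)  # (8,)
```

Note how the state space doubles with every qubit: the limit isn’t storage but the number of physical qubits available to rotate.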
For programming, each quantum cloud provider offers an API and a software development kit (SDK). Cloud providers can support multiple quantum providers with multiple quantum computers. The SDKs implement a strategy pattern, resulting in one generic programming interface that can be used regardless of which quantum computer is targeted. Our PoC employs dispatchers (Azure Functions) to create new processes that either execute a (machine learning) task directly on traditional hardware or communicate with Azure Quantum as the quantum cloud provider.
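The dispatcher-plus-strategy arrangement described above can be sketched as a textbook strategy pattern. The class names, the `dispatch` function and the `"ionq.qpu"` target string below are illustrative placeholders, not the actual SDK API.

```python
from abc import ABC, abstractmethod

class Backend(ABC):
    """Common interface -- the 'strategy' each backend implements."""
    @abstractmethod
    def run(self, task: str) -> str: ...

class ClassicalBackend(Backend):
    def run(self, task: str) -> str:
        return f"classical result for {task}"

class QuantumBackend(Backend):
    def __init__(self, target: str):
        self.target = target  # e.g. a hypothetical "ionq.qpu" target
    def run(self, task: str) -> str:
        # A real SDK would submit a job to the cloud provider here.
        return f"quantum result for {task} on {self.target}"

def dispatch(task: str, backend: Backend) -> str:
    """The dispatcher only ever sees the generic interface."""
    return backend.run(task)

print(dispatch("clustering", ClassicalBackend()))
print(dispatch("clustering", QuantumBackend("ionq.qpu")))
```

The point of the pattern is that the dispatching Azure Function doesn’t change when a new quantum provider is added; only a new strategy class does.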
Given that the field of quantum computing, and therefore quantum software engineering, is still in its infancy, the available SDKs aren’t yet as mature as their classical counterparts. This caused a major problem to surface during the realization of the PoC. Since our system needs to handle quantum tasks that take a substantial amount of time to execute, due to the long queuing, it’s paramount to resort to so-called Azure Durable Functions, which can perform tasks asynchronously. Unfortunately, the quantum SDK used in this project didn’t support such an asynchronous architecture, probably because it’s still targeted at more educational and research-focused applications, which are more exploratory.
To solve this problem, we had two options: modify the SDK and co-deploy it, or have the Azure Functions run fully synchronously and wait for the process on the quantum computer to finish. Modifying the SDK wasn’t an option since that would hurt the system’s maintainability: our local version would have to be merged with each SDK update. Therefore, we chose to execute the Azure Functions synchronously. A drawback of this solution is that an Azure Function has to wait for the process on the quantum computer to finish and has to start over in the event of a crash.
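The synchronous fallback boils down to a blocking poll loop. The sketch below illustrates the idea with a hypothetical `get_status` callable; it is not the SDK’s job API.

```python
import time

def wait_for_job(get_status, poll_interval=1.0, timeout=60.0):
    """Synchronously poll a job until it finishes.

    `get_status` is a hypothetical callable returning one of
    'queued', 'running', 'succeeded' or 'failed'. With real queue
    times of 6-12 hours, the caller blocks for the full duration --
    exactly the drawback described above.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status in ("succeeded", "failed"):
            return status
        time.sleep(poll_interval)
    raise TimeoutError("job did not finish in time")

# Simulated job that succeeds on the third poll.
statuses = iter(["queued", "running", "succeeded"])
print(wait_for_job(lambda: next(statuses), poll_interval=0.01))  # succeeded
```

An asynchronous (durable) variant would instead persist the job ID and resume after a restart, which is precisely what the SDK didn’t support.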
The purpose of the PoC was to explore the possible market opportunities for present-day quantum computing, taking into account four key points: tool usability, conceptual knowledge, associated costs and benefits of quantum computing. In terms of usability, we can say that the current tooling is mature enough to create experimental applications. It does come with some limitations, such as the lack of support for asynchronous execution of quantum algorithms. As a result, the algorithms need to be executed synchronously from start to finish, including the queue waiting time. This is undesirable in an enterprise environment since cloud solutions are intended for tasks that don’t take too long and that can be picked up again immediately in case of an error.
For the required conceptual knowledge, we can say that the SDKs offer a considerable level of abstraction, enabling the practical application of quantum software engineering. Quantum SDKs implement several standard techniques that can be seen as building blocks. A software engineer should be able to combine these building blocks to adapt algorithms to a relevant business context.
With regard to the costs, we compared several providers. Some require an upfront payment ($125,000 in the case of Honeywell), while others charge per use (at least $1 per quantum task in the case of IonQ). Looking at how many quantum tasks are needed to run an algorithm, the latter pricing model isn’t very realistic either: clustering just 100 data points already requires 4,950 tasks.
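The 4,950 figure is consistent with one quantum task per pairwise distance between data points, i.e. n choose 2 – our reading of where the number comes from, not a statement about any provider’s billing model.

```python
from math import comb

def pairwise_tasks(n_points: int) -> int:
    """Tasks needed if clustering issues one quantum task per
    pairwise distance between data points: n choose 2."""
    return comb(n_points, 2)

print(pairwise_tasks(100))  # 4950
```

At IonQ’s minimum of $1 per task, a single clustering run over 100 points would thus start at $4,950 – before any repetition for statistical significance.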
Finally, based on our research, we’re unable to say what advantages present-day quantum computing brings over the traditional alternatives. At least for machine learning, our study indicates that there currently seems to be no actual benefit because classical counterparts deliver the same, if not better, results.
In conclusion, while our PoC demonstrates that software engineers can work with this new technology, it’s not yet applicable in an enterprise setting. The current tooling falls short for such complex environments, the costs are prohibitive and, with regard to quantum ML, classical models give similar results. The potential of quantum computing will undoubtedly continue to unfold over the coming years, but for now, the costs don’t outweigh the theoretically conjectured results.