Quantum Computing Will Always be Hybrid, and that Calls for Orchestration

The modern computing revolution was driven by the evolution of the central processing unit (CPU), which became smaller and more complex over time. That evolution culminated in the microprocessor, the dominant form of CPU today. Along the way, more specialized chips emerged: graphics processing units (GPUs), field-programmable gate arrays (FPGAs), and application-specific integrated circuits (ASICs). Each of these specialized chips accelerated and improved different dimensions of processing performance and unlocked new capabilities in computing.

With the advent of quantum computing, we are poised for the next evolutionary step in computing power.

Each new compute option has contributed to the increasing hybridization of computing. Instead of simply sending jobs to a CPU, we can now compute across a range of exotic devices, each uniquely suited for solving a particular set of problems.

The proliferation of compute options increases the overall complexity of our computing environments. This complexity poses two challenges. First, there is the design challenge of creating stable and scalable architectures to facilitate the execution of jobs that call for multi-device computing.

Second, there is the challenge of actually running these jobs in an efficient, optimized, and repeatable way. In other words, we not only have to design multi-device architectures, but we also need to orchestrate computing across them.

Thinking about things in this way allows us to quickly understand why the quantum stack, a stack incorporating quantum computing devices, has been hybrid out of the gate.

The architecture of the stack necessarily involves the combination of classical and quantum computing devices. Even within a single quantum algorithm today, compute is shared between classical and quantum processors.

The architecture of the quantum stack reflects this complexity.

This complexity is also compounded by the fact that, much like access to high-powered GPUs or HPC resources in other architectures, access to quantum devices now and into the future will be remote.

At the same time, in an effort to protect their evolving IP, organizations experimenting with quantum capabilities will also rely heavily on their own on-premises and private cloud assets.

Quantum hardware and software continue to evolve.

Because both quantum hardware and software continue to evolve, the architecture of the quantum stack and the orchestration of its components must allow for a certain level of "swappability."

That is, quantum architectures must facilitate a level of flexibility that makes it possible for organizations to experiment with new technologies and new ways of orchestrating them without being locked into any one solution. An emphasis on interoperability in the design of quantum-related technologies anticipates this ongoing need for adaptability.

The Hybrid Nature of the Quantum Stack

Aside from describing some of the unique characteristics of hybrid quantum architectures, we hope to make a couple of things clear. First, the hybrid nature of the quantum stack reflects the wider hybrid trend we see in architectures involving a range of exotic compute devices.

Second, the inherent differences between quantum devices and the full range of classical devices mean we should not think of quantum as replacing classical. Instead, we should view quantum devices as tools for solving their own set of problems.

Finally, the complexity inherent in hybrid architectures demands orchestration tools that both simplify and optimize their performance.

Classical vs Quantum: Relative Strengths

Classical and quantum devices have relative strengths that, at least in part, reflect their relative maturity levels. The earliest mechanical computing devices date from the mid-1800s, with the first programmable computer appearing in the mid-1930s. Since then, classical computers have continuously evolved, roughly at the pace of Moore’s Law. Today, they perform an incredible range of functions up to and including the simulation of quantum devices.

Quantum Computing in the 20th Century

Quantum computing, on the other hand, is wholly a product of the 20th century. The theory of quantum physics only coalesced in the 1920s, and Richard Feynman didn't propose the basic idea for a quantum computing device until 1982. That being said, quantum processing technology is approaching a tipping point where it will soon outperform classical devices in certain scenarios.

Quantum devices — exponentially more powerful

As quantum devices continue to improve, they will become exponentially more powerful than even the most advanced classical devices for certain tasks. The reason for this lies in the basic premise of quantum computing itself.

Whereas classical devices rely on binary bits that hold a value of either one or zero, quantum devices rely on qubits that can exist in a superposition, a linear combination of both states at once.

The state of a qubit can also become entangled with the state of other qubits, meaning the behavior of one qubit can influence the behavior of many. Thanks to these unique characteristics, adding more qubits produces a network effect that rapidly gives quantum devices more compute power than their classical alternatives.
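To make these ideas concrete, here is a minimal sketch in plain NumPy (deliberately avoiding any particular quantum SDK) that builds a single-qubit superposition with a Hadamard gate and then entangles two qubits into a Bell state:

```python
import numpy as np

# Basis states |0> and |1> as state vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
plus = H @ ket0  # (|0> + |1>) / sqrt(2)

# Start with |+>|0>, then a CNOT entangles the pair.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
bell = CNOT @ np.kron(plus, ket0)  # (|00> + |11>) / sqrt(2)

print(bell)  # [0.707, 0, 0, 0.707]
```

Measuring either qubit of the resulting state immediately determines the other, which is exactly the entanglement described above.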

Given these differences, how should we think about the relative strengths of classical and quantum computing devices?

Now and into the future, classical computing will be best for everything from data preparation and parameter selection to post-processing, graphing, and certain types of data analysis. High performance computers and supercomputers are also, for the time being, best for analyzing massive datasets.

Of course, the advantages that classical devices possess in certain contexts are not solely due to the inherent nature of these devices. They also stem from the fact that there are established best practices, optimizations and ecosystems of tools focused on these use cases.

Strengths of quantum

One of quantum’s relative strengths lies in its ability to draw information from small datasets by extensively analyzing the data from multiple directions.

This is especially helpful when data is difficult to come by, and these capabilities will have a major impact on the evolution of machine learning and the modeling of complex but rare phenomena (such as financial crises and global pandemics).

Quantum computing allows for enhanced ability to sample from probability distributions that are otherwise hard to sample using classical techniques. This has a number of applications in solving optimization and machine learning problems, such as generative modeling.
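To illustrate what sampling means here, the following sketch (plain NumPy again, classically simulating what a quantum device would produce natively) draws measurement shots from the Bell state's outcome distribution:

```python
import numpy as np

# Bell state (|00> + |11>) / sqrt(2), as prepared above.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

# Born rule: outcome probabilities are the squared amplitudes.
probs = np.abs(bell) ** 2

# Draw 1000 measurement shots. A quantum device produces such
# samples natively; for hard-to-simulate states, classical
# sampling like this becomes intractable.
outcomes = ["00", "01", "10", "11"]
rng = np.random.default_rng(seed=0)
shots = rng.choice(outcomes, size=1000, p=probs)
print({o: int((shots == o).sum()) for o in outcomes})
# Roughly half "00" and half "11"; "01" and "10" never occur.
```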

Finally, as Richard Feynman first suggested, quantum devices can be used to model quantum systems, such as molecular interactions, in ways that classical devices never could, because classical devices are not themselves quantum systems.

Quantum devices are not intended to replace classical devices.

Instead of replacing classical devices, quantum devices will be employed to solve specific problems, particularly problems that are intractable on classical computers.

A perfect example of such a problem is the Traveling Salesman Problem, where one aims to find the shortest route that visits each city in a list exactly once and returns to the starting point. Along these lines, the intrinsic capabilities of quantum technology will enable it to accelerate advancements in biology, chemistry, logistics, and materials science.
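A brute-force classical solver makes the intractability easy to see. The sketch below uses made-up distances for four cities; the number of candidate routes grows factorially from there:

```python
from itertools import permutations

# Symmetric distance matrix for four cities (illustrative values).
dist = [
    [0, 10, 15, 20],
    [10, 0, 35, 25],
    [15, 35, 0, 30],
    [20, 25, 30, 0],
]

def tour_length(route):
    """Total length of a round trip that starts and ends at city 0."""
    stops = (0, *route, 0)
    return sum(dist[a][b] for a, b in zip(stops, stops[1:]))

# Exhaustive search over (n-1)! candidate routes: 3! = 6 here,
# but roughly 10^158 for a modest 100 cities.
best = min(permutations(range(1, 4)), key=tour_length)
print(best, tour_length(best))  # (1, 3, 2) with length 80
```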

The Future Is Hybrid

The entire landscape of computing has been trending towards a hybrid model for some time. Quantum computing will follow this trend primarily because it too offers a specialized form of computing power.

More important than the specific engineering reasons to adopt a hybrid approach are the business reasons to do so. Adopting a hybrid approach lowers the barriers to entry and allows organizations to begin experimenting and making progress with quantum in a flexible, cost-efficient way.

Since few companies will want to invest in (or be able to afford) quantum hardware in these early days, it makes sense that they build out classical architectures that access quantum devices as needed.

Where will quantum do its best work?

Organizations in industries where quantum disruption is widely anticipated (chemistry and materials science, pharmaceuticals, financial services, logistics, security, and so on) should be especially focused on developing these architectures and cultivating other essential resources with an eye to quantum readiness.

In addition to classical computing capabilities, these resources include the talent and internal expertise that quantum demands.

Quantum's future

Looking beyond the present, quantum computing may always be a “hybrid” technology. First of all, it will always be overkill to use quantum computing to do things classical computers already do well. Second, cost will remain an issue. Quantum devices are and will be expensive and specialized. Using them to do things that advanced computing systems can already do is simply uneconomical.

Finally, we return to a point we made above: Because quantum computing can and should be applied to different problems than those classical computers can solve, the real business challenge is identifying exactly those problems or aspects of problems in a particular industry for which quantum devices are best suited.

Orchestration and the Hybrid Approach

When we talk about the need for orchestration, we can learn something from the world of hybrid cloud infrastructure. With 69% of enterprises having already adopted a hybrid cloud approach, the complexity involved has led many organizations to embrace cloud management. And this management, as in the management of cloud-native architectures, takes the form of orchestration.

A hybrid quantum stack, especially one that relies on both cloud and on-premises/private cloud resources, will likewise require management and orchestration to ensure that programs, experiments, and processes run smoothly.

Such orchestration requires a workflow management tool abstracted from the underlying hardware. Abstraction is necessary in part due to the proliferation of quantum devices and associated tools.

To efficiently experiment with this ever-expanding toolset, organizations need the flexibility to move from one hybrid configuration to the next without rewriting everything based on the underlying hardware. An effective workflow management system must facilitate such interoperability.

Quantum backends

For example, as new quantum backends become available, orchestration should make it possible to switch from one to another with a single line of code. Similarly, orchestration should support changing the optimizer used in a variational quantum algorithm to compare performance without writing additional code.
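As a hedged sketch of what that abstraction can look like (the Backend interface, registry, and device names below are hypothetical, not any vendor's actual API), the key is a single point of indirection between the workflow and the hardware:

```python
from typing import Callable, Dict

class Backend:
    """Hypothetical hardware-agnostic interface for running circuits."""
    def run(self, circuit: str, shots: int) -> Dict[str, int]:
        raise NotImplementedError

class LocalSimulator(Backend):
    def run(self, circuit: str, shots: int) -> Dict[str, int]:
        # Placeholder result; a real simulator would execute the circuit.
        return {"00": shots // 2, "11": shots - shots // 2}

class RemoteQPU(Backend):
    def __init__(self, device_name: str) -> None:
        self.device_name = device_name  # e.g. a cloud-hosted device
    def run(self, circuit: str, shots: int) -> Dict[str, int]:
        # Placeholder; a real client would submit a job over the network.
        raise ConnectionError(f"cannot reach {self.device_name} in this sketch")

# Registry keyed by name, so the choice of hardware is data, not code.
BACKENDS: Dict[str, Callable[[], Backend]] = {
    "simulator": LocalSimulator,
    "qpu": lambda: RemoteQPU("hypothetical-device"),
}

# Switching backends is now a one-line change:
backend = BACKENDS["simulator"]()  # swap to BACKENDS["qpu"]() when ready
print(backend.run(circuit="bell", shots=1000))
```

The same registry pattern extends naturally to swapping the classical optimizer in a variational algorithm: the optimizer becomes a named, pluggable component rather than hard-coded logic.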

Finally, orchestration should make it possible to combine source code from multiple frameworks and libraries, eliminating the drudge work of standing up new environments and freeing up time to focus on running actual experiments.

Scaling work

Scaling this work requires a certain level of hardware agnosticism when building and working with a hybrid quantum architecture. Orchestration tools must be adaptable, not only to account for the diversity in existing hardware but also to account for whatever else may come down the pike.

The number of advances we have seen in the last year alone highlights the fact that these workflow management and orchestration tools must be able to keep pace with the accelerated evolution of quantum technology. Indeed, the adaptability these tools offer will itself drive the broader adoption of quantum techniques.

Microprocessors today bear little resemblance to the tube-based central processing units of yore.

In fact, the current iPhone has one million times more RAM, seven million times more ROM, and processes information at a speed 100,000 times faster than the computers used to land Apollo 11 on the moon and bring it back again.

Quantum processors, as they mature, will end up putting the same distance between themselves and current classical computing devices, making them optimal for problems that even the best of these devices cannot solve.

Although such comparisons point to the monumental changes that quantum computing will bring about, as we have indicated, tapping into that power, now and into the future, will require both quantum and classical devices working together in a hybrid model.

It is in this way that companies will be able to tackle a broad range of business problems. But it will go beyond that. As these hybrid machines transform security and machine learning, they will impact every aspect of our daily lives.

Conclusion

From a purely practical perspective, a hybrid approach is the most efficient, cost-effective, and productive way to approach quantum. Leaning on classical devices to perform those tasks in the quantum computing process for which they are best suited, and for which they have been optimized over the last fifty years, is not only the right path; it is the best path.

The reason for this is that, as we have argued, quantum devices and classical devices don't just solve problems differently; they solve different problems. That is as true today as it will be five and ten years from now.

That’s why it’s a bit of a misnomer to say that “quantum computing will do this or that.” The fact of the matter is, the real revolution will be driven by the combined power of classical and quantum in more and more powerful hybrid solutions.

Tim Hirzel has a BA in Computer Science from Harvard University and an MS from MIT's Media Lab. He brings extensive experience in managing teams working on data science, machine learning, quantum chemistry, and device simulation to his current role as Zapata Computing's Chief Product Officer. Since 2005, Tim has been a software engineer and architect in science-based technology startups. Today he is focused on delivering a best-in-class quantum computing platform for Zapata and its customers.
