
Futurist’s Cheat Sheet: Quantum Computing

Moore’s Law describes the phenomenon that makes this year’s computer more capable and less expensive than last year’s. But it won’t go on forever. While engineers have come up with various schemes to keep it rolling, quantum computing is the best hope for extending it indefinitely. The concept has been proven in the lab, but working quantum computers are not a foregone conclusion. Here is a quick-and-dirty primer on a very complicated technology. (The “Futurist’s Cheat Sheet” series surveys technologies on the horizon: their promise, how likely they are, and when they might become part of our daily lives. This article is Part 3.)

What It Is

Computers as we know them manipulate information encoded in ones and zeroes, known as binary code. Instead of the customary binary computation, quantum computers will encode data in the quantum states of subatomic particles known as quantum bits, or qubits. These devices will be able to execute parallel computations orders of magnitude faster than today’s computers.

The website AskAMathematician.com describes quantum computing this way: “A quantum computer can take many inputs, do many calculations, and produce many results at the same time.”

The concept has been validated by systems involving very small numbers of qubits. However, cobbling together full-scale quantum computing systems remains a distant prospect.

How It Works

Today’s computers are built on silicon-based chipsets whose physical features shrink every year. At the same time, the circuits run ever faster, yielding ever tinier and more powerful devices.

Quantum computing takes the action of computing from the molecular scale of silicon down to the subatomic scale of particles such as electrons or photons. At this scale, paradoxical quantum phenomena come into play: a qubit can be a one, a zero, or both at the same time. Each qubit added to a system doubles the number of states it can represent at once, so the computational possibilities rise exponentially.
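That doubling is easy to see in ordinary code: even a classical *simulation* of n qubits must track 2^n complex amplitudes, which is why large quantum systems quickly become impossible to imitate on today’s machines. The sketch below is a toy illustration under that assumption; the function names are made up for this example, not part of any real quantum library.

```python
# Toy illustration: the state of an n-qubit register is described by
# 2**n complex amplitudes, so each added qubit doubles the bookkeeping.

def state_size(num_qubits: int) -> int:
    """Number of complex amplitudes needed to describe num_qubits qubits."""
    return 2 ** num_qubits

def equal_superposition(num_qubits: int) -> list:
    """A state vector in which every basis state is equally weighted.

    The squared magnitudes of the amplitudes must sum to 1 (they are
    measurement probabilities), so each amplitude is 1/sqrt(2**n).
    """
    n = state_size(num_qubits)
    amp = 1 / n ** 0.5
    return [complex(amp, 0)] * n

for n in (1, 10, 30):
    print(n, "qubits ->", state_size(n), "amplitudes")
# At 30 qubits the simulation already needs over a billion amplitudes.
```

The point of the sketch is the scaling, not the simulation itself: a real quantum computer holds all those amplitudes physically, with no exponential memory cost.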

Potential Impact

There will be no greater advance in computing than quantum computing within our lifetime. In many ways, it is the Holy Grail of advanced mathematics. Once scientists can build quantum computers that are stable and reliable, calculations that are practically impossible today will become relatively trivial.

What are the possibilities of such computation? Really, there could be no limit. Scientists could figure out how to build smarter computers that could do anything asked of them (think Star Trek: “computer, make me a cup of tea”) or design engines that could make deep space flight feasible. One of the most interesting possibilities of quantum computing would be the creation of true artificial intelligence. 

Challenges

Building qubits is extraordinarily difficult. Researchers have completed promising basic experiments, but the largest quantum computer to date has incorporated only seven qubits.

Generating qubits and getting them to work together is only half the job. The other half is coaxing them to turn input into meaningful output. Researchers are still developing the algorithms that would form the basis of quantum computation.
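To make “algorithms” a little more concrete: the basic building block of most quantum algorithms is a gate, such as the Hadamard gate, which puts a qubit into an equal superposition of one and zero. The snippet below is a minimal classical toy of that one gate, written for this article; it involves no real quantum hardware or quantum library.

```python
# Toy classical simulation of the Hadamard gate, a workhorse of
# quantum algorithms. A single qubit is a pair of complex amplitudes
# (a, b) for the basis states |0> and |1>.

from math import sqrt

def hadamard(state):
    """Apply H: |0> -> (|0>+|1>)/sqrt(2), |1> -> (|0>-|1>)/sqrt(2)."""
    a, b = state
    s = 1 / sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities for reading out |0> and |1>."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

zero = (1 + 0j, 0 + 0j)               # qubit starts in |0>
plus = hadamard(zero)                 # now an equal superposition
print(probabilities(plus))            # ~ (0.5, 0.5): either outcome equally likely
print(probabilities(hadamard(plus)))  # ~ (1.0, 0.0): H twice returns to |0>
```

A real quantum algorithm, such as Shor’s factoring algorithm, chains many such gates so that wrong answers cancel out and the right one is left with high probability when the qubits are measured.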

Timeline

Gartner’s latest Hype Cycle report puts quantum computing in the “more than 10 years away” category. That might be generous, as the technical obstacles are daunting. Quantum computing on a large scale may never come to fruition. The earliest implementations will be in supercomputers unavailable to the general public. In the meantime, we will just have to make do with standard computation.

Further Information

Quantum computing is a vast field, with new research published almost every month. The overview above is only a simple attempt at an explanation. Check out the resources below for additional insight into quantum computing.

Ars Technica: A tale of two qubits: how quantum computers work

Ask A Mathematician: How does quantum computing work?

How Stuff Works: How Quantum Computers Work

Andrew Steane, University of Oxford (1997): Quantum Computing
