
Is Quantum Computing Finally Ready for Primetime?


Meet the Expert: Daniel D. Stancil, PhD


Dr. Daniel Stancil is the Alcoa Distinguished Professor and the executive director of the IBM Quantum Hub at NC State. He holds engineering degrees from Tennessee Tech (BSEE) and MIT (MS, EE, and PhD). He has spent many years as a professor of electrical and computer engineering at both Carnegie Mellon University and NC State.

While at CMU, he served as associate head of the ECE Department and associate dean for academic affairs in the College of Engineering. He was head of the ECE Department at NC State from 2009 to 2023.

Dr. Stancil’s research has included such varied topics as magnetic films, optics, microwaves, wireless channels, antennas, remote labs, and particle physics. The demonstration of neutrino communications by a multidisciplinary team coordinated by Dr. Stancil was recognized by Physics World Magazine as one of the Top 10 Physics Breakthroughs of 2012. His work has received additional recognitions including an IR 100 Award and a Photonics Circle of Excellence Award. Dr. Stancil is a Fellow of the Institute of Electrical and Electronics Engineers and a past president of the IEEE Magnetics Society.

The Evolution of Quantum Computing

Quantum computing began as a theoretical idea. Physicist Richard Feynman first proposed it in 1981, suggesting that a quantum computer could effectively simulate the quirky phenomena of quantum mechanics. Peter Shor and Lov Grover proposed landmark quantum algorithms in the mid-1990s. But it wasn’t until the 21st century that the first functioning quantum computers were developed.

Today, there’s an important distinction to make between noisy intermediate-scale quantum (NISQ) systems and error-corrected, or fault-tolerant, quantum systems. The former are called “noisy” because they are more susceptible to errors from quantum decoherence and other sources of interference. NISQ systems are still expected to be useful for certain applications, and, in fact, all existing quantum computers are considered NISQ machines.

Fault-tolerant quantum systems, by contrast, attempt to detect and correct errors as they occur. In theory, that would make them more powerful and reliable than their NISQ counterparts, allowing them to execute long, complex quantum algorithms without significant loss of fidelity.

“Five years ago, people thought that fault-tolerant quantum computing was probably several decades off,” Dr. Stancil says. “Now, the time horizon is considerably shorter than that. The hardware has improved at a remarkable pace.”

One way quantum engineers detect and correct errors is through redundancy, using multiple physical qubits to preserve a single piece of information. These stable clusters of qubits form what are known as logical qubits. How many physical qubits are required to form a logical qubit depends on how large the errors are and how good the underlying code is; how many logical qubits are needed depends on how complex the computation will be.
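
To make the idea of redundancy concrete, here is a minimal sketch, in plain Python, of the simplest classical analogue: a three-bit repetition code decoded by majority vote. Real quantum error-correcting codes (such as the surface code) are far more sophisticated and must handle quantum-specific errors, so this is illustrative only, and all function names are hypothetical.

```python
import random

def encode(bit):
    """Encode one logical bit redundantly across three 'physical' bits."""
    return [bit, bit, bit]

def apply_noise(bits, flip_prob):
    """Flip each physical bit independently with probability flip_prob."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    """Recover the logical bit by majority vote."""
    return 1 if sum(bits) >= 2 else 0

def logical_error_rate(flip_prob, trials=100_000):
    """Estimate how often the decoded bit disagrees with the encoded bit."""
    errors = 0
    for _ in range(trials):
        if decode(apply_noise(encode(0), flip_prob)) != 0:
            errors += 1
    return errors / trials

if __name__ == "__main__":
    p = 0.05  # physical error rate
    print("physical error rate:", p)
    # Expect roughly 3p^2 - 2p^3, about 0.007 for p = 0.05
    print("logical error rate: ", logical_error_rate(p))
```

Even in this toy model, three noisy bits yield a logical error rate well below the physical error rate, which is the intuition behind combining many physical qubits into one logical qubit.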

Five years ago, it was thought that a single logical qubit might require as many as 1,000 physical qubits—and that was at a time when most quantum computers topped out at 10 to 20 physical qubits in total. Building a machine with 50 to 100 logical qubits seemed infeasible. 

But today’s chips pack in ever-larger numbers of qubits, and there has been significant progress on the performance of error-correcting codes. In April 2024, Microsoft and Quantinuum claimed to have demonstrated the most reliable logical qubits on record, with a logical error rate 800 times lower than that of the underlying physical qubits.

Advances in AI and machine learning are helping to advance fault-tolerant quantum computing, too. AI is good at classifying the sometimes-noisy signals that come off quantum hardware. And early results suggest that some ML algorithms implemented on quantum processors could be more accurate than their classical counterparts. But quantum is still waiting for its killer application.

The Applications of Quantum Computing

“What people are looking for now is a demonstration of quantum advantage: something that they can do on a quantum computer that provides value either in speed or accuracy over what they can do for a similar cost with a classical computer,” Dr. Stancil says. “Those use cases do not yet exist. But I think we’re closing in on them. And, particularly with this rapid increase in capability, I think it’s very likely that in the next one to three years, there will be some type of quantum advantage demonstrated.”

The first demonstrations of quantum advantage are unlikely to be earth-shaking. Progress will be gradual, as one demonstration of advantage informs an adjacent use case, and so on. But quantum is not a universal solution. It faces input-output bottlenecks, and, while it may outperform classical computers in some areas, it will continue to underperform them in others.

“In my view, the sweet spot for quantum computing is in problems that are hard computationally, but not enormous in terms of the data set size,” Dr. Stancil says. 

One of quantum’s most talked-about use cases is its impact on encryption. The idea dates back to 1994, when Peter Shor developed a quantum algorithm capable of factoring large integers exponentially faster than the best-known classical algorithms. Applied at a large enough scale, the algorithm could factor numbers of the size used in the real-world cryptography that secures most modern systems. While quantum hardware hasn’t yet reached that level, engineers are already at work on quantum-resistant (post-quantum) encryption.
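
For intuition, the heavy lifting in Shor’s algorithm is the quantum period-finding step; the rest is classical number theory. The sketch below, in plain Python, uses a brute-force stand-in for the quantum step (feasible only for tiny numbers) to show how a period turns into factors via greatest common divisors. It is an illustrative reduction, not an implementation of the quantum algorithm itself.

```python
from math import gcd

def find_period(a, N):
    """Classical stand-in for the quantum step: find the smallest r > 0
    with a**r == 1 (mod N). Only practical for tiny N; this is the part
    a quantum computer running Shor's algorithm would accelerate."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def factor_from_period(N, a):
    """Try to split N using the period of a modulo N."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g          # lucky: a already shares a factor with N
    r = find_period(a, N)
    if r % 2 == 1:
        return None               # odd period: pick another a and retry
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None               # trivial square root: retry with another a
    p, q = gcd(x - 1, N), gcd(x + 1, N)
    if p not in (1, N) and p * q == N:
        return p, q
    return None

if __name__ == "__main__":
    print(factor_from_period(15, 7))   # (3, 5)
    print(factor_from_period(21, 2))   # (7, 3)
```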

“The really exciting applications are things that haven’t been possible before,” Dr. Stancil says.

In the near term, Dr. Stancil and other experts are looking for quantum computers to solve optimization problems and improve financial models. Demonstrating quantum advantage, and providing some economic value in doing so, would be a significant step forward. 

From there, the possibilities become more expansive and aspirational: understanding large molecules and materials beyond the reach of classical computing, possibly unlocking impactful drug discoveries and new battery chemistry designs. In 2023, IBM deployed its Quantum System One at the Cleveland Clinic, the first quantum computer in the world dedicated solely to healthcare research, with the aim of accelerating biomedical discoveries.

Building the Future of Quantum Computing

“In my view, quantum computing will never supplant classical computing,” Dr. Stancil says. “The reason for that is quantum computers do certain things better than classical computers and can do certain things that classical computers can’t, but there are things that classical computers can do better than quantum computers, and probably always will.”

Dr. Stancil envisions quantum computing becoming a sort of hardware accelerator, where classical computing will continue to take care of most everyday tasks while complex problems are sent off to a quantum processor. Interestingly, this symbiotic relationship means that advances in quantum computing don’t create job displacement the same way advances in AI or driverless vehicles do: better quantum computing may mean more demand for classical computing, not less. 
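
As a rough illustration of that accelerator pattern, the hypothetical Python sketch below shows a classical workflow that handles routine work itself and offloads only a hard kernel to a quantum backend. The submit_to_quantum_backend function is a placeholder, not a real API, and is stubbed out with a classical fallback so the example runs on its own.

```python
def classical_solver(problem):
    """Trivial stand-in: solve a small optimization by brute force."""
    return min(problem["candidates"], key=problem["cost"])

def submit_to_quantum_backend(problem):
    """Placeholder for dispatching work to a quantum processing unit (QPU).
    In a real deployment this would send a circuit to quantum hardware;
    here it simply falls back to the classical solver so the sketch runs."""
    return classical_solver(problem)

def solve(problem, hard_threshold=1_000):
    """Classical code orchestrates everything; only subproblems judged
    'hard' (here, crudely, by search-space size) go to the QPU."""
    if len(problem["candidates"]) > hard_threshold:
        return submit_to_quantum_backend(problem)  # offload the hard kernel
    return classical_solver(problem)               # everyday work stays classical

if __name__ == "__main__":
    easy = {"candidates": range(10), "cost": lambda x: (x - 7) ** 2}
    hard = {"candidates": range(5_000), "cost": lambda x: (x - 4_321) ** 2}
    print(solve(easy))   # handled classically -> 7
    print(solve(hard))   # "offloaded" -> 4321
```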

The future of quantum will be determined by the up-and-coming generation of young engineers. Dr. Stancil recommends aspiring quantum engineers get a solid foundation in Python, linear algebra, and the quantum mechanics of two-state systems. But it’s also important to come into quantum with an open mind. Classical computing, too, once seemed like science fiction, and what comes next is yet to be decided. 
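
Those three foundations come together quickly. The short sketch below, using Python and NumPy (illustrative only, not tied to any particular quantum SDK), represents a two-state system as a state vector, applies a Hadamard gate with a single matrix multiplication, and reads off measurement probabilities with the Born rule: linear algebra, Python, and a two-state system in about a dozen lines.

```python
import numpy as np

# Basis states of a two-state (qubit) system as complex vectors
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0                    # apply the gate (matrix-vector product)
probabilities = np.abs(state) ** 2  # Born rule: |amplitude|^2

print("state amplitudes:", state)          # [0.707..., 0.707...]
print("P(measure 0) =", probabilities[0])  # 0.5
print("P(measure 1) =", probabilities[1])  # 0.5
```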

“The thing that is really exciting and exhilarating about quantum computing is it is so multidisciplinary,” Dr. Stancil says. “It goes from fundamental physics on one side to theoretical computer science on the other side, and in the middle, it has all kinds of engineering problems and applications. It’s a breathtaking span of intellectual topics.”
