Part II: What’s Next for Quantum Computing

A version of this article, authored by Kike Miralles, was originally featured in SiliconANGLE.

In the first part of this series, we explored the opportunity of quantum computing and introduced our five-layer stack as a framework for understanding the industry's value chain. This model helps clarify where innovation is happening, from the core quantum processing unit to the software that will eventually deliver value to end-users. 

With this framework in mind, we can look at the forces shaping the quantum computing space. As the industry moves from the lab to the fab, five major trends are defining the next wave: 

  • Quantum error correction: While we still see significant effort to make currently available machines useful, the ecosystem now largely agrees that simply scaling up today's noisy intermediate-scale quantum (NISQ) computers won't unlock the most valuable commercial applications. The industry's focus has shifted to quantum error correction—the key to building robust and scalable fault-tolerant machines. With this shift, we’re seeing increased interest in companies focused on error correction capabilities (e.g., Riverlane, Q-CTRL, QEDMA) as well as significant innovation in how physical qubits get encoded into logical qubits (e.g., the move from surface codes to quantum low-density parity-check, or qLDPC, codes). 
  • The rise of the middle of the stack: Early on, most quantum computing companies tried the full-stack approach. Now that the industry is maturing, a rich ecosystem of middle-of-the-stack players has emerged. This shift allows companies to focus on what they do best and buy best-in-class components and capabilities as needed—from control systems (e.g., Intel Capital portfolio company Quantum Machines) to quantum software development (e.g., Classiq, Algorithmiq). 
  • Scale-out architectures: For a time, the primary path to more powerful quantum computers seemed to be scale-up, which focuses on creating larger and more capable single quantum processing units (QPUs). However, recent innovations in quantum networking technology have made a different approach, scale-out, a serious contender (e.g., Nu Quantum, Lightsynq, memQ). This strategy involves linking multiple QPUs to work together as one distributed machine, which could even enable different types of qubits to collaborate on a single problem. Transduction from solid-state systems to the optical domain remains a significant challenge, however. Innovation in scale-up hasn't slowed either, with ongoing advances in qubit and architecture design and fabrication techniques continuing to push the space forward (e.g., Alice & Bob, Qolab). 
  • Input-output and cryogenics: Significant innovation is happening in this space, driven by the demands of both scale-up and scale-out architectures. The core challenge lies in controlling and reading out thousands or millions of qubits without destroying their quantum states. For solid-state systems such as superconducting and silicon spin qubits, the traditional solution of running separate wires for each qubit creates significant engineering challenges: each wire introduces heat and noise into the ultra-cold quantum environment, and it becomes increasingly difficult to physically fit an ever-growing number of cables into the refrigerator. Cabling challenges thus limit QPU scale. Alternatives that reduce both the number of cables and their thermal load are emerging, such as higher-density cabling (e.g., Delft Circuits), cryogenic qubit control (e.g., Diraq with cryo-CMOS) or altogether different approaches such as optical fiber (e.g., Qphox). 
  • M&A: When we originally discussed this thesis internally, we predicted that M&A would become a major driving force in the sector. That prediction is now starting to play out at a scale that few, including us, anticipated. IonQ has made several bold acquisitions in 2025, spanning computing technology (Oxford Ionics), interconnects and memories (Lightsynq), communications (ID Quantique, Qubitekk), and even space (Capella Space). This consolidation by an industry leader signals a new path to liquidity and strategic exits for the next wave of innovators: some companies from the first quantum wave are now able to act as strategic acquirers and incumbents. We expect similar dynamics in other computing approaches and across layers to help shape the space over the next decade. 
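To make the physical-to-logical encoding behind the first trend concrete, here is a minimal classical analogy: a repetition code with majority-vote decoding. Real quantum codes such as surface or qLDPC codes are far more sophisticated (they must detect errors without directly measuring the encoded state), but the core trade is the same: many noisy physical units buy one more reliable logical unit. This is an illustrative sketch, not any vendor's scheme.

```python
import random

def encode(logical_bit, n=5):
    """Encode one logical bit into n physical copies (repetition code)."""
    return [logical_bit] * n

def apply_noise(physical_bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ 1 if random.random() < p else b for b in physical_bits]

def decode(physical_bits):
    """Majority vote: the logical bit survives unless over half the copies flip."""
    return 1 if sum(physical_bits) > len(physical_bits) / 2 else 0

def logical_error_rate(p, n=5, trials=100_000):
    """Estimate how often the decoded logical bit is wrong."""
    errors = 0
    for _ in range(trials):
        noisy = apply_noise(encode(0, n), p)
        if decode(noisy) != 0:
            errors += 1
    return errors / trials

# With a 1% physical error rate, five copies suppress the logical error
# rate by orders of magnitude (failure requires >= 3 of 5 bits flipping).
print(logical_error_rate(0.01, n=5))
```

The same qualitative behavior, logical error rates falling sharply below the physical error rate once the latter is under a threshold, is what makes fault tolerance a credible engineering target.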

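The wiring bottleneck in the input-output trend can be made concrete with a back-of-the-envelope calculation. All figures below are illustrative assumptions (cooling power, per-line heat leak, and lines per qubit vary widely by platform and vendor); the point is the scaling, not the specific values.

```python
# Back-of-envelope: how cabling heat load caps qubit count in one fridge.
# Every number here is an illustrative assumption, not a vendor spec.

COOLING_POWER_W = 20e-6   # assumed cooling power at the coldest stage (~20 uW)
BUDGET_FRACTION = 0.5     # assumed share of cooling power available for cabling
HEAT_PER_LINE_W = 1e-9    # assumed heat leak per control line (~1 nW)
LINES_PER_QUBIT = 2       # assumed lines per qubit (e.g., drive plus readout)

max_lines = BUDGET_FRACTION * COOLING_POWER_W / HEAT_PER_LINE_W
max_qubits = max_lines / LINES_PER_QUBIT

print(f"Max control lines: {max_lines:,.0f}")   # 10,000
print(f"Max qubits:        {max_qubits:,.0f}")  # 5,000
```

Even with generous assumptions, a single refrigerator tops out at thousands of lines, far short of the millions of qubits fault tolerance demands, which is why denser cabling, cryogenic control and optical links all attack the same constraint.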
The journey to fault-tolerant quantum computing remains a marathon. While the challenges are immense, the acceleration of progress across the entire stack is now undeniable. The foundational technologies for a new era of computation are being built today. For founders and scientists with the ambition to build and the patience to persist, solving the bottlenecks to quantum computing at scale presents a massive commercial opportunity. If you are tackling one of these critical bottlenecks, we want to hear from you.