Experimental quantum computing has come a long way. But for quantum computing to realise its full potential, we will need to build systems around quantum computers. Some of these systems will be dedicated to realising the promise of fault-tolerant quantum computation, while others will compile useful quantum programs as efficiently as possible.
On the fault-tolerant side, the most promising path is the implementation of Quantum Error Correction (QEC) in these systems. Without QEC, quantum computers cannot reach their full potential: noise from various sources causes quantum operations to fail at rates of around 1 in 10,000, when useful applications need failure rates of 1 in a billion or better. QEC is the only viable path to close this gap. While much progress has been made in the theory and simulation of QEC, the real challenge, and opportunity, lies in moving QEC into hands-on experimental workflows on actual quantum hardware.
On the compilation side, orchestrating the many control systems that drive qubits and their error-correction modules to realise a useful computation will likely require high-performance classical computers working in concert with the quantum systems.
The Quantum Science Center (QSC), headquartered at ORNL, is uniquely positioned to address both fronts of this challenge: it is home to some of the nation’s most powerful high-performance computing (HPC) systems and to deep quantum expertise. With these resources in place, what remains is to bring experts across HPC and quantum computing together to envision and build the components of a quantum-HPC (QHPC) ecosystem.
This is precisely what leading researchers in quantum computing tackled as part of a recent Quantum Error Correction Workshop at ORNL with the U.S. Department of Energy’s (DOE’s) QSC, IQM Quantum Computers, and Riverlane.
The workshop focused on technical exchanges, specifically on how to develop and coordinate activities across classical and quantum computing to realise quantum-centric HPC applications using fault-tolerant methods.
National laboratories play a critical role in validating scalable approaches to quantum computing. The DOE is driving collaboration among the national labs, pushing them to coordinate toward the shared goal of building a utility-scale quantum computer.
QEC is a foundational capability for future hybrid quantum-centric HPC systems. The resulting workflows will tackle problems currently beyond the reach of classical computers. For institutions such as ORNL, which are leaders in high-performance computing, understanding how QEC integrates with their HPC infrastructure is paramount.
This requires a deep understanding of error correction, algorithms, and how to orchestrate these complex systems between classical and quantum components at scale. Hands-on experience dramatically accelerates skill development, provides crucial insight into real-world challenges, and ultimately, boosts quantum computing readiness across the ecosystem. It allows participants to move beyond theoretical papers to truly internalise the effort involved.
“The near-term research priority for the QSC is to integrate the unique demands of QEC with our HPC systems to amplify the impacts of quantum computers for scalable scientific applications,” said Travis Humble, Director of the QSC.
From 10 to 12 February 2026, ORNL hosted the ‘Riverlane Quantum Error Correction Distance-3 Surface Code Experiment Workshop’. This three-day, in-person technical workshop brought together participants from the QSC, a diverse group with expertise spanning quantum computing, HPC, and computer science, and two of the center’s partners: IQM Quantum Computers, a full-stack superconducting quantum computing leader, and Riverlane, a QEC leader.
The workshop’s core focus was running a full distance-3 surface code memory experiment on real quantum hardware. This end-to-end experiment is a prototypical QEC demonstration, bringing simulation, modeling, software, and hardware together. In QEC, an “experiment” means a concrete sequence of state preparations, gate operations, measurements, and classical postprocessing run on a quantum device (or a simulated one) to test, characterise, or demonstrate an error-correcting code, its components, or its performance.
During the workshop, this end-to-end flow covered everything from qubit fundamentals, hardware benchmarking, and calibration to circuit construction, running experiments on IQM quantum hardware, offline decoding, and detailed results analysis.
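The anatomy of such a memory experiment — state preparation, noisy operations, measurement, and classical decoding — can be illustrated with a much simpler stand-in for the surface code. The following sketch is a toy distance-3 bit-flip repetition code in plain Python; it is not the workshop's actual stack (which used Qrisp and Deltakit on IQM hardware), and the function name is invented for this illustration.

```python
import random

def memory_experiment(p, rounds=3, shots=20000, seed=7):
    """Toy distance-3 bit-flip repetition-code memory experiment:
    prepare logical |0> as 000, apply independent bit-flip noise with
    probability p per qubit per round, measure, and decode by majority
    vote. Returns the observed logical error rate."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(shots):
        data = [0, 0, 0]                       # state preparation
        for _ in range(rounds):                # noisy memory rounds
            data = [b ^ (rng.random() < p) for b in data]
        decoded = int(sum(data) >= 2)          # majority-vote "decoder"
        failures += decoded != 0               # logical |0> flipped?
    return failures / shots

# At p = 1% per qubit per round, the decoded logical error rate comes
# out well below the accumulated per-qubit flip probability.
logical_p = memory_experiment(0.01)
```

The point of the sketch is the shape of the workflow, not the physics: a real surface-code experiment replaces the bit list with stabiliser circuits on hardware and the majority vote with a matching-based decoder, but the prepare–run–measure–decode loop is the same.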
“Everyone came to the workshop with specialised expertise, be it HPC, quantum or QEC, but often there’s a gap in the end-to-end understanding of each specialism. This workshop closed that gap, enabling participants to envision how classical and quantum systems can truly work together to run complex experiments,” said Abe Asfaw, head of QEC enablement at Riverlane.
The agenda included:
Day 1 – Foundations and hardware: Focused on qubit fundamentals, circuits, measurements, IQM hardware benchmarking and calibration, an introduction to QEC and surface codes, and Riverlane’s approach to real-time decoding.
Day 2 – Distance-3 surface code in practice: Involved step-by-step construction of a distance-3 memory experiment, running circuits on IQM hardware, and offline decoding and results analysis, with significant hands-on participant execution.
Day 3 – Real-time QEC and Deltaflow: Explored the differences between offline and real-time decoding and delved into Riverlane’s real-time QEC technology, Deltaflow, including its architecture and QEC data flow. This was followed by hands-on decoding using Deltaflow and a broader discussion of QEC experiments within research.
A key aspect of the workshop was demonstrating the joint value created by integrating cutting-edge quantum hardware with advanced software tools. IQM’s hardware enabled live experiments, providing a tangible platform for participants. ORNL recently received an IQM system and is currently completing conditioning and acceptance testing, making this workshop a timely introduction to their future experiments.
The workshop also provided direct exposure to real hardware constraints, noise, and performance behaviour. Participants not only saw how a decoder works but also learned how to handle QEC-specific issues in calibrating quantum computers, interpret the results of an error-correction experiment, and recognise the symptoms of excessive noise. This practical engagement moved beyond theoretical models, enabling participants to address the real engineering challenges of implementing QEC.
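One standard step in interpreting such results is turning raw stabiliser measurements into detection events (“defects”): comparing each stabiliser outcome against the previous round, so that defects fire exactly where something changed. The snippet below is a generic illustration of that post-processing in plain Python, not Deltakit’s or Deltaflow’s actual API; the function names are invented for this sketch.

```python
def detection_events(syndromes):
    """Turn per-round stabiliser outcomes (lists of 0/1) into detection
    events ('defects'). Round 0 is compared against the all-zero
    reference expected for a |0> preparation; each later round is
    XOR-ed against the previous one, so a defect fires exactly where
    an outcome changed."""
    defects = [list(syndromes[0])]
    for prev, curr in zip(syndromes, syndromes[1:]):
        defects.append([a ^ b for a, b in zip(prev, curr)])
    return defects

def defect_rate(syndromes):
    """Fraction of (round, stabiliser) slots where a defect fired."""
    d = detection_events(syndromes)
    flat = [bit for row in d for bit in row]
    return sum(flat) / len(flat)

# One stabiliser flips in round 1 and stays flipped: a single defect
# marks the round where the change happened.
rounds = [[0, 0], [0, 1], [0, 1]]
events = detection_events(rounds)  # [[0, 0], [0, 1], [0, 0]]
```

Defect rates computed this way are a quick health check on an experiment: a rate far above what calibration data predicts is one of the “symptoms of excessive noise” participants learned to recognise.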
While offline decoding is essential today for learning and benchmarking, the discussions naturally progressed to the future. As quantum systems scale, real-time QEC becomes critical for achieving utility-scale quantum computing.
Riverlane’s Deltaflow was highlighted as key to enabling this transition, providing the architecture and data flow necessary for continuous, on-the-fly error correction.
Participants learned how to build error-correction experiments from scratch using the open-source Qrisp language. Deltakit, Riverlane’s SDK for Quantum Error Correction, also played a pivotal role, facilitating circuit construction, simulation, offline decoding, and analysis.
Abe explained that Deltakit “makes life incredibly easy for doing error correction experiments” by providing intuitive abstraction layers. “This allowed the participants to understand the intricacies of quantum error correction without getting bogged down in low-level details about the circuit implementations, while fully acknowledging that understanding these circuit implementations is critical as a learner of QEC.”
The workshop also addressed the crucial question of integrating these systems with existing HPC infrastructure at national labs. Participants asked how their HPC systems could be leveraged within the overall orchestration framework of a utility-scale quantum computer. Such discussions underscored the participants’ rapid transition from arriving at the workshop with a foundational understanding of QEC to proactively pursuing operational QEC, laying the groundwork for a future in which HPC and quantum computers are seamlessly integrated.
The hands-on nature of the workshop delivered many tangible benefits to the participants. They successfully built and ran a distance-3 surface code memory experiment, executing circuits directly on live IQM hardware. This allowed them to observe firsthand the impact of QEC on logical qubits.
“Building and running a QEC experiment yourself on real hardware is the only way to really understand what QEC does, the amount of data it produces, and what it takes to process that data,” said Danny Bulmash, Education Lead at IQM.
The feedback indicated that the workshop was highly effective in helping participants synthesise complex information and confront the nuances of QEC implementation. They measured logical error probabilities and defect rates, discovering that many practical nuances only become apparent during implementation. The experience shifted their perspective from theory to hands-on mastery, cultivating a deeper appreciation of what is required to ensure the system performs reliably.
This workshop represents a significant step towards making QEC practical and reproducible. National labs are positioned as scalable infrastructure testbeds, and by engaging with Riverlane’s QEC systems and IQM’s hardware, they are laying the groundwork for optimising hybrid quantum-HPC systems through practical, error-corrected workflows.
Beyond its technical achievements, the workshop also helped shape the next generation of the quantum workforce. It convened graduate students from QSC member universities, researchers from Los Alamos National Laboratory and HPE, among others, and experts from Riverlane and IQM. Such collaboration fosters essential learning and skill development across academia, industry, and government, providing diverse talents with the shared knowledge needed to build useful quantum computers.
Special thanks to Travis Humble, director of the QSC, and Suzanne Parete-Koon, computing and computation science division workforce development lead at the QSC, for bringing together this talent from across the labs. The DOE’s push to unite national labs to build useful quantum computers is the driving force behind these crucial interactions, ensuring the quantum ecosystem continues to grow and innovate.
We’re looking forward to more workshops and collaborations as our quantum computers continue to scale. If your organization is interested in hosting a QEC workshop, please contact us at <abraham.asfaw@riverlane.com>.