Quantum computing is taking the next step towards enterprise adoption. Let's start measuring what matters

14 Apr 2026
12 min read

As World Quantum Day approaches on April 14, it is a good opportunity to take a step back and reflect on recent developments and market dynamics. Over the last few months, we have seen several SPAC announcements, advancements in error-correction theory, use-case demonstrations, and announcements of new government programs. What all of these developments have in common is a strengthening conviction that quantum advantage is approaching faster than many expect. This means that we, as an industry, need to get ready for enterprise adoption and for scaling up business models.

We have been building IQM since day one to prepare for this scenario. This means we have developed our technology, production capabilities, and business model to support a growing demand for quantum computers. We strongly believe that in these early days of the industry, true ownership and ecosystem models will be the key driver towards quantum advantage. This is why the choices being made right now, by institutions, by governments, and by companies like ours, will determine whether quantum computing fulfills its actual potential or just produces more years of impressive demonstrations.
Every transformative technology goes through the same challenging middle phase. The technology works in science labs. The papers are impressive. The roadmaps look credible. But the gap between what is demonstrated and what is deployed stays wide open.

We have been in that phase with quantum computing for some years. And the honest reason it persists is not only the hardware or the algorithms. The hardware is improving and the algorithms are getting more efficient. A strong reason is that the industry has not yet agreed on what it means for quantum advantage to actually arrive. We have been too focused on qubit count, fidelities, and connectivity, and have ignored key questions like adoption models, ownership, and deployment. If customers want to solve real-world problems, they must be able to integrate the new technology into their existing workflows and technology stacks. This matters more than reaching any particular qubit milestone. But not every business model allows you to fully integrate a new technology into your workflow.

 

The solution is not only the technology. It is the model.

A widely used commercial model in quantum computing today is cloud access. You pay to run jobs on hardware you do not own, accessed remotely, at a cost that scales with usage. IQM offers cloud access too, and it serves a real purpose. It lets researchers experiment, developers test algorithms, and organizations take their first steps with quantum without a capital commitment. That is genuinely valuable and we support it because it lowers the entry barrier for early quantum adoption. It is also essential for quantum education, a topic that is close to my heart.

But cloud access is an entry point, not a destination. And this is not just an argument about where quantum computing is today. It is an argument about where it is going. Even as the technology matures, even as fault-tolerant systems arrive, most serious institutions will want to run their most sensitive and strategically important workloads on infrastructure they control themselves. Intellectual property, data security, regulatory compliance, sovereign capability: these considerations do not disappear as the hardware gets better. If anything, they become more pressing as quantum becomes more powerful.

Think about how the internet scaled. The protocols that built it, TCP/IP most importantly, were open standards. Not proprietary. Not controlled by any single vendor. That openness created the conditions for an ecosystem where anyone could build on top of it and no single company could capture all the value or determine who was allowed to participate. The internet did not succeed because one company rented access to the network. It succeeded because the network became infrastructure that institutions could own, operate, and build upon independently.

Quantum computing is approaching the same fork in the road. The question being decided right now, in procurement decisions at national labs, supercomputing centers, and, most importantly, enterprise customers, is whether quantum follows the internet model or a different one entirely.

 

Who pays the price for slow adoption? Everyone!

When institutions stop at cloud access and never move to ownership, several things fail to happen: they do not build internal expertise. They do not develop the operational capability to run workloads at low latency. They do not generate the feedback loop between hardware and applications that drives real progress. And they remain dependent on a vendor's uptime, pricing, and continued interest in serving them.

The result is that quantum adoption looks wide but runs shallow. Many organizations have run experiments. Very few have built capabilities. And the cost of that gap is not abstract. Drug discovery timelines stay longer than they need to be. Supply chain optimization remains approximate rather than precise. Energy grid modeling stays computationally limited at exactly the moment when the energy transition demands more from it. These are not future problems. They are present ones, and quantum computing is one of the most credible tools we have for addressing them at scale. Every year that adoption stalls is a year those problems compound.

This pattern is well known from other deep-tech cycles. Early industrial computing ran on time-sharing models. You booked time on a mainframe you did not own. It worked, until organizations realized that owning compute gave them fundamentally different capabilities: control over their data, the ability to customize, and the compounding advantage of building institutional knowledge over time. The transition from time-sharing to owned infrastructure was not just a procurement decision. It was what made computing a real industry.

We believe quantum is at that inflection point. And we built IQM around that belief from the beginning.

 

Own the machine. Own the outcome. Build an ecosystem.

IQM builds full-stack quantum computers for supercomputing environments. Our customers own the hardware. It sits in their facility, runs on their infrastructure, and operates under their control. We call this Production Quantum, because it reflects what we are actually building toward: quantum computing as a permanent, operational part of an institution's technical capability. Not a remote service you subscribe to, but infrastructure you command, with the security, latency, and sovereign control that serious institutions require, today and long after the technology reaches its full potential.

Full-stack matters here. IQM designs, fabricates, and assembles its own superconducting chips. We own the supply chain from the quantum processor up through the software stack. That vertical integration is not incidental. It means our customers deal with one partner across the entire system, with one accountability structure, one support relationship, and one roadmap conversation. In technology, there is an old saying: one throat to choke, one back to pat. We built IQM to be that trusted, long-term partner.

Vertical integration also means supply chain resilience. Quantum hardware depends on specialized components, specialized fabrication processes, and materials that are not widely available. Companies that rely on external chip suppliers inherit that supplier’s bottlenecks, lead times, and strategic priorities. IQM controls its own chip fabrication. That gives our customers a more predictable path to delivery, upgrade, and long-term system evolution, which matters enormously when you are building a capability rather than running an experiment.

We co-design systems with customers from the beginning, aligning the hardware architecture with the actual workloads they need to run. Our systems integrate into HPC environments as accelerators, scheduled alongside classical compute, within the same infrastructure our customers already operate. From day one, even when almost everyone at IQM was a physicist, we said: we are going to sell what we build. Those early sales taught us something essential about what institutions actually need from quantum infrastructure, and we have been building to that standard ever since.

Today, IQM has delivered more quantum systems than any other manufacturer globally. That reflects a consistent commercial choice to pursue production deployments over benchmark announcements like qubit count or connectivity. We deliver systems on time and on spec to empower our customers to build their own solutions and ecosystems around them.

 

A rising tide lifts all boats. We intend to be the tide.

The second thing we believe, just as strongly, is that no single company can build the quantum industry alone. The question is whether the companies at the center of the field act like platforms or gatekeepers. We have made a clear choice to provide platforms and enable ecosystems around us and around our customers.

IQM open-sourced KQCircuits, our quantum processor design software, in 2021. It is a professional tool we built for our own chip development and gave to the entire community. That was deliberate. We believe the quantum ecosystem grows faster when its foundations are shared, and that a broader ecosystem ultimately creates more demand for what we build, not less.

The evidence supports this. In Finland, where IQM is headquartered, the quantum ecosystem grew from one company in 2018 to eleven by 2024. External funding attached to the ecosystem grew from zero to hundreds of millions of dollars over the same period. In Bavaria, where IQM also operates, a similar pattern has emerged: more companies, more employees, more capital, more activity than in comparable regions where quantum development has stayed more closed. On-premises quantum systems act as seeds. They attract software developers, algorithm researchers, and application companies. They train the engineers the industry needs. They create the feedback loops that accelerate hardware development. Open tooling amplifies all of that.

We build open and transparent quantum systems that institutions can operate directly, enabling hands-on use, long-term capability building, and full control over their quantum infrastructure. That is the actual mechanism by which quantum computing moves from a specialized research activity to a general-purpose industrial capability. We are not trying to capture the whole value chain. We are trying to grow it.

 

What this means for humanity, not just the industry

I am a physicist. I came to entrepreneurship through science, not through business school. And I think my scientific background shapes how I look at what we are building and why it matters beyond the commercial case.

There are problems that classical computers will never solve. Not because we have not tried hard enough, but because the computational complexity of certain problems scales in ways that make them structurally intractable on classical hardware. Molecular simulation for drug discovery. Optimization at industrial scale in logistics, energy, and finance. Materials design for next-generation batteries and semiconductors. These are problems where meaningful progress could reduce human suffering, lower the cost of energy, and accelerate scientific discovery in ways that compound over decades.

 

Quantum computing is the most credible path to making those problems tractable. The mathematics is clear. What remains hard is the engineering, the deployment, and the ecosystem development required to turn that mathematical potential into operational reality. The model we chose at IQM is not just the right commercial strategy. It is also the model most likely to produce the breadth of deployment that makes those broader applications possible.

There is also a talent dimension that does not get enough attention. There is a shortage of quantum physicists and engineers today. Real hardware in institutions is what educates the next generation. You cannot build a quantum workforce on cloud access alone. You need systems in labs, in universities, in national facilities, operated by people who develop genuine hands-on expertise. Every system we deliver is, in that sense, also an investment in the human capital the field needs to fulfill its potential.

 

What we have built. What comes next.

IQM systems are deployed across Europe, Asia, and now North America. We recently completed our first private enterprise sale globally, to Galaxy in Poland. Our system is on-site at Oak Ridge National Laboratory in the United States, one of the most demanding HPC environments in the world. We have announced our first U.S. Quantum Technology Center in College Park, Maryland. We are preparing to become the first European quantum company listed on a major U.S. stock exchange.

None of this makes the hard problems easier. Quantum hardware is still technically fragile. The road to fault-tolerant systems at scale is long. I have no interest in pretending otherwise.

But I am confident in the direction. The quantum era does not begin when the technology works perfectly in a lab. It begins when institutions own it, operate it, and build on it. That is what production quantum means. And on World Quantum Day, that is the progress worth measuring.

About the Author

Dr. Jan Goetz
CEO & Co-Founder

Jan Goetz is CEO and Co-founder of IQM Quantum Computers, headquartered in Espoo, Finland. IQM has delivered more quantum systems than any other manufacturer globally and is preparing to become the first European quantum company listed on a major U.S. stock exchange.

