

Collaboration is at the core of any important technological development. From the steam engine to the internet, humanity’s innovations weave together seemingly disparate communities.
That said, new technologies don’t always work together. There are many who still remember how Mac floppy disks were incompatible with PC machines, and vice versa.
Quantum computing is no different, which is why ԹϺ is a founding member of the QIR Alliance, announced today by the Linux Foundation. The QIR Alliance is working hard to ensure this technology reaches its full potential.
The siloed nature of early quantum computing developments has protected vital intellectual property, but it has also created a separation of resources. Quantum software from one organization may not work on the hardware of another, which can be an enormous obstacle for researchers.
The QIR Alliance is solving this problem by establishing an intermediate representation to enable interoperability within the quantum ecosystem. Based on the open source LLVM intermediate language, the QIR Alliance will create a standard set of rules for representing quantum constructs consistent with the LLVM data model.
In doing so, the QIR Alliance hopes to enable wider collaboration and a quantum community built around principles of interoperability.
Although programming languages may look like machine speak to the untrained eye, these languages are written for human programmers. The intermediate representation approach splits the compilation process into two parts: a user-language compiler converts the human-readable program into IR, and a hardware-specific compiler converts that IR into a set of machine-level instructions that the computer can understand.
This approach allows a hardware-specific compiler to work with many different source languages and still produce instructions the machine can execute. Conversely, quantum programming language developers only need to compile their new languages to a single IR to run on many different machines. This enables innovation on both sides of the ecosystem while avoiding duplication of effort.
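The two-stage split described above can be sketched in a few lines of Python. This is a toy model only; the mini-language, the IR tuples, and the "native gate" mnemonics are all invented for illustration and are not part of QIR or LLVM.

```python
# Toy sketch of two-stage compilation through a shared IR.
# All names and instruction sets here are invented for illustration.

def frontend(source: str) -> list[tuple]:
    """Stage 1: compile a tiny line-based language ("h q0", "cx q0 q1")
    into a hardware-independent IR of (opcode, operands) tuples."""
    ir = []
    for line in source.strip().splitlines():
        op, *args = line.split()
        ir.append((op.upper(), tuple(args)))
    return ir

def backend(ir: list[tuple], isa: dict) -> list[str]:
    """Stage 2: lower the shared IR into one target's native mnemonics."""
    return [f"{isa[op]} {' '.join(args)}" for op, args in ir]

# Two hypothetical machines with different native gate sets can consume
# the same IR; the frontend never needs to know about either of them.
machine_a = {"H": "rz_sx_rz", "CX": "zzphase"}
program = "h q0\ncx q0 q1"
print(backend(frontend(program), machine_a))
```

Any new source language only needs a frontend targeting the IR, and any new machine only needs a backend consuming it, which is the reuse the paragraph above describes.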
A compiler-level solution therefore makes sense for achieving the collaborative goals the QIR Alliance has set out.
LLVM is a collection of compiler and toolchain technologies that are designed around a language-independent intermediate representation. This common platform allows many source languages to share optimizers and executable generators, which enables a large amount of re-use in compiler machinery.
In short, this should allow quantum hardware to work with more varieties of software than it previously could. Rather than having to rewrite software for each specific machine researchers want to use, the QIR Alliance will enable far more collaboration between previously disparate organizations.
Another interesting aspect of LLVM is that it facilitates integration with many languages and tools built for classical computation environments. While quantum and classical computers may seem like competing technologies, many researchers expect to see quantum and classical computing resources working together in the future. The use of LLVM will facilitate interaction between quantum and classical computations at the hardware level.
For an organization like ԹϺ, the QIR Alliance offers several enticing advantages.
To begin, this initiative will benefit the current quantum ecosystem. As the reality of quantum machines begins to truly materialize, it is no longer feasible for researchers to work with systems that are not interoperable. Much like how Mac floppy disks were once not compatible with PC machines, the quantum industry will need to come together to create a valuable product for the consumer.
On top of this, the quantum sector must be constantly looking to the future and how this technology could improve and change in the coming years. All the major players within the quantum ecosystem must adopt a forward-thinking approach to intermediate representation that will fulfill the needs of current machines while also staying mindful of yet-to-be-developed hardware.
Keeping an eye on the horizon is a goal of the QIR Alliance, and ԹϺ is fortunate to be a part of such an important step in quantum computing’s history.
ԹϺ, the world’s largest integrated quantum company, pioneers powerful quantum computers and advanced software solutions. ԹϺ’s technology drives breakthroughs in materials discovery, cybersecurity, and next-gen quantum AI. With over 500 employees, including 370+ scientists and engineers, ԹϺ leads the quantum computing revolution across continents.
Progress in quantum computing is measured by hardware advances plus the algorithms and quantum error-correction codes that turn quantum systems into useful computational tools.
Thanks to recent hardware advances, researchers are increasingly sharpening their tools to probe the performance of quantum algorithms and understand how they behave in realistic conditions – where stability, system architecture and algorithm design all shape performance.
A new Denmark-based collaboration between the University of Southern Denmark (SDU), ԹϺ, and the Danish e-Infrastructure Consortium (DeiC) will utilize ԹϺ Helios. Researchers at SDU’s Centre for Quantum Mathematics, led by Jørgen Ellegaard Andersen, will use Helios to pursue research into topological quantum computing.
Their work could help explain how and why successful quantum algorithms perform as they do, informing the development of high-performance algorithms suited to emerging quantum systems. They’re exploring the scientific foundations that support future quantum applications across areas including pharmaceuticals, finance, and defense.
“We are thrilled to gain access to ԹϺ’s high-fidelity Helios system. This collaboration gives us a unique opportunity to test the limits of our algorithms and evaluate system performance, while advancing fundamental research and laying the foundation for future applications.”
— Professor Jørgen Ellegaard Andersen, Director of the Centre for Quantum Mathematics at University of Southern Denmark
Topological quantum computing is an area of research that connects quantum computation with deep mathematical structures. It includes the study of error-correcting codes known as surface codes, which encode logical quantum information in the global properties of systems of physical qubits.
The research team will explore how these codes behave, and how they may support the development of fault-tolerant quantum algorithms when implemented under realistic conditions.
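As a rough intuition for how code-based error detection works, here is a toy classical analog: a 3-bit repetition code, far simpler than a surface code, in which parity checks locate an error without ever reading the encoded data directly. Everything here is a sketch for intuition, not how surface codes are actually implemented.

```python
# Toy classical analog of code-based error detection: a 3-bit
# repetition code. Information is stored redundantly, and errors are
# inferred from parity checks rather than by reading the data itself.

def encode(bit: int) -> list[int]:
    """One logical bit becomes three physical bits."""
    return [bit, bit, bit]

def syndromes(codeword: list[int]) -> tuple[int, int]:
    """Parity checks on neighboring pairs; they fire when a bit flips."""
    return (codeword[0] ^ codeword[1], codeword[1] ^ codeword[2])

def correct(codeword: list[int]) -> list[int]:
    """Use the syndrome pattern to locate and undo a single bit flip."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndromes(codeword))
    if flip is not None:
        codeword[flip] ^= 1
    return codeword

word = encode(1)
word[2] ^= 1                      # inject a single bit-flip error
assert syndromes(word) == (0, 1)  # the checks locate the error
assert correct(word) == [1, 1, 1]
```

Surface codes extend this idea to quantum information in two dimensions, where the checks are measurements of local operators and the logical information lives in global, topological properties of the lattice.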
This distinction between theory and practical implementation matters. In theory, topological approaches offer a rich framework for designing algorithms and error-correcting codes. In practice, researchers need to understand how those ideas perform when implemented on real systems, where questions of noise, stability, overhead, and scaling become central. The collaboration will allow the SDU team to investigate these questions directly.
Beyond individual algorithms and codes, the research will also develop tools for benchmarking quantum processors. The goal is to develop new ways to characterize fidelity and stability in regimes that can be difficult to access.
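As one minimal illustration of this kind of benchmarking, the sketch below compares a device's measured outcome counts against an ideal distribution using total variation distance. Real benchmarking protocols are far richer than this, and the Bell-state numbers here are invented.

```python
# Hedged sketch of one simple benchmarking metric: total variation
# distance between ideal outcome probabilities and measured counts.
from collections import Counter

def tvd(ideal: dict, counts: Counter) -> float:
    """Total variation distance between an ideal distribution and an
    empirical distribution estimated from measurement counts."""
    shots = sum(counts.values())
    outcomes = set(ideal) | set(counts)
    return 0.5 * sum(abs(ideal.get(o, 0.0) - counts.get(o, 0) / shots)
                     for o in outcomes)

# Ideal Bell-state statistics: only "00" and "11", each with prob. 1/2.
ideal = {"00": 0.5, "11": 0.5}
measured = Counter({"00": 480, "11": 490, "01": 18, "10": 12})
print(tvd(ideal, measured))   # 0 would mean a perfect match
```

A distance near zero indicates the device reproduces the ideal statistics; characterizing how such metrics behave in hard-to-access regimes is the kind of question the benchmarking work addresses.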
The team will also explore hybrid quantum–classical approaches, including machine-learning techniques assisted by quantum hardware, to study the mathematical structures at the heart of topological quantum computing. This work reflects a broader field of research in which quantum and classical methods are used together, each contributing to parts of a computational problem.
The collaboration reflects the growing role of national quantum infrastructure in supporting research and talent development. Denmark has a long tradition of scientific innovation, and this collaboration is intended to support the country’s continued development in quantum technology.
The initiative is supported by DeiC, which played a central role in securing funding and enabling access to ԹϺ’s systems. DeiC has been assigned a particular role in developing and coordinating quantum infrastructure initiatives for the benefit of universities and industry, operating without its own commercial, sectoral, or geographical interests. This includes securing dedicated access to quantum computers, providing advisory services, and supporting the development of new talent in the Danish quantum sector.
“DeiC’s special effort to secure funding and access for this research initiative is rooted in our organization’s role in relation to the Danish Government’s strategy for quantum technology.”
— Henrik Navntoft Sønderskov, Head of Quantum at Danish e-Infrastructure Consortium
This collaboration promises to accelerate the development of practical algorithms. It is grounded in fundamental science – but its focus is practical: discovering and testing mathematical approaches to topological quantum computing that can be implemented, evaluated, and improved on real quantum hardware.
That work requires both theoretical insight and access to a system such as Helios capable of supporting meaningful scientific work.

This month, ԹϺ welcomed its global user community to the first-ever Q-Net Connect, an annual forum designed to spark collaboration, share insights, and accelerate innovation across our full-stack quantum computing platforms. Over two days, users came together not only to learn from one another, but to build the relationships and momentum that we believe will help define the next chapter of quantum computing.
Q-Net Connect 2026 drew over 170 attendees from around the world to Denver, Colorado, including representatives from commercial enterprises and startups, academia and research institutions, and the public sector and non-profits, all of them users of ԹϺ systems.
The program was packed with inspiring keynotes, technical tracks, and customer presentations. Attendees heard from leaders at ԹϺ, as well as our partners at NVIDIA, JPMorganChase and BlueQubit; professors from the University of New Mexico, the University of Nottingham and Harvard University; national labs, including NIST, Oak Ridge National Laboratory, Sandia National Laboratories and Los Alamos National Laboratory; and other distinguished guests from across the global quantum ecosystem.
The mission of the ԹϺ Q-Net user community is to create a space for shared learning, collaboration and connection for those who adopt ԹϺ’s hardware, software and middleware platform. At this year’s Q-Net Connect, we recognized four organizations that made notable efforts to champion this mission.
Congratulations, again, and thank you to everyone who contributed to the success of the first Q-Net Connect!
Q-Net offers year‑round support through user access, developer tools, documentation, trainings, webinars, and events. Members enjoy many benefits, including being the first to hear about exclusive content, publications and promotional offers.
By joining the community, you will be invited to exclusive gatherings to hear about the latest breakthroughs and connect with industry experts driving quantum innovation. Members also get access to Q‑Net Connect recordings and stay connected for future community updates.

In a follow-up to our recent work with Hiverge using AI to discover algorithms for quantum chemistry, we’ve teamed up with Hiverge, Amazon Web Services (AWS) and NVIDIA to explore using AI to improve algorithms for combinatorial optimization.
With the rapid rise of Large Language Models (LLMs), people started asking: what if AI agents could serve as on-demand algorithm factories? We have been working with Hiverge, an algorithm discovery company, AWS, and NVIDIA to explore how LLMs can accelerate quantum computing research.
Hiverge – named for Hive, an AI that can develop algorithms – aims to make quantum algorithm design more accessible to researchers by translating high-level problem descriptions in mostly natural language into executable quantum circuits. The Hive takes the researcher’s initial sketch of an algorithm, as well as any special constraints the researcher enumerates, and evolves it into a new algorithm that better meets the researcher’s needs. The output is expressed in terms of a familiar programming language, like Guppy or CUDA-Q, making it particularly easy to implement.
The AI is called a “Hive” because it is a collective of LLM agents, all of which edit the same codebase. In this work, the Hive was made up of LLM powerhouses such as Gemini, ChatGPT, Claude, and Llama, as well as a model accessed through AWS’ Amazon Bedrock service. Many models are included because researchers know that diversity is a strength: just like a team of human researchers working in a group, a variety of perspectives often leads to the strongest result.
Once the LLMs are assembled, the Hive calls on them to do the work of writing the desired algorithm; no new training is required. The algorithms are then executed and their ‘fitness’ (how well they solve the problem) is measured. Unfit programs do not survive, while the fittest ones evolve into the next generation. This process repeats, much like the evolutionary process of nature itself.
After evolution, the fittest algorithm is selected by the researchers and tested on other instances of the problem. This is a crucial step as the researchers want to understand how well it can generalize.
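The evolve-evaluate-select loop can be sketched abstractly in Python. Here the "candidates" are plain integers scored against a toy target, standing in for LLM-proposed programs scored by simulation; every detail below is illustrative and unrelated to Hiverge's actual system.

```python
# Minimal sketch of an evolve-evaluate-select loop. Candidates are
# integers and fitness is closeness to a toy target, standing in for
# generated programs scored by how well they solve the real problem.
import random

random.seed(0)
TARGET = 42

def fitness(candidate: int) -> float:
    return -abs(candidate - TARGET)   # closer to the target is fitter

def mutate(candidate: int) -> int:
    return candidate + random.choice([-3, -1, 1, 3])

population = [random.randint(0, 100) for _ in range(8)]
for _ in range(200):
    scored = sorted(population, key=fitness, reverse=True)
    survivors = scored[:4]            # unfit candidates do not survive
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(4)]

best = max(population, key=fitness)
```

Because the fittest candidates are carried into every generation, the best score never regresses, and repeated mutation rounds steadily close in on the target, the same selection pressure the Hive applies to candidate algorithms.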
In this most recent work, the joint team explored how AI can assist in the discovery of heuristic quantum optimization algorithms, a class of algorithms aimed at improving efficiency across critical workstreams. These span challenges like optimal power grid dispatch and storage placement, arranging fuel inside nuclear reactors, and molecular design and reaction pathway optimization in drug, material, and chemical discovery—where solutions could translate into maximizing operational efficiency, dramatic reduction in costs, and rapid acceleration in innovation.

In other AI approaches, such as reinforcement learning, models are trained to solve a problem, but the resulting "algorithm" is effectively ‘hidden’ within a neural network. Here, the algorithm is written in Guppy or CUDA-Q (or Python), making it human-interpretable and easier to deploy on new problem instances.
This work leveraged the NVIDIA CUDA-Q platform, running on powerful NVIDIA GPUs made accessible by AWS. Its state-of-the-art accelerated computing was crucial; the research explored highly complex problems, challenges that lie at the edge of classical computing capacity. Before running anything on ԹϺ’s quantum computer, the researchers first used NVIDIA accelerated computing to simulate the quantum algorithms and assess their fitness. Once a promising algorithm is discovered, it can then be deployed on quantum hardware, creating an exciting new approach for scaling quantum algorithm design.
More broadly, this work points to one of many ways in which classical compute, AI, and quantum computing are most powerful in symbiosis. AI can be used to improve quantum, as demonstrated here, just as quantum can be used to extend AI. Looking ahead, we envision AI evolving programs that express a combination of algorithmic primitives, much like human mathematicians, such as Peter Shor and Lov Grover, have done. After all, both humans and AI can learn from each other.