

By Duncan Jones
In September, nearly 200 senior cybersecurity leaders from around the world convened to discuss the state of U.S. cybersecurity. Topics were varied and included the moral asymmetry of today’s global threat actors, lessons learned from Ukraine, and general discussion of everything that “keeps us up at night” concerning cyber threats.
As a speaker at the Summit, I wanted to take a moment to share my takeaways from an important discussion that took place during our breakout session, “Future of Encryption: Moving to a Quantum Resistant World.” My esteemed fellow panelists from NSA, NIST, CMU, and AWS exchanged insights on where U.S. government agencies stand in their preparation for current and future threats to encryption, the likely hurdles they face, and the resources that exist to assist in the transition. Those responsible for moving their agency to a quantum-resistant world should find the following insights worth considering.
With the prospect of powerful quantum computers breaking known encryption methods on the horizon and a federal mandate now in place, the good news is that quantum-proof encryption is finally being discussed. The not-so-good news is that it isn’t clear to cybersecurity practitioners what they need to do first. Understanding the threat is not nearly as difficult as understanding the timing, which seems to have left agency personnel at the starting gate of a planning process fraught with challenges – and urgency.
Why is the timeline so difficult to establish? Because there is no way of knowing when a quantum-based attack will take place. The Quantum-safe Security Working Group of the Cloud Security Alliance (CSA) chose the date, April 14, 2030, to represent “Y2Q,” also known as “Q-Day” – the moment secure IT infrastructure becomes vulnerable to the threat of a fault-tolerant quantum computer running Shor’s algorithm. The Biden Administration based its implementation timeline on the day that NIST announced the four winning algorithms for standardization. Then there is the “hack now, decrypt later” timeline which suggests that quantum-related attacks may already be underway.
Regardless of the final timeline or potential drivers, one thing that was clear to the panel attendees was that they need to start the transition now.
I get this question often and was not disappointed when one attendee asked, “How can I convince my agency leadership that migrating to quantum-proof encryption is a priority when they are still trying to tackle basic cyber threats?”
The panelists agreed that the U.S. government’s data storage requirements are unique in that classification periods typically run 20 years. This means that systems in development today, which will typically be fielded over the next 10 years, will actually have a storage shelf life of 30 years minimum. Those systems need to be “future-proofed” today, a framing that should be effective when trying to convince agency leaders of the priority.
The need to future-proof is driven by a variety of scenarios, such as equipment and software upgrades. Upgrading or replacing equipment and software takes a long time in general, and often longer for government entities; updating all of the software that has cryptography in place will take longer still.
The panelists also agreed that given the extensive supply chain supporting federal systems, vendors are a critical component to the overall success of an agency’s future-proofing for the quantum age. In 10-15 years, there will be some government partner/vendor somewhere who will not have transitioned to quantum-proof encryption. For leaders who have not yet prioritized their agency’s cryptography migration, let them ponder that thought — and start to focus on the need to prepare.
The panel shared several past technology migrations that they considered similar to the coming migration to quantum-resistant cryptography.
Y2K resembles the looming quantum threat in both the urgency and the scale of the government’s need to migrate systems. However, without a deadline attached to the encryption migration, Y2K is really only similar in scale.
The panelists also recalled the industry-wide migration away from the SHA-1 hash function, but concluded that replacing current encryption will demand far more time, effort, and energy, and will be far more ubiquitous.
While previous technology migrations help to establish lessons learned for the government’s quantum-proof cryptography migration, the panel concluded that this go-round will have a very unique set of challenges — the likes of which organizations have never had to tackle before.
The consensus among panelists was that agencies need to first understand what data they have today and how vulnerable it is to attack. Data that is particularly sensitive, and vulnerable to the “hack-now, decrypt-later” attacks, should be prioritized above less sensitive data. For some organizations, this is a very challenging endeavor that they’ve never embarked upon before. Now is an opportune time to build inventory data and keep it up to date. From a planning and migration perspective, this is an agency’s chance to do it once and do it well.
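As an illustrative sketch only, a cryptographic inventory can be modeled as records ranked for migration priority; the field names, scoring weights, and algorithm labels below are hypothetical and not drawn from any agency guidance:

```python
from dataclasses import dataclass

# Hypothetical inventory record: fields are illustrative, not an official schema.
@dataclass
class CryptoAsset:
    name: str
    algorithm: str          # e.g. "RSA-2048", "AES-256"
    sensitivity: int        # 1 (low) .. 5 (high)
    retention_years: int    # how long the data must stay confidential

# Public-key algorithms vulnerable to Shor's algorithm on a large quantum computer.
QUANTUM_VULNERABLE = {"RSA-2048", "ECDSA-P256", "DH-2048"}

def migration_priority(asset: CryptoAsset) -> int:
    """Higher score = migrate sooner: vulnerable algorithms guarding
    long-lived, sensitive data top the list ("hack now, decrypt later")."""
    vulnerable = asset.algorithm in QUANTUM_VULNERABLE
    return (10 if vulnerable else 0) + asset.sensitivity + asset.retention_years // 10

inventory = [
    CryptoAsset("personnel-records", "RSA-2048", 5, 30),
    CryptoAsset("public-website-tls", "ECDSA-P256", 1, 1),
    CryptoAsset("archived-press-releases", "AES-256", 1, 5),
]
for asset in sorted(inventory, key=migration_priority, reverse=True):
    print(asset.name, migration_priority(asset))
```

The point of the sketch is the ranking, not the numbers: once an inventory exists in any structured form, prioritizing it for "hack-now, decrypt-later" exposure becomes a simple, repeatable query rather than a one-off exercise.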
It is important to assume from the start that the vast majority of organizations will need to migrate multiple times. Panelists emphasized the need for “crypto agility” that will enable future replacement of algorithms to be made easily. Crypto agility is about how easy it is to transition from one algorithm (or choice of parameters) to another. Organizations that prioritize long-term thinking should already be looking at this.
The panelists added that communicating with vendors early on in the planning process is vital. As one panelist explained, “A lot of our service providers, vendors, etc. will be flipping switches for us, but a lot won’t. Understanding what your priorities are for flipping the switch and communicating it to your vendors is important.”
Matt Scholl of NIST shared what NIST is doing to provide guidance, tips, and answers to questions such as “What are discovery tools?” and “How do I budget?” The project, announced in July 2022, is working to develop white papers, playbooks, demonstrations, and tools that can help other organizations implement their conversions to post-quantum cryptography. Other resources that offer good guidance, according to Scholl, include recent federal guidance and resources from DHS.
One additional resource that has been extremely helpful for our CISO customers is Quantinuum’s guide for CISOs. The guide outlines what CISOs from any organization should be doing now and provides a basic transition roadmap to follow.
The discussion wrapped up with the acknowledgement that quantum has finally become part of the mainstream cybersecurity discussion and that the future benefit of quantum computing far outweighs the challenges of transitioning to new cryptography. As a parting thought, I emphasized the wonderful opportunity that agencies have to rethink how they do things and encouraged attendees to secure management commitment and funding for this much-needed modernization.
I want to give a special thanks to my fellow panelists for the engaging discussion: Margaret Salter, Director, Applied Cryptography, AWS; Dr. Mark Sherman, Director, Cybersecurity Foundations, CMU; Matthew Scholl, Chief of the Computer Security Division, ITL, NIST; and Dr. Adrian Stanger, Senior Cryptographic Authority, Cybersecurity Directorate, NSA.
Quantinuum, the world’s largest integrated quantum company, pioneers powerful quantum computers and advanced software solutions. Quantinuum’s technology drives breakthroughs in materials discovery, cybersecurity, and next-gen quantum AI. With over 500 employees, including 370+ scientists and engineers, Quantinuum leads the quantum computing revolution across continents.
Progress in quantum computing is measured by hardware advances plus the algorithms and quantum error-correction codes that turn quantum systems into useful computational tools.
Thanks to recent hardware advances, researchers are increasingly sharpening their tools to probe the performance of quantum algorithms and understand how they behave in realistic conditions – where stability, system architecture and algorithm design all shape performance.
A new Denmark-based collaboration between the University of Southern Denmark (SDU), Quantinuum, and the Danish e-Infrastructure Consortium (DeiC) will utilize Quantinuum Helios. Researchers at SDU’s Centre for Quantum Mathematics, led by Jørgen Ellegaard Andersen, will use Helios to pursue research into topological quantum computing.
Their work could help explain how and why successful quantum algorithms perform as they do, informing the development of high-performance algorithms suited to emerging quantum systems. They’re exploring the scientific foundations that support future quantum applications across areas including pharmaceuticals, finance, and defense.
“We are thrilled to gain access to Quantinuum’s high-fidelity Helios system. This collaboration gives us a unique opportunity to test the limits of our algorithms and evaluate system performance, while advancing fundamental research and laying the foundation for future applications.”
— Professor Jørgen Ellegaard Andersen, Director of the Centre for Quantum Mathematics at University of Southern Denmark
Topological quantum computing is an area of research that connects quantum computation with deep mathematical structures. It includes the study of error-correcting codes known as surface codes, which encode logical quantum information in the global properties of a system of many physical qubits.
The research team will explore how these codes behave, and how they may support the development of fault-tolerant quantum algorithms in practical implementations under realistic conditions.
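As a toy illustration of the general idea of redundant encoding (a three-bit repetition code is a far simpler relative of a surface code, and nothing here reflects the SDU team’s actual work):

```python
import random

# Toy three-bit repetition code: encode one logical bit redundantly,
# then recover it by majority vote even after a single bit-flip error.
def encode(bit: int) -> list[int]:
    return [bit, bit, bit]

def apply_bit_flip(codeword: list[int], position: int) -> list[int]:
    flipped = codeword.copy()
    flipped[position] ^= 1
    return flipped

def decode(codeword: list[int]) -> int:
    # Majority vote: any single flip leaves two correct copies.
    return int(sum(codeword) >= 2)

logical = 1
noisy = apply_bit_flip(encode(logical), random.randrange(3))
assert decode(noisy) == logical  # any single error is corrected
```

Surface codes apply the same principle in two dimensions with quantum states, where the "majority vote" becomes syndrome measurement and decoding; the overhead, noise, and scaling questions mentioned above are exactly what make the practical version hard.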
This distinction between theory and practical implementation matters. In theory, topological approaches offer a rich framework for designing algorithms and error-correcting codes. In practice, researchers need to understand how those ideas perform when implemented on real systems, where questions of noise, stability, overhead, and scaling become central. The collaboration will allow the SDU team to investigate these questions directly.
Beyond individual algorithms and codes, the research will also develop tools for benchmarking quantum processors. The goal is to develop new ways to characterize fidelity and stability in regimes that can be difficult to access.
The team will also explore hybrid quantum–classical approaches, including machine-learning techniques assisted by quantum hardware, to study the mathematical structures at the heart of topological quantum computing. This work reflects a broader field of research in which quantum and classical methods are used together, each contributing to parts of a computational problem.
The collaboration reflects the growing role of national quantum infrastructure in supporting research and talent development. Denmark has a long tradition of scientific innovation, and this collaboration is intended to support the country’s continued development in quantum technology.
The initiative is supported by DeiC, which played a central role in securing funding and enabling access to Quantinuum’s systems. DeiC has been assigned a particular role in developing and coordinating quantum infrastructure initiatives for the benefit of universities and industry, operating without its own commercial, sectoral, or geographical interests. This includes securing dedicated access to quantum computers, providing advisory services, and supporting the development of new talent in the Danish quantum sector.
“DeiC’s special effort to secure funding and access for this research initiative is rooted in our organization’s role in relation to the Danish Government’s strategy for quantum technology.”
— Henrik Navntoft Sønderskov, Head of Quantum at Danish e-Infrastructure Consortium
This collaboration promises to accelerate the development of practical algorithms. It is grounded in fundamental science – but its focus is practical: discovering and testing mathematical approaches to topological quantum computing that can be implemented, evaluated, and improved on real quantum hardware.
That work requires both theoretical insight and access to a system such as Helios capable of supporting meaningful scientific work.

This month, Quantinuum welcomed its global user community to the first-ever Q-Net Connect, an annual forum designed to spark collaboration, share insights, and accelerate innovation across our full-stack quantum computing platforms. Over two days, users came together not only to learn from one another, but to build the relationships and momentum that we believe will help define the next chapter of quantum computing.
Q-Net Connect 2026 drew over 170 attendees from around the world to Denver, Colorado, including representatives from commercial enterprises and startups, academia and research institutions, and the public sector and non-profits, all users of Quantinuum systems.
The program was packed with inspiring keynotes, technical tracks, and customer presentations. Attendees heard from leaders at Quantinuum, as well as our partners at NVIDIA, JPMorganChase and BlueQubit; professors from the University of New Mexico, the University of Nottingham and Harvard University; national labs, including NIST, Oak Ridge National Laboratory, Sandia National Laboratories and Los Alamos National Laboratory; and other distinguished guests from across the global quantum ecosystem.
The mission of the Quantinuum Q-Net user community is to create a space for shared learning, collaboration and connection for those who adopt Quantinuum’s hardware, software and middleware platform. At this year’s Q-Net Connect, we recognized four organizations for their notable efforts to champion this mission.
Congratulations, again, and thank you to everyone who contributed to the success of the first Q-Net Connect!
Q-Net offers year‑round support through user access, developer tools, documentation, trainings, webinars, and events. Members enjoy many exclusive benefits, including being the first to hear about new content, publications and promotional offers.
By joining the community, you will be invited to exclusive gatherings to hear about the latest breakthroughs and connect with industry experts driving quantum innovation. Members also get access to Q‑Net Connect recordings and stay connected for future community updates.

In a follow-up to our recent work with Hiverge using AI to discover algorithms for quantum chemistry, we’ve teamed up with Hiverge, an algorithm discovery company, Amazon Web Services (AWS) and NVIDIA to explore using AI to improve algorithms for combinatorial optimization. With the rapid rise of Large Language Models (LLMs), people have started asking: what if AI agents could serve as on-demand algorithm factories? This collaboration explores how LLMs can accelerate quantum computing research.
Hiverge – named for Hive, an AI that can develop algorithms – aims to make quantum algorithm design more accessible to researchers by translating high-level problem descriptions in mostly natural language into executable quantum circuits. The Hive takes the researcher’s initial sketch of an algorithm, as well as special constraints the researcher enumerates, and evolves it into a new algorithm that better meets the researcher’s needs. The output is expressed in a familiar programming language, like Guppy or CUDA-Q, making it particularly easy to implement.
The AI is called a “Hive” because it is a collective of LLM agents, all of which are editing the same codebase. In this work, the Hive was made up of LLM powerhouses such as Gemini, ChatGPT, Claude, and Llama, as well as models accessed through AWS’s Amazon Bedrock service. Many models are included because researchers know that diversity is a strength – just like a team of human researchers working in a group, a variety of perspectives often leads to the strongest result.
Once the LLMs are assembled, the Hive calls on them to do the work writing the desired algorithm; no new training is required. The algorithms are then executed and their ‘fitness’ (how well they solve the problem) is measured. Unfit programs do not survive, while the fittest ones evolve to the next generation. This process repeats, much like the evolutionary process of nature itself.
After evolution, the fittest algorithm is selected by the researchers and tested on other instances of the problem. This is a crucial step as the researchers want to understand how well it can generalize.
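The evolutionary loop described above can be sketched in miniature. Here the “candidates” are plain parameter vectors rather than LLM-written quantum programs, and the fitness function is a toy; the select-and-mutate cycle is the part that mirrors the Hive:

```python
import random

random.seed(0)  # deterministic run for illustration

# Toy stand-in for the Hive's loop: candidates evolve toward a target.
TARGET = [3.0, -1.0, 2.0]

def fitness(candidate):
    # Higher is better: negative squared distance from the target.
    return -sum((c - t) ** 2 for c, t in zip(candidate, TARGET))

def mutate(candidate):
    # Small random edits, standing in for an LLM rewriting a program.
    return [c + random.gauss(0, 0.1) for c in candidate]

population = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(20)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    survivors = population[:5]  # unfit candidates do not survive
    population = survivors + [mutate(random.choice(survivors)) for _ in range(15)]

best = max(population, key=fitness)
print(fitness(best))  # close to 0: the fittest candidate approaches the target
```

In the actual workflow, `fitness` is the expensive step, evaluated by simulating the generated quantum algorithm, and `mutate` is an LLM edit rather than Gaussian noise, but the survival-of-the-fittest structure is the same.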
In this most recent work, the joint team explored how AI can assist in the discovery of heuristic quantum optimization algorithms, a class of algorithms aimed at improving efficiency across critical workstreams. These span challenges like optimal power grid dispatch and storage placement, arranging fuel inside nuclear reactors, and molecular design and reaction pathway optimization in drug, material, and chemical discovery—where solutions could translate into maximizing operational efficiency, dramatic reduction in costs, and rapid acceleration in innovation.

In other AI approaches, such as reinforcement learning, models are trained to solve a problem, but the resulting "algorithm" is effectively ‘hidden’ within a neural network. Here, the algorithm is written in Guppy or CUDA-Q (or Python), making it human-interpretable and easier to deploy on new problem instances.
This work leveraged the NVIDIA CUDA-Q platform, running on powerful NVIDIA GPUs made accessible by AWS. Its state-of-the-art accelerated computing was crucial; the research explored highly complex problems, challenges that lie at the edge of classical computing capacity. Before running anything on Quantinuum’s quantum computer, the researchers first used NVIDIA accelerated computing to simulate the quantum algorithms and assess their fitness. Once a promising algorithm is discovered, it can then be deployed on quantum hardware, creating an exciting new approach for scaling quantum algorithm design.
More broadly, this work points to one of many ways in which classical compute, AI, and quantum computing are most powerful in symbiosis. AI can be used to improve quantum, as demonstrated here, just as quantum can be used to extend AI. Looking ahead, we envision AI evolving programs that express a combination of algorithmic primitives, much like human mathematicians, such as Peter Shor and Lov Grover, have done. After all, both humans and AI can learn from each other.