The risk of quantum computing to classical cryptography

Introduction

The landscape of cybersecurity stands at a critical juncture. For decades, classical cryptographic systems like RSA and Elliptic Curve Cryptography have been the backbone of digital security – protecting sensitive data, securing communications, and underpinning trust across industries. But the advent of quantum computing is poised to disrupt this foundation. Quantum computers have the potential to render many of today’s cryptographic schemes obsolete.

If cryptographic systems were to break under quantum attacks, the implications would be catastrophic. Confidential communications—whether they belong to governments, corporations, or individuals—could be exposed. Financial transactions, online banking, and digital identities would become vulnerable. Critical infrastructure, from power grids to healthcare systems, could be compromised.

What makes this threat particularly urgent is the “harvest now, decrypt later” strategy potentially already being employed by malicious actors. Adversaries are likely stockpiling encrypted data today, with the intent of decrypting it in the future when quantum capabilities mature. This means that even data encrypted under the strongest public key cryptography systems is at risk of exposure in the future.

The urgency of this situation is underscored by NIST’s warning: organizations must transition to quantum-resistant cryptography by 2035 at the latest. This isn’t just a challenge for cryptographers and security specialists—it’s a call to action for technical professionals across all sectors. The decisions made today will determine whether the digital world can withstand the quantum era or succumb to the risks it brings.

1. Understanding the Quantum Threat

Cryptographic schemes can broadly be divided into two categories: symmetric cryptography and asymmetric cryptography (also called public key cryptography).

Public key cryptography

Our current digital security infrastructure relies heavily on public key cryptography, including systems like RSA and Elliptic Curve Cryptography. These systems derive their strength from mathematical problems that are computationally difficult to solve using classical computers.

In general, a public key encryption system has two components: a public key and a private key. Encryption takes a message and applies a mathematical operation to it, using the public key, to produce a random-looking ciphertext. Decryption applies a different operation to the ciphertext, using the private key, to recover the original message. Signing works similarly: the private key is used to create a signature, and the public key is used to verify it.

RSA

The RSA scheme is the most popular and best understood public key cryptography system. RSA’s security comes from the fact that while it’s easy to multiply two large prime numbers to generate a product, it’s extremely hard to reverse the process and determine the original primes, even with significant computational resources — this is known as the Integer Factorization Problem.

Factoring small numbers into primes is straightforward. For example, take the composite number 15: it’s the product of two primes, 3 and 5. Similarly, 35 factors into 5 and 7. The simplicity of these calculations lies in the small size of the numbers involved.
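To make this concrete, here is a minimal trial-division sketch in Python (purely illustrative, not part of any real cryptosystem). It factors numbers like 15 or 35 instantly, but its running time blows up as the factors grow, which is exactly the asymmetry RSA exploits.

```python
def trial_division(n: int) -> list[int]:
    """Factor n by testing divisors up to sqrt(n) -- trivial for small n."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)   # whatever remains is prime
    return factors

print(trial_division(15))  # [3, 5]
print(trial_division(35))  # [5, 7]
```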

However, RSA relies on much larger numbers for security. Typical key sizes in RSA are as follows:

  • 2048 bits: This corresponds to a number roughly $2^{2048}$ in size, or about $10^{617}$. Factoring such a number, even with all the computational power available today, is infeasible.
  • 3072 bits: For higher security, especially against future advancements, this key size provides an even larger challenge for factorization.
  • 4096 bits: Used in highly sensitive contexts, this key size offers the most robust defense but comes with increased computational overhead.

These large key sizes have made RSA secure so far, because no known algorithms running on classical computers can efficiently factorize such enormous numbers.
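As a minimal sketch of the workflow described earlier, the snippet below uses the Python cryptography package (one illustrative library choice among several) to generate a 2048-bit RSA key pair, encrypt and decrypt a message, and sign and verify it.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Generate a 2048-bit RSA key pair.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"attack at dawn"

# Encrypt with the public key, decrypt with the private key.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(message, oaep)
assert private_key.decrypt(ciphertext, oaep) == message

# Sign with the private key, verify with the public key.
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(message, pss, hashes.SHA256())
public_key.verify(signature, message, pss, hashes.SHA256())  # raises if tampered
```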

Elliptic Curve Cryptography (ECC)

ECC's advantage lies in its efficiency: it provides the same level of security as RSA with much smaller keys, leading to faster computations, reduced storage requirements, and lower bandwidth usage. For example, a 256-bit ECC key offers comparable security to a 3072-bit RSA key, making ECC particularly suitable for systems with constrained resources, like mobile devices and IoT.

Factoring small integers showcases the simplicity of RSA's problem space; for ECC, the analogous hard problem is computing discrete logarithms on elliptic curves, known as the Elliptic Curve Discrete Logarithm Problem (ECDLP). Unlike integer factorization, no sub-exponential classical algorithm is known for the ECDLP, which is why ECC achieves strong security with comparatively small keys.

Typical key sizes in ECC are as follows:

  • 256 bits: Equivalent in security to a 3072-bit RSA key. Widely used in practice, it balances strong security with excellent performance for encryption, decryption, and digital signatures.
  • 384 bits: Offers enhanced security, equivalent to a 7680-bit RSA key, and is commonly used in high-security environments.
  • 521 bits: Provides the highest security level, comparable to a 15,360-bit RSA key, for highly sensitive data, albeit with slightly higher computational costs.

ECC is increasingly favored for modern cryptographic applications because of its efficiency and strong security guarantees. Its small key sizes make it particularly well-suited for secure communication protocols, digital signatures, and systems requiring high performance with limited resources.
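As a hedged sketch of how ECC is typically used for digital signatures in practice (again with the Python cryptography package as an illustrative choice), the snippet below signs and verifies a message with a 256-bit curve (NIST P-256).

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Generate a key pair on the NIST P-256 curve (a 256-bit key,
# roughly comparable in classical security to RSA-3072).
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

message = b"transfer $100 to account 42"

# ECDSA signature with the private key ...
signature = private_key.sign(message, ec.ECDSA(hashes.SHA256()))

# ... verified with the public key; raises InvalidSignature on tampering.
public_key.verify(signature, message, ec.ECDSA(hashes.SHA256()))
```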

Symmetric cryptography

Unlike public key cryptography, symmetric cryptography relies on a single shared key for both encryption and decryption. Among the various symmetric encryption schemes, the Advanced Encryption Standard (AES) has become the global standard for securing data due to its balance of performance, flexibility, and robust security.

AES

AES supports key sizes of 128 bits, 192 bits, and 256 bits, each providing different levels of security:

  • AES-128: Offers strong security and is suitable for most applications, with computational power currently unable to break it in practical timeframes.
  • AES-192: Provides additional security at a slight computational cost.
  • AES-256: The highest level of security, recommended for extremely sensitive data or future-proofing against advancements in computational capabilities.

AES derives its security from several key factors:

  • Key Size: The large key sizes (128, 192, or 256 bits) mean that brute-forcing the key space would require $2^{128}$, $2^{192}$, or $2^{256}$ attempts respectively, an astronomically large number even for the most powerful classical computers.
  • Multiple Rounds: AES performs multiple rounds of substitution, permutation, mixing, and key addition, making it resistant to cryptanalysis techniques.
  • Design Principles: AES's design has no known backdoors, has withstood decades of rigorous mathematical analysis, and careful implementations can resist side-channel attacks, such as timing or power analysis.
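For contrast with the public key examples above, here is a minimal AES-256-GCM sketch using the Python cryptography package (an illustrative choice); note that the same shared key is used to encrypt and to decrypt.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # AES-256: a 2^256 key space
aesgcm = AESGCM(key)

nonce = os.urandom(12)                      # 96-bit nonce; never reuse with the same key
plaintext = b"symmetric crypto protects the bulk of our data"

ciphertext = aesgcm.encrypt(nonce, plaintext, None)          # None = no associated data
assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext  # same key decrypts
```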

Quantum Computers

While a comprehensive examination of quantum information theory and computation extends beyond this article's scope, a technical overview of the key concepts is essential for understanding the cryptographic implications.

Quantum mechanics describes the behavior of matter and energy at the atomic and subatomic scales. At this level, classical physics breaks down and we observe phenomena like quantum tunneling, superposition, and entanglement. These quantum mechanical properties form the foundation of quantum computing, enabling computational approaches fundamentally different from classical methods.

In classical computing, information is encoded in bits, which exist in deterministic states of either 0 or 1. Quantum computing uses quantum bits (qubits), which can exist in a superposition of states, represented as a linear combination of the |0⟩ and |1⟩ basis states. Describing the joint state of n qubits requires $2^n$ complex amplitudes, so the state space grows exponentially with the number of qubits rather than linearly as it does for classical bits. Furthermore, quantum entanglement correlates qubits in ways impossible for classical bits, and together these properties allow certain algorithms to achieve exponential speedups over their classical counterparts. This exponential scaling is what makes quantum computers particularly threatening to classical cryptographic systems.
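In notation (the standard textbook form, not specific to any particular hardware), a single qubit is a normalized superposition of the two basis states, and an n-qubit register needs $2^n$ amplitudes to describe:

```latex
% A single qubit: a normalized linear combination of the basis states
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^{2} + |\beta|^{2} = 1

% An n-qubit register: 2^n complex amplitudes c_x describe its state,
% which is the exponential scaling referred to above
|\Psi\rangle = \sum_{x \in \{0,1\}^{n}} c_x\,|x\rangle,
\qquad \sum_{x} |c_x|^{2} = 1
```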

Impact on classical cryptography

Quantum computers are not universally faster than classical computers but excel at solving certain types of problems that are computationally hard for classical systems. These include problems in optimization, simulation, and, most notably for cryptography, those involving factoring and unstructured search. Two quantum algorithms—Shor’s algorithm and Grover’s algorithm—stand out for their profound implications on classical cryptographic systems.

Shor’s Algorithm

Shor’s algorithm, developed by mathematician Peter Shor in 1994, provides efficient quantum solutions to both the integer factorization problem and the discrete logarithm problem. As we have seen, these two problems form the basis for the security of RSA and ECC respectively. For RSA, Shor’s algorithm running on a sufficiently large quantum computer can factor the large numbers used in RSA keys in polynomial time, whereas the best known classical algorithms require super-polynomial time. This would render RSA-encrypted data completely insecure. See Section 4 for an analysis of the current state of Shor’s algorithm for solving the integer factorization problem. Similarly, Shor’s algorithm solves the discrete logarithm problem, undermining the security of ECC-based systems. In summary, Shor’s algorithm poses a risk to public key cryptography.
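The core of Shor's algorithm is order finding: given a random base a, find the period r of a^x mod N, from which factors of N follow by classical post-processing. The sketch below simulates that reduction on a toy number, finding the order by brute force; only the order-finding step is quantum in Shor's algorithm, so this illustrates the reduction rather than the quantum speedup itself.

```python
import math
import random

def shor_reduction(N: int) -> tuple[int, int]:
    """Factor N via the order-finding reduction used by Shor's algorithm.
    The order is found by brute force here; that is the only step a
    quantum computer performs exponentially faster."""
    while True:
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:                      # lucky guess: a already shares a factor with N
            return g, N // g
        # Find the order r: the smallest r > 0 with a^r = 1 (mod N).
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        if r % 2 == 1:
            continue                   # odd order: try another base
        y = pow(a, r // 2, N)
        if y == N - 1:
            continue                   # trivial square root: try another base
        p = math.gcd(y - 1, N)
        if 1 < p < N:
            return p, N // p

print(shor_reduction(15))              # e.g. (3, 5)
```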

Grover’s Algorithm

With classical computers, the most efficient method for finding the correct item in an unstructured data set is to check each item in turn; this is known as brute force search. Grover's algorithm, developed in the mid-1990s, offers a quadratic speedup for this unstructured search problem. As noted above, AES relies on the computational infeasibility of brute-forcing the correct key when sufficiently large key sizes are used. Running on a sufficiently powerful quantum computer, Grover's algorithm would halve the effective key strength rather than the search time: an AES-128 key would offer a security level of $2^{64}$ instead of $2^{128}$, making it significantly weaker. Unlike Shor's algorithm, however, the impact of Grover's algorithm can be mitigated by increasing AES key lengths. Consequently, larger key sizes like AES-256 are increasingly recommended as a defense against Grover's algorithm. In summary, Grover's algorithm poses a risk to symmetric cryptography, but it is less catastrophic than the risk posed to public key cryptography by Shor's algorithm.
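In terms of raw numbers, Grover's speedup halves the exponent rather than the running time; a short summary of the scaling, with $N = 2^{k}$ for a $k$-bit key:

```latex
% Grover search over an unstructured key space of size N = 2^k
\text{classical brute force: } O\!\left(2^{k}\right)
\qquad
\text{Grover: } O\!\left(\sqrt{2^{k}}\right) = O\!\left(2^{k/2}\right)

% e.g. AES-128: 2^{128} -> 2^{64} effective;  AES-256: 2^{256} -> 2^{128} effective
```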

2. Harvest Now, Decrypt Later

Even if we upgrade our cryptographic systems to quantum-secure cryptography before quantum computers arrive, there is still a significant risk: the "harvest now, decrypt later" (HNDL) attack. In this scenario, adversaries collect encrypted data today and store it until they have access to quantum computers powerful enough to break the encryption. This is particularly concerning because, as of 2024, almost all internet communication relies on cryptography vulnerable to quantum attack, so vast amounts of encrypted data could theoretically be intercepted and stored until it can be decrypted. Much of today's encrypted data may be useless by the time a HNDL attack can decrypt it; a one-time expiring password generated today is of no use in 5-10 years' time. However, data such as state or diplomatic secrets, personal or corporate emails, or financial records may remain sensitive for decades.

While specific instances of such data harvesting are challenging to confirm due to the covert nature of these activities, there is substantial concern within the cybersecurity community that nation-state actors and cybercriminals are already engaging in this practice. Historical precedents like the Venona Project—where U.S. intelligence spent decades decrypting thousands of Soviet messages intercepted during World War II—demonstrate the lengths to which agencies are willing to go to decrypt historical data. Both NIST and the EU Agency for Cybersecurity (ENISA) have highlighted the risks associated with HNDL, urging immediate action to safeguard long-term data confidentiality.

3. Critical Systems at Risk

Whilst we have discussed the impact of quantum computation on classical cryptography, it is worth spelling out which day-to-day systems would be affected by a breakdown in classical cryptography. This is by no means an exhaustive list; in reality the true blast radius is much, much wider.

Web Traffic

The backbone of secure web communication relies on Transport Layer Security (TLS) — which, for the older readers, is an improved version of SSL. TLS ensures that data transmitted between a user and a website is encrypted, authenticated, and tamper-proof. This security layer is critical for almost all interactions we have with the internet, from online banking to instant messaging. TLS relies heavily on public key cryptographic systems like RSA and ECC.

Impact of broken cryptography on TLS

All web traffic becomes readable. Passwords, credit card numbers, personal and professional communication can all be intercepted and read by attackers. We would have no way to perform any confidential action over the internet.

On top of this, we would have no way to verify the authenticity of websites. Attackers could easily forge certificates and claim to be any website they wish, e.g. google.com, leading to widespread phishing and credential theft. Without certificate verification, users would have no assurance that they are interacting with legitimate services.

VPNs

Virtual Private Networks (VPNs) rely on cryptographic protocols to ensure the secure transmission of data over public and private networks. By encrypting traffic between a user and a VPN server, VPNs protect sensitive information such as browsing activity, login credentials, and communications from prying eyes. However, VPNs are only as secure as the cryptographic schemes they use, which often include RSA, ECC, and AES for key exchange and encryption.

Impact of broken cryptography on VPNs

One of the main reasons for using a VPN is privacy — especially when accessing sensitive or private resources. For example, users rely on VPNs to hide activities from government surveillance or to bypass restrictions. Once classical cryptography is broken, users would have no assurance that their browsing habits are not being tracked by malicious actors or government surveillance bodies. On top of this, all previously encrypted VPN traffic could be decrypted retroactively by attackers engaging in a HNDL-style attack. This means that sensitive historical data, such as websites visited and any information exchanged, would become accessible.

Cryptocurrencies

As of November 2024, the market cap of the largest cryptocurrency, Bitcoin, sits at $1.9 trillion. Blockchains are unique in that their security relies totally on classical cryptography, particularly ECC, to secure private keys and sign transactions. Every transaction ever made is stored immutably on the blockchain, creating a transparent and verifiable record of activity. However, this transparency also means that if the cryptography securing it is broken, the consequences would be catastrophic.

Impact of broken cryptography on Cryptocurrencies

The primary security risk comes from the ability of a sufficiently powerful quantum computer to break ECC, which Bitcoin and many other cryptocurrencies depend on. As discussed, Shor’s algorithm, when implemented on a quantum computer, could recover private keys from public keys, enabling attackers to steal funds and sign fraudulent transactions – undermining the trust and integrity of the entire blockchain.

The impact on cryptocurrencies would extend beyond financial loss. The very trust that underpins the blockchain system would be destroyed. Without significant advances in post-quantum cryptography and its integration into blockchain protocols, the future of cryptocurrencies in a quantum world is uncertain.

Messaging

Internet-based messaging platforms such as WhatsApp, iMessage, and Signal are essential tools for daily communication. These apps boast security through end-to-end encryption (E2EE), ensuring that only the sender and recipient can read the messages. E2EE relies on a combination of public key cryptography (e.g., RSA or ECC) for secure key exchanges and symmetric cryptography (e.g., AES) for encrypting the message content.

Impact of broken cryptography on messaging

Complete compromise of historical messages. Messages sent today that rely on current cryptographic systems could be decrypted retroactively by attackers using HNDL attacks. Sensitive personal, professional, or even diplomatic conversations could be exposed years into the future.

Loss of forward secrecy. Many messaging apps implement forward secrecy, where keys are ephemeral and never reused. This protects past sessions against future compromises of keys or passwords. However, if public key cryptography is broken, these ephemeral keys could be derived from recorded traffic, rendering forward secrecy ineffective and exposing past conversations.

Just as worrying is the potential for impersonation in the absence of secure cryptography. Attackers could pose as participants, alter messages, or intercept them in transit, leading to a complete loss of trust in digital communications.

Tip of the Iceberg

Unfortunately, the critical systems listed above are merely the tip of the iceberg. Any digital information that requires a level of security or integrity is at risk if it is secured with classical cryptographic methods. However, companies have begun the task of migrating to post-quantum cryptography. For more information on these mitigation strategies and timelines, see Section 5 onwards.

4. Quantum computing progress and limitations

For as long as quantum computing has existed, theory has been far ahead of practice. Since its publication in 1994, Shor’s algorithm has made integer factorization the textbook application of quantum computers.

As a result, integer factorization is often used as a benchmark for assessing a quantum computer’s capability. The challenge is to build a quantum computer large and fast enough to factor numbers at a speed that is impossible for classical computers. The large numbers of interest are the moduli of RSA public keys.

It should be noted that although using integer factorization as a measure of quantum capability is reasonable at a high level, it can be unhelpful for understanding the true state of play. In reality, improvements in other areas such as error correction, coherence times, and logical qubit count give a more accurate picture of quantum computing progress.

Integer factorization as a measure of quantum capability

Size of integers used in cryptography

As discussed, classical public key cryptosystems like RSA derive their security from the difficulty of factoring large integers. It is worth gauging just how large the integers used in these systems really are. Within RSA, two common key sizes are 2048 and 4096 bits. A 2048-bit key produces an integer between $2^{2047}$ and $2^{2048}$, roughly $3\times10^{616}$, or about 617 decimal digits long. For RSA 4096, this doubles to an integer roughly 1,234 digits long.

To put this into context, the number of grains of sand on Earth is estimated to be approximately $10^{23}$, a 24-digit number. Pushing this further, the estimated number of atoms in the observable universe is about $10^{80}$, an 81-digit number. Thus, a 1,234-decimal-digit RSA modulus is approximately $10^{1153}$ times the estimated number of atoms in the observable universe.
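These magnitudes are easy to verify with arbitrary-precision integers; a quick sketch in Python:

```python
# Decimal digit counts of the largest 2048-bit and 4096-bit integers.
print(len(str(2**2048 - 1)))                 # 617
print(len(str(2**4096 - 1)))                 # 1234

# How many "observable universes of atoms" (~10^80) fit into a ~4096-bit modulus:
print(len(str((2**4096 - 1) // 10**80)))     # 1154 digits, i.e. roughly 10^1153
```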

State of play

Hybrid Approaches vs. Pure Quantum Methods

While pure quantum techniques like Shor’s algorithm have not yet been scaled to factorize large integers, hybrid methods that combine classical pre-processing with quantum computation have been explored with some success.

Pure Quantum Methods

  • As of November 2024, the largest number successfully factored using Shor’s algorithm on a quantum computer is 21 (factored into its prime factors 3 and 7). This is clearly nowhere near the size of the numbers used in modern encryption standards.
  • 2012: Researchers used a nuclear magnetic resonance (NMR) quantum computer to factorize 143 = 11 × 13 at room temperature using an algorithm called adiabatic quantum computation (AQC). This experiment demonstrated the capability of quantum systems to handle slightly larger numbers, though not with Shor’s algorithm. (Source)
  • 2024: A quantum annealer factored 8,219,999 using the D-Wave Pegasus architecture, without relying on hybrid techniques. This marks a significant step forward in demonstrating purely quantum capabilities for specific problem classes. (Source)

Hybrid Quantum Classical Methods

  • 2016: A D-Wave 2X quantum processor used quantum annealing to factorize the 18-bit number 200,099, though classical computation was heavily used to reduce the problem to fit within the quantum processor’s constraints. (Source)
  • 2019: Zapata Computing claimed to have factored 1,099,551,473,989, but their method depended on classical pre-processing to simplify the problem into a three-qubit quantum circuit. Notably, the classical component played a dominant role in the process, raising questions about the genuine contribution of the quantum component (Source).

Challenges with these methods

Critics have pointed out that many of the numbers factored using hybrid techniques are trivial for classical algorithms. For instance, 200,099 can be easily factored using Fermat’s factorization algorithm. Additionally, the pre-processing steps in hybrid systems often reduce the problem size to such an extent that the quantum component plays a supporting role rather than driving the computation. Even the pure quantum methods discussed above do not offer the same speedup as Shor’s algorithm. Without a superpolynomial speedup over classical algorithms, factoring the numbers used by modern encryption standards will not be possible.

Technical and Practical Limitations

The slow progress in factoring larger integers using purely quantum methods is attributable to several key challenges, including:

  • Qubit Requirements: Shor’s algorithm demands a large number of high-quality qubits, far exceeding current capabilities. For example, some assessments indicate that approximately 20 million physical qubits would be necessary to factorize a 2048-bit RSA key within an 8-hour timeframe. More recent analyses suggest that, with optimized algorithms and error correction, the requirement could be reduced to around 13,436 qubits, achieving factorization in about 177 days (Gouzien et al.). These estimates vastly exceed the largest quantum computer in existence today, which has just over 1,000 physical qubits (IBM Condor).
  • Error Rates and Decoherence: Current quantum hardware struggles with maintaining qubit stability long enough to perform meaningful computations.

Recent Progress

Recent advancements in quantum computing have addressed some of these critical challenges in hardware performance and algorithmic efficiency, bringing us closer to practical applications, including the potential to break classical cryptographic systems.

Hardware Advancements

Gate Fidelity Improvements: Researchers have achieved two-qubit gate fidelities up to 99.98% using innovative designs like the double-transmon coupler. This high fidelity is essential for minimizing errors in quantum computations and advancing toward fault-tolerant quantum systems (SciTech Daily).

Error Correction: Google recently announced Willow. This latest quantum chip was shown to reduce errors exponentially as the chip was scaled up to more qubits. This breakthrough cracks a key challenge in quantum error correction that the field has pursued for almost 30 years.

Extended Coherence Times: Enhancements in qubit coherence times have been reported, reaching up to 1 millisecond. Longer coherence times allow quantum systems to perform more complex computations before decoherence effects introduce errors (Meetiqm).

Software Advancements

Algorithmic Optimization: Optimizations of Shor’s algorithm have been implemented using quantum computing frameworks like IBM’s Qiskit. These optimizations focus on reducing the depth and size of quantum circuits, making the algorithm more feasible on current quantum hardware (IEEE Xplore).

Quantum Error Correction (QEC): Developments in QEC codes, such as Low-Density Parity-Check codes, have significantly reduced the physical qubit requirements for running complex algorithms. This approach could enable Shor’s algorithm to factor large integers with fewer qubits than previously estimated (Interesting Engineering).

Major Architectures

Superconducting Qubits

Superconducting qubits are currently the most advanced and widely used type of quantum bit. They exploit the quantum properties of electrical circuits cooled to near-absolute zero temperatures. Key players include Google AI Quantum and IBM.

Neutral Atom Qubits

Neutral atom qubits use individual atoms as quantum bits, trapped and manipulated using lasers. Key players include QuEra and Pasqal.

Trapped Ion Qubits

Trapped ion qubits use charged atoms trapped in electromagnetic fields. Key players include IonQ, and Quantinuum.

Photonic Qubits

Photonic qubits use photons – particles of light – as quantum bits. Key players include PsiQuantum.

Summary

While quantum computing has made strides in demonstrating the theoretical feasibility of breaking classical cryptography, practical achievements are limited. Hybrid quantum-classical methods have extended capabilities but often rely heavily on classical computation, leaving pure quantum breakthroughs relatively modest.

The largest numbers factored with quantum computers remain far smaller than the integers used in real-world cryptographic systems, and the technical challenges of scaling quantum systems to meaningful sizes are formidable.

5. Assessing the Quantum Threat Timeline

Mosca’s Inequality

Historically, when an adversary develops a new offensive technology, the defender must implement countermeasures before that technology is completed. For example, if you know someone is building a 12-foot ladder to climb over your 10-foot wall, you can secure your home by adding a few feet to your wall at any time, provided it is before they finish building their ladder. This situation can be summed up by the following inequality:

$Y \,(\text{time required to upgrade your defenses}) < Z \,(\text{time required for the adversary to develop their attack})$

If this inequality holds true, you are safe. However, if the $<$ becomes a $>$, they will be able to get over your wall.

When we frame the inequality in terms of quantum computing's threat to classical cryptography, it becomes:

$Y \,(\text{time needed to migrate away from classical cryptography}) < Z \,(\text{time until a quantum computer can break classical cryptography})$

However, due to HNDL attacks, this inequality does not guarantee security. Consider the following example:


A credit card company estimates it will take three years to migrate away from classical cryptography, and that an adequate quantum computer will become available in five years. They write out the inequality and deem themselves safe:

$Y \,(\text{3 years}) < Z \,(\text{5 years})$

However, credit cards often have expiration dates that are five years in the future. Any new credit cards generated and encrypted during the next three years—before the migration is complete—could be harvested and later decrypted while they are still valid.

Michele Mosca recognized this problem and introduced Mosca's Inequality, which factors in how long the underlying data needs to remain secret:

$X \,(\text{time the data must remain secret}) + Y \,(\text{time needed to migrate away from classical cryptography}) < Z \,(\text{time until a quantum computer can break classical cryptography})$
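As a trivial sketch, the credit card example above can be restated in terms of Mosca's inequality (the year values are the illustrative ones used earlier, not real estimates):

```python
def mosca_at_risk(x_secret_years: float, y_migration_years: float,
                  z_quantum_years: float) -> bool:
    """Mosca's inequality: data is at risk if X + Y >= Z."""
    return x_secret_years + y_migration_years >= z_quantum_years

# Credit card example: cards stay valid for 5 years, migration takes 3,
# and a capable quantum computer is expected in 5.
print(mosca_at_risk(5, 3, 5))   # True -- cards issued before migration completes are exposed
```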

Unfortunately, each component of this inequality can be difficult to estimate.

How Long the Data Must Remain Secret: $X$

Determining how long a credit card must remain confidential is straightforward because it has a set expiration date. However, most sensitive information doesn't have a discrete expiry date. Instead, its importance gradually diminishes over time. For example, secret information about the 2024 U.S. election may be extremely important right now, but will be less significant by 2040, and might only interest historians by 3040.

Time Needed to Migrate Away from Classical Cryptography: $Y$

Large scale software projects are notorious for being more difficult and costly than expected (examples). Considering that the entire internet currently relies on ECC, transitioning away from it will take years. Hardware often includes optimized operations for ECC, so switching the software alone may not be sufficient—organizations will also need to upgrade their hardware to prevent severe performance degradation for their users. Each system will have its own timeline, and the timelines will rarely be short.

Time Needed to Build a CRQC: $Z$

Of the three components of Mosca's Inequality, $Z$ is the one that sparks the most debate, public interest, and, often, misinformation. It is extremely difficult to estimate accurately when the first cryptographically relevant quantum computer (CRQC) will arrive, but we can still gain some insight by looking at expert opinions.

Every year, the Global Risk Institute surveys dozens of quantum computing experts for their opinions on the timeline of the quantum threat. They publish these results in their Quantum Threat Timeline Report. The plot below shows their 2024 estimates of how many years it will be until a quantum computer can break RSA 2048 within 24 hours.

[Figure: 2024 expert estimates of the probability that a quantum computer will break RSA 2048 within 24 hours, across 5-, 10-, 15-, 20-, and 30-year horizons (Global Risk Institute, Quantum Threat Timeline Report)]

We can see that experts consider the 5-year timeline highly unlikely, with the majority assigning it a probability of less than 1%. At the 10-year mark, the majority still assign a probability of 30% or lower. By the 15-year timeline, a majority of experts assign a probability of 50% or higher. At 20 years, the majority estimate a probability of 70% or higher. Finally, on the 30-year timeline, half of the experts deem it 95% or more likely.

6. Accelerators of Quantum Computing Development

While expert opinions offer valuable insights, making long-term predictions about complex topics is notoriously difficult. Various factors could accelerate progress – and shorten the expected timelines.

International Competition

For centuries, nations have sought to decrypt their adversaries' secret communications. With quantum computing's potential to break ECC, governments understand that whoever gains this capability first will have a significant advantage. Consequently, countries are pouring funding into quantum technologies, with China leading publicly announced funding as of 2023, at $15.3 billion (McKinsey). However, this does not account for any funding that may be allocated secretly.

[Figure: Publicly announced government funding for quantum technologies by country, 2023 (McKinsey)]

While government entities haven't always led in computing technologies, they have often been ahead in cryptography and cryptanalysis. For example, a system similar to RSA was invented behind closed doors at GCHQ in 1973, four years before RSA was publicly invented at MIT. It is therefore possible that intelligence agencies have already made improvements to ECC-breaking quantum algorithms such as Shor's algorithm.

Private Investment

As quantum computers get closer to breaking ECC, governments interested in surveilling adversaries are likely to pour more money into the field to win the race to decryption. However, the amount of venture capital investment is less predictable due to the current uncertainty around commercial applications of quantum computers. If such commercial use cases are discovered, there could be immediate spikes in private funding, driving faster innovation.

Education

Quantum computing is a novel field that combines expertise in computing and physics, so breakthroughs often come from those who have PhD-level training. Many universities, including Oxford and Trinity College, are now offering quantum computing education at the undergraduate and master's levels. This expansion will increase the number of individuals pursuing careers in quantum computing, thereby adding more talent to tackle the most challenging problems.

AI

As AI continues to improve its mathematical and problem-solving capabilities, it may contribute to the discovery of new algorithms for error correction or quantum cryptanalysis. This potential for algorithmic discovery is illustrated by DeepMind’s AlphaTensor, which found a faster method for matrix multiplication. Google Quantum AI founder Hartmut Neven has emphasized the significant “cross-fertilization between AI and quantum computing.”

Historical Precedent

Over the past couple of centuries, technologies have often materialized much sooner than expected. Lord Kelvin, a towering figure in 19th-century physics, stated in 1895 that “Heavier-than-air flying machines are impossible”, only for the Wright brothers to achieve it just eight years later. Examples like this should keep us cautious about how much trust we place in experts’ opinions on timelines for quantum computing.

7. Responses and Mitigation Strategies

Post-Quantum Cryptography (PQC)

Researchers are standardizing cryptographic schemes that do not rely on the integer factorization or discrete logarithm problems, and instead rely on problems which are resistant to attacks by quantum computers. Two primary directions are lattice-based schemes and hash-based schemes.

New Hard Problems

RSA relied on the hard problem of factoring large numbers, while ECC depended on the discrete logarithm problem. Both were eventually found to be vulnerable to quantum algorithms. In response, cryptographers have been carefully selecting new hard problems on which to base post-quantum schemes. Lattice-based cryptography, currently the most popular option, relies on the hardness of the learning with errors problem.
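To give a flavour of what "learning with errors" means, the toy sketch below constructs a noisy linear system b = A·s + e (mod q) with NumPy. Without the small error term e, the secret s would fall out by ordinary linear algebra; with it, recovering s from (A, b) is the LWE problem, believed hard even for quantum computers. The dimensions here are purely illustrative and far smaller than anything a real scheme would use.

```python
import numpy as np

rng = np.random.default_rng(0)
q, n, m = 97, 8, 16          # toy modulus and dimensions; real schemes are far larger

s = rng.integers(0, q, size=n)           # the secret vector
A = rng.integers(0, q, size=(m, n))      # public random matrix
e = rng.integers(-2, 3, size=m)          # small error terms

b = (A @ s + e) % q                      # the LWE samples (A, b) are made public

# Without e, s could be recovered from (A, b) by Gaussian elimination mod q.
# With e, recovering s is the Learning With Errors problem, the hardness
# assumption behind lattice-based schemes such as ML-KEM.
print("public A shape:", A.shape)
print("public b:", b)
```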

NIST Standardization

Since 2016, NIST has been working to standardize PQC schemes. As of 2024, they have officially approved three schemes: two lattice-based schemes (ML-KEM and ML-DSA) and one hash-based scheme (SLH-DSA). NIST continues to evaluate additional candidates to mitigate the risk of unknown vulnerabilities in the currently approved schemes.

Hybrid Implementations

Some organizations remain cautious about newly standardized PQC schemes, but still seek improved post-quantum security. A practical approach is to employ a hybrid method that uses both ECC and a PQC scheme. In this scenario, an attacker would need to break both systems, significantly increasing the overall security.
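One common pattern, sketched below under stated assumptions, derives the session key from the concatenation of a classical X25519 (ECC) shared secret and an ML-KEM shared secret, so an attacker must break both. The X25519 and HKDF calls come from the Python cryptography package; the mlkem_encapsulate helper is a hypothetical placeholder for whichever ML-KEM implementation an organization adopts.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical part: an X25519 (ECC) Diffie-Hellman exchange.
client_ecc = X25519PrivateKey.generate()
server_ecc = X25519PrivateKey.generate()
ecc_secret = client_ecc.exchange(server_ecc.public_key())

# Post-quantum part: ML-KEM encapsulation. `mlkem_encapsulate` is a
# hypothetical placeholder for a real ML-KEM library call that would return
# (ciphertext, shared_secret) for the server's ML-KEM public key.
def mlkem_encapsulate(server_mlkem_public_key: bytes) -> tuple[bytes, bytes]:
    raise NotImplementedError("plug in an actual ML-KEM implementation here")

# _, pqc_secret = mlkem_encapsulate(server_mlkem_public_key)
pqc_secret = b"\x00" * 32   # stand-in value so the sketch runs end to end

# Hybrid: both secrets feed one KDF, so an attacker must break ECC *and* ML-KEM.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None,
    info=b"hybrid-handshake-demo",
).derive(ecc_secret + pqc_secret)
print(len(session_key))  # 32-byte session key
```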

Regulation

In theory, international institutions could agree not to engage in quantum cryptanalysis, or even to slow down quantum computing research until PQC is widely adopted. However, forging and enforcing such agreements would be difficult, primarily due to a lack of trust among major powers. Any deliberate slowdown would also come at a cost, as it would mean ignoring the potential benefits that quantum computing can bring.

Organizations Taking Action

Many leading technology providers have already implemented PQC in production environments. For example, Cloudflare and Chrome both use a hybrid key exchange that combines ML-KEM with a traditional ECC scheme. Messaging platforms are following suit, as demonstrated by iMessage and the Signal Protocol (used in Signal and WhatsApp) adopting similar PQC hybrid approaches.

Challenges in Transition

Performance Degradation

Post-quantum key exchanges and signatures typically involve larger data sizes than ECC-based schemes, leading to increased network latency and memory usage. While this performance impact may be negligible for most organizations, others may hesitate to adopt PQC until they have guarantees that it will not slow down their systems. For example, Chrome postponed its PQC rollout on Android due to concerns about performance degradation.

In addition, the cryptographic operations in most of these new schemes are not yet as fast as those in established ECC schemes. Over the years, ECC schemes have benefited from extensive hardware optimizations, resulting in highly efficient implementations. Although initial hardware optimizations are emerging for PQC schemes, their performance still lags behind that of ECC.

Bogus Implementations

Even if PQC schemes themselves are considered secure, implementation errors can introduce vulnerabilities. While cryptography packages usually receive thorough code reviews before release, niche packages may allow mistakes to slip through unnoticed.

Hard Problems Made Easy

PQC schemes cannot be proven quantum-secure. They are considered secure only as long as the underlying hard problem remains difficult. Researchers continually attempt to simplify these problems, and if they fail over many years, the problem is effectively deemed “hard.” However, there is always a chance that a new method or breakthrough could make a hard problem easy, requiring another migration.

Lack of Trust

Some organizations may distrust NIST’s standards due to historical concerns about backdoors in cryptography. Such skepticism may compel these organizations to independently evaluate PQC schemes, which would take longer than simply following NIST’s recommendations.

Need for Widespread Adoption

Even if servers are prepared to adopt PQC schemes, the entire ecosystem, including clients and other verification systems, must be able to handle these new schemes. Without widespread compatibility, organizations cannot fully transition to post-quantum approaches.

8. The Urgency of Transition

Secrets That Don’t Expire

If the data transmitted by your organization has a short expiration date, you may not need to immediately transition to PQC. As long as you complete the transition before the arrival of a CRQC, the data you previously transmitted using classical cryptography will have already expired and lost its value.

However, if your organization’s data never fully loses its importance over time, there is a strong incentive to switch to PQC as soon as possible. The earlier you begin using PQC, the fewer of your secrets can later be decrypted while they are still relevant.

Catastrophic Risk

If the transition to PQC is not completed in time, trust in the internet itself could erode. Although we will not go into hypothetical scenarios, one can easily ponder the far-reaching impact if public confidence in online communication collapses.

At the business level, organizations handling sensitive information which doesn’t expire risk severe consequences if they delay adopting PQC. Compromised secrets could undermine credibility, reveal critical intellectual property, and inflict irreparable damage.

Given the potentially catastrophic risks for both society and businesses, we should prioritize the move to PQC as soon as possible.

The First Decryptions

Although a quantum computer capable of breaking RSA 2048 in under 24 hours may still be a decade away, smaller demonstrations along the way could cause public panic. Say that in five years' time researchers use a quantum computer to break a 1024-bit RSA key in a computation that takes a month to run. While this still would not be a practical threat to our systems, the public could easily extrapolate that we are running out of time. This could be particularly concerning for cryptocurrencies, which are heavily affected by public speculation.

NIST’s Guidelines

In the initial public draft of NIST’s Transition to PQC Standards, the agency recommends that all major digital signature and key-establishment schemes which rely on classical public key cryptography be disallowed after 2035. This might seem aggressive, given that experts assign a low probability to a CRQC appearing within 10 years. However, when considering the severe consequences of HNDL attacks, it is sensible to encourage organizations to begin the transition as soon as possible.

These recommendations align with a U.S. national security memorandum issued in 2022, which states: “the United States must prioritize the timely and equitable transition of cryptographic systems to quantum-resistant cryptography, with the goal of mitigating as much of the quantum risk as is feasible by 2035.” The 2035 target underscores just how seriously national security agencies regard the quantum threat to classical cryptography.

Conclusion

Quantum computers pose a critical threat to classical cryptography—particularly public key systems like RSA and elliptic curve cryptography. The internet heavily depends on these vulnerable systems for numerous essential services, including web traffic, VPNs, cryptocurrencies, and online messaging.

Although quantum computers have yet to implement the cryptography-breaking algorithms at a scale large enough to execute practical attacks, steady progress in both software and hardware suggests that a cryptographically relevant quantum computer could emerge within the next decade. Factors such as international competition, private investment, and advancements in AI may accelerate this timeline.

In response, organizations must proactively transition to post-quantum cryptography sooner rather than later. This process can be extensive and may require significant software and hardware upgrades. The urgency is particularly high for those transmitting long-lived secrets, which are vulnerable to harvest-now, decrypt-later attacks. By adopting PQC early, organizations can mitigate these risks and maintain public trust.