Looking back, we'll remember this week as the moment the Overton window for quantum risk irreversibly shifted: the week quantum computing stopped being a theoretical future consideration and became a tangible risk that must be mitigated.

This shift was precipitated by two landmark papers that dropped within hours of each other, from two completely different research teams using two completely different hardware architectures that both arrived at the same conclusion: the resources required to break the cryptography securing digital assets are orders of magnitude lower than previously believed.

Here's what happened, what it means, and why I've been arguing that the industry needs to move now.

The Google/Stanford Paper

The first bombshell came from the Google Quantum AI team, Justin Drake of the Ethereum Foundation, and Dan Boneh of Stanford University. Their paper presents a new circuit for running Shor's algorithm against the Elliptic Curve Discrete Logarithm Problem (ECDLP), the specific math problem underpinning Bitcoin's signature scheme. 

Notably, the actual circuit wasn’t disclosed. Instead, the authors published a zero-knowledge proof that the circuit meets the stated requirements and compiles correctly. That choice is itself a signal, and the paper says so explicitly: quantum computing has crossed the threshold from purely theoretical threat to something that demands tangible action today.

The headline number: 1,200 to 1,450 logical qubits and approximately 70 million operations. For context, a contemporary paper by Chevignard et al. published this same month describes a circuit using a similar number of logical qubits but requiring hundreds of billions of gates. Google's circuit is roughly 10,000x more efficient in gate count.
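A quick back-of-the-envelope check of that efficiency claim, taking "hundreds of billions" at roughly 700 billion gates (the exact counts vary with curve parameters and compilation choices, so these are the approximate figures quoted above, not precise values):

```python
# Rough comparison of the two resource estimates quoted above.
google_ops = 70e6        # ~70 million operations (Google/Stanford circuit)
chevignard_ops = 700e9   # "hundreds of billions" of gates (Chevignard et al.)

ratio = chevignard_ops / google_ops
print(f"Gate-count reduction: ~{ratio:,.0f}x")  # prints "Gate-count reduction: ~10,000x"
```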

Figure 1 - Google’s new circuit for ECDLP dramatically shrinks the efficient frontier

Why does that matter? Because qubit quality matters just as much as quantity. If your system only needs to hold together for 70 million cycles instead of hundreds of billions, the engineering requirements become dramatically more forgiving. The field has already demonstrated error rates close to what this circuit would need.

But the truly alarming finding is about speed. On a superconducting architecture with fast clock speeds (like Google's existing Willow machine), this circuit could execute in approximately 9 minutes. Bitcoin, by comparison, settles a block roughly every 10 minutes on average.

That one-minute margin means this isn't just a threat to dormant coins or Satoshi's stash. It's a threat to every live transaction in the mempool. Every UTXO. Every wallet. The entire supply of Bitcoin, now and forever.
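The arithmetic behind that threat model is simple enough to state directly (both figures are the estimates quoted above; real block intervals vary around the 10-minute average):

```python
# Why a ~9-minute circuit threatens live transactions: a transaction
# broadcast to the mempool typically sits unconfirmed for about one
# block interval, which is longer than the estimated attack runtime.
circuit_runtime_min = 9.0      # estimated ECDLP circuit runtime (fast clock)
avg_block_interval_min = 10.0  # Bitcoin's average block interval

margin = avg_block_interval_min - circuit_runtime_min
print(f"Average margin before confirmation: {margin:.0f} minute(s)")  # prints "... 1 minute(s)"
```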

Figure 2 - A “fast-clock” architecture running the ECDLP circuit could realistically attack the BTC mempool

Beyond Bitcoin, the Google paper exhaustively treated nearly every corner of the blockchain/cryptocurrency space. Smart contracts, stablecoins, zero-knowledge proofs, data availability systems for Layer-2s: all inventoried for elliptic curve cryptography dependencies that might be broken by Shor's algorithm.

The paper explicitly states that a fast-clock cryptographically relevant quantum computer (CRQC) could crack the 1,000 highest-net-worth Ethereum accounts (representing 20.5 million ETH) in less than nine days.
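That nine-day figure implies a striking per-key throughput, and it lines up with the ~9-minute circuit runtime above once you allow for overhead between runs (this is my own back-of-the-envelope, not a calculation from the paper):

```python
# Implied attack throughput for the Ethereum scenario quoted above:
# 1,000 accounts in under 9 days.
accounts = 1_000
days = 9

minutes_per_key = days * 24 * 60 / accounts
print(f"~{minutes_per_key:.1f} minutes per key on average")  # ≈ 13 minutes per key
```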

The unambiguous recommendation of the authors: "blockchain systems cannot ignore [quantum computing] and the migration to post-quantum security should begin without delay."

Read my full summary of the Google paper here.

The Oratomic Paper

Just minutes later, Oratomic published a parallel breakthrough using neutral-atom hardware. By applying a new class of high-rate error-correcting codes (quasi-cyclic lifted product codes), they showed that Shor's algorithm can be executed at cryptographically relevant scales using approximately 10,000 to 22,000 reconfigurable atomic qubits, with runtimes ranging from months down to days depending on qubit count.

Neutral atom platforms are particularly well-suited for these higher-rate codes because of their dynamic reconfigurability and all-to-all qubit connectivity. This paper, arguably the more scientifically significant, also explicitly described a plausible architecture to actually build this machine, referencing prior work that demonstrated many of the individual components independently.

Figure 3 - Oratomic’s proposed neutral-atom architecture and comparison to prior resource estimates

Like Google, the Oratomic team (consisting of several prominent physicists from the California Institute of Technology) has significant intellectual credibility and scientific authority. Dolev Bluvstein (a Project Eleven Advisor) is a pioneer in neutral atom computing, Manuel Endres developed the largest neutral atom array, and John Preskill has been enormously influential in the field of quantum information science/error correction.

Despite the authority of the authors and the undeniable advance this paper represents, challenges remain. Real-time decoding of the new codes is an acknowledged bottleneck, and lattice surgery on large code patches is computationally infeasible at current scales.

But the trajectory over the past year is unmistakable, and this paper represents the most plausible path to a near-term cryptographically relevant quantum computer, albeit one that is mainly suited to long-range attacks on dormant or lost assets with exposed public keys.

Read my full summary of the Oratomic Paper here.

Two Tech Trees, One Target

The publication of these works has been recognized as a groundbreaking moment for the field of quantum computing. But they do not imply that a cryptographically relevant quantum computer exists today.

What makes this week so significant isn't either paper in isolation. It's that two independent teams, using fundamentally different physics (superconducting circuits vs. neutral atoms), simultaneously demonstrated that the path to breaking real-world cryptography may be shorter than anyone projected.

This matters because it invalidates the assumption that quantum progress depends on a single engineering miracle. Superconducting, neutral-atom, photonic, and ion-trap architectures represent entirely different roadmaps, funding pipelines, and physics. Even though the challenges facing these individual approaches remain enormous, only one needs to succeed. And in the span of a single week, breakthroughs were made on two of them at once.

Small iterative improvements in physical fidelity, error correction, control architectures, and algorithm design are creating a feedback loop that compounds progress. Faster machines enable better error-correction research, which lowers the resource bar for the next generation, which accelerates timelines with nonlinear speed. This is the "nothing, then everything" dynamic that often defines technological progress.

Why This Changes the Calculus

For years, the industry's position on quantum has been some version of "we'll deal with it when it's real." Even serious observers believed the first threat was at least a decade out, and would materialize initially as long-range attacks on dormant assets.

Both of those assumptions are now untenable.

The Google paper shows that on-spend attacks (intercepting live transactions) may be possible with a fast-clock quantum computer. That changes the threat model completely. It's no longer about whether Satoshi's coins are safe. It's about whether Bitcoin can function as a transaction network at all in a post-quantum world.

And the migration challenge hasn't gotten any easier. Upgrading a decentralized network requires:

  • Community consensus on new signature schemes (a politically fraught process)
  • A controversial fork to implement post-quantum cryptography and potentially freeze lost assets
  • Individual migration of millions of distributed keys with no centralized authority
  • Months of dedicated block space just to process existing asset migrations

Post-quantum signatures are also significantly larger than current ECDSA signatures, increasing bandwidth, storage, and compute requirements across the network. This isn't a patch. It's a fundamental re-architecture of the system's cryptographic foundation.
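To put rough numbers on "significantly larger": the following sketch compares approximate key and signature sizes, using the standardized NIST parameter sets (FIPS 204/205) against secp256k1 ECDSA with compressed keys. Sizes are the published spec values; the comparison itself is mine, not from either paper.

```python
# Approximate public-key and signature sizes in bytes, illustrating why
# post-quantum migration increases bandwidth and storage requirements.
schemes = {
    "ECDSA (secp256k1)":    {"pubkey": 33,   "sig": 64},     # compact sig
    "ML-DSA-44 (FIPS 204)": {"pubkey": 1312, "sig": 2420},
    "SLH-DSA-128s (FIPS 205)": {"pubkey": 32, "sig": 7856},
}

baseline_sig = schemes["ECDSA (secp256k1)"]["sig"]
for name, sz in schemes.items():
    blowup = sz["sig"] / baseline_sig
    print(f"{name:26s} pk={sz['pubkey']:>5d}B  sig={sz['sig']:>5d}B  (~{blowup:.0f}x ECDSA sig)")
```

Even the most compact lattice-based option carries signatures dozens of times larger than ECDSA, which is why block space and node storage are first-order concerns in any migration plan.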

If we wait for Q-Day to begin this process, it will already be too late.

The Bottom Line

The evidence from leading quantum physicists and cryptography experts at the world's most reputable institutions suggests quantum computing is advancing and may become cryptographically relevant sooner than the consensus timeline suggests. Both Google and Oratomic have now published work that dramatically compresses the resource requirements for breaking the exact cryptography that secures trillions in digital assets.

There is no proof that a cryptographically relevant quantum computer will arrive next year, or even in five years. But there is also no proof that it won't. And the asymmetry of risk here is stark: preparing too early has little downside (and may even remove a cloud of uncertainty over digital assets), while preparing too late could be existential.

The prudent course is the same one the rest of the world is already taking: acknowledge quantum as a real and accelerating risk, and begin the migration to post-quantum cryptography now, while there's still runway to do it deliberately.

This week made the case for urgency harder to ignore. The question is whether we can convert that urgency into tangible action.

I believe we can. From the migration to Proof-of-Stake on Ethereum, to the evolution of Bitcoin in the face of various threats throughout its history, blockchain communities have overcome the natural friction of decentralized consensus to help evolve and improve their respective protocols. There's no reason why it can't happen again, and why digital assets can't be the "good news story" leading the way on post-quantum migration.

What's Next?

If you are building or operating critical infrastructure in this ecosystem, whether as a protocol, L1, L2, custodian, exchange, or wallet, the right next step is to start. That doesn’t mean committing to a full migration tomorrow. It means understanding your exposure, evaluating realistic upgrade paths, and putting a plan in place before timelines compress further. This is a technically and operationally complex transition, and the teams that move early will have a meaningful advantage. If you’re thinking about where to begin, reach out to us. We’re already working through these challenges with partners across the stack and can help you navigate what comes next.

And if you’re an individual holding crypto, the most important thing you can do right now is stay informed. The pace of progress is accelerating, and the gap between perception and reality is where the real risk lives. We’ll continue breaking down new developments and what they actually mean in plain terms. Check if your Bitcoin holdings have exposed public keys using the Bitcoin Risq List (cited in Google’s paper), subscribe to the P11 Bulletin, and follow us on X to stay on top of what matters.
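As a rough intuition for what "exposed public key" means, address formats differ in whether the key itself ever appears on-chain before a spend. This is a simplified illustration only; a real analysis (like the Risq List's) also accounts for address reuse and legacy pay-to-pubkey outputs, and the function below is a hypothetical helper, not a real tool:

```python
# Rough mapping from Bitcoin address prefix to public-key exposure.
# Illustrative: hash-based script types (P2PKH/P2SH/SegWit v0) reveal
# the key on first spend, so any reused address is also exposed, and
# Satoshi-era P2PK outputs embed the raw public key directly.
def pubkey_exposure(address: str) -> str:
    if address.startswith("bc1p"):
        return "exposed"  # Taproot (P2TR): the output carries a tweaked public key
    if address.startswith(("1", "3", "bc1q")):
        return "hidden until spend"  # only a hash of the key is published
    return "unknown"

print(pubkey_exposure("bc1p..."))    # → exposed
print(pubkey_exposure("1A1zP1..."))  # → hidden until spend
```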


Future-proofing. Future building. 

- Project Eleven