I consider the recent work by Mosca et al. to be the most up-to-date resource estimation: https://www.sciencedirect.com/science/article/pii/S0167739X24004308

The estimate they provide is that breaking ECDSA (P-256) requires approximately an order of magnitude less work than breaking RSA-2048. Ironically, the longer bit-lengths of RSA seem to actually contribute to post-quantum security, even though the motivation for moving off RSA-1024 was to protect against NFS and other classical attacks on shorter RSA instances.
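The direction of that gap can be illustrated with a quick back-of-envelope in Python, using published logical-qubit formulas (Roetteler et al. 2017 for the elliptic-curve discrete log, Gidney and Ekerå 2019 for RSA factoring). These are my numbers, not Mosca et al.'s, and they ignore all physical and error-correction overhead:

```python
import math

# Illustrative logical-qubit estimates only; ignores physical and
# error-correction overhead. Formulas are from the literature cited
# in the lead-in, not from Mosca et al.

def ecdlp_logical_qubits(n: int) -> int:
    # Roetteler et al. (2017): 9n + 2*ceil(log2(n)) + 10 logical qubits
    # for Shor's algorithm on an n-bit elliptic-curve discrete log.
    return 9 * n + 2 * math.ceil(math.log2(n)) + 10

def rsa_logical_qubits(n: int) -> int:
    # Gidney & Ekera (2019): roughly 3n logical qubits to factor an
    # n-bit RSA modulus.
    return 3 * n

print(ecdlp_logical_qubits(256))   # P-256: 2330
print(rsa_logical_qubits(2048))    # RSA-2048: 6144
```

On logical-qubit counts alone, P-256 is the cheaper target, consistent with the paper's conclusion (though the full work estimate also depends on gate counts and runtime, which this sketch omits).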

Note that the resource estimation in the paper doesn't account for Gidney's speedup, which was a 20x reduction in the qubits required. It's unclear whether that same improvement factor could be applied here; as the Gidney paper showed, the earliest CRQCs will probably be hardwired for specific circuits for performance reasons, e.g. Gidney's circuit layout works for RSA-2048 and nothing else. But the ideas he presents around error correction (e.g. the yoked surface code) might apply more broadly; it's hard to say.
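One way to see why error-correction improvements matter so much: under the standard surface-code rule of thumb, physical-qubit count scales with the square of the code distance, so any code that hits a target logical error rate at a smaller distance shrinks the whole machine. A sketch (the physical error rate, threshold, and target below are my assumed illustrative values, not figures from any of the papers):

```python
# Rule-of-thumb surface-code model: p_logical ~ 0.1 * (p/p_th)^((d+1)/2),
# with ~2*d^2 physical qubits per logical qubit. Error rates, threshold,
# and target are assumed for illustration only.

def min_distance(p_phys: float, p_target: float, p_th: float = 1e-2) -> int:
    d = 3  # surface-code distances are odd
    while 0.1 * (p_phys / p_th) ** ((d + 1) / 2) > p_target:
        d += 2
    return d

def physical_qubits(n_logical: int, d: int) -> int:
    # data patch plus routing/ancilla overhead: roughly 2*d^2 per logical qubit
    return n_logical * 2 * d * d

d = min_distance(1e-3, 1e-12)
print(d, physical_qubits(6000, d))  # ~6000 logical qubits, RSA-2048-scale
```

Halving the required distance cuts the physical footprint by roughly 4x, which is why better codes (yoked or otherwise) could move the needle independently of any one circuit layout.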

Also note that many of his assumptions are based on a superconducting architecture, which generally offers faster runtimes but lower stability (so scaling is harder).

Other architectures, like this one from the neutral atom community (https://arxiv.org/pdf/2506.20660), have slower runtimes but greater stability. But even at scale, such a machine would probably only work as a CRQC for targeted, long-running attacks against specific public keys.

Lots of variables to consider here in terms of estimating the timeline for a CRQC, but the proactive approach is probably the right one, because (to quote Gidney in his conclusion) we should "prefer security to not be contingent on progress being slow."

On Tuesday, August 12, 2025 at 3:04:32 AM UTC-6 ArmchairCryptologist wrote:

An astute observation. To clarify the quantum computing landscape: Google's current quantum processors do not possess 50 logical qubits, and even if they did, this would be insufficient to compromise ECDSA - let alone RSA-2048, which would require approximately 20 million noisy physical qubits for successful cryptanalysis [0].

That paper is pretty old. There is a recent paper from a couple of months ago by the same author (Craig Gidney from Google Quantum AI) claiming that you could break RSA-2048 with around a million noisy qubits in about a week. 


I can't say for sure whether this approach can be applied to ECDSA; I have seen claims before that it has less quantum resistance than RSA-2048, but I'm unsure whether that is still considered to be the case. And while these papers are of course largely theoretical in nature, since nothing close to the required number of qubits exists at this point, I haven't seen anyone refute the claims. There is still no hard evidence I'm aware of that a quantum computer capable of breaking ECDSA is inevitable, but given the rate of development, there could be some cause for concern.

Getting post-quantum addresses designed, implemented, and activated by 2030 in accordance with the recommendations in this paper seems prudent to me, if this is at all possible. Deactivating inactive pre-quantum UTXOs with exposed public keys by 2035 should certainly be considered. But I still don't feel that deactivating pre-quantum UTXOs without exposed public keys in general is warranted, at least until a quantum computer capable of breaking a public key in the short window between when it is broadcast and when it is included in a block is known to exist - and even then, only if some scheme could be devised that still allows spending them using some additional cryptographic proof of ownership, ZKP or otherwise.

--
Best,
ArmchairCryptologist

--
You received this message because you are subscribed to the Google Groups "Bitcoin Development Mailing List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to bitcoindev+unsubscribe@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/bitcoindev/80005f10-e9af-4b4f-a05f-de2bd666d8ccn%40googlegroups.com.