> "but then WHAT is a good measure for QC progress?" [...] you should disregard quantum factorization records.
> The thing is: For cryptanalytic quantum algorithms (Shor, Grover, etc) you need logical/noiseless qubits, because otherwise your computation is constrained [...] With these constraints, you can only factorize numbers like 15, even if your QC becomes 1000x "better" under every other objective metric. So, we are in a situation where even if QC gets steadily better over time, you won't see any of these improvements if you only look at the "factorization record" metric: nothing will happen, until you hit a cliff (e.g., logical qubits become available) and then suddenly scaling up factorization power becomes easier. It's a typical example of non-linear progress in technology (a bit like what happened with LLMs in the last few years) and the risk is that everyone will be caught by surprise. Unfortunately, this paradigm is very different from the traditional, "old-style" cryptanalysis handbook, where people used to size keys according to how fast CPU power had been progressing in the last X years. It's a rooted mindset which is very difficult to change, especially among older-generation cryptography/cybersecurity experts. A better measure of progress (valid for cryptanalysis, which is, anyway, a very minor aspect of why QC are interesting IMHO) would be: how far are we from fully error-corrected and interconnected qubits? [...] in the last 10 or more years, all objective indicators in progress that point to that cliff have been steadily improving
I agree with the statement that measuring the performance of factorisation now is not a good metric to assess progress in QC at the moment. However, the idea that once logical qubits become available, we reach a cliff, is simply wishful thinking.
Have you ever wondered what will happen to those coaxial cables seen in every quantum computer setup, which scale approximately linearly with the number of physical qubits? Multiplexing is not really an option when the qubit waiting for its control signal decoheres in the meantime.
Oh, I didn't mean to imply that the "cliff" is for certain. What I'm saying is that articles like Gutmann's fail to acknowledge this possibility.
Regarding the coaxial cables, you seem to be an expert, so tell me if I'm wrong, but it seems to me a limitation of current designs (and in particular of superconducting qubits), I don't think there is any fundamental reason why this could not be replaced by a different tech in the future. Plus, the scaling must not need to be infinite, right? Even with current "coaxial cable tech", it "only" needs to scale up to the point of reaching one logical qubit.
> I don't think there is any fundamental reason why this could not be replaced by a different tech in the future.
The QC is designed with coaxial cables running from the physical qubits outside the cryostat because the pulse measurement apparatus is most precise in large, bulky boxes. When you miniaturise it for placement next to qubits, you lose precision, which increases the error rate.
I am not even sure whether logical components work at such low temperatures, since everything becomes superconducting.
> Even with current "coaxial cable tech", it "only" needs to scale up to the point of reaching one logical qubit.
Having a logical qubit sitting in a big box is insufficient. One needs multiple logical qubits that can be interacted with and put in a superposition, for example. A chain of gates represents each logical qubit gate between each pair of physical qubits, but that's not possible to do directly at once; hence, one needs to effectively solve the 15th puzzle with the fewest steps so that the qubits don't decohere in the meantime.
> I am not even sure whether logical components work at such low temperatures, since everything becomes superconducting.
Currently finishing a course where the final project is designing a semiconductor (quantum dot) based quantum computer. Obviously not mature tech yet, but we've been stressed during the course that you can build most of the control and readout circuits to work at cryogenic temps (2-4K) using slvtfets. The theoretical limit for this quantum computing platform is, I believe, on the order of a million qubits in a single cryostat.
> you can build most of the control and readout circuits to work at cryogenic temps (2-4K) using slvtfets
Given the magic that happens inside high-precision control and readout boxes connected to qubits with coaxial cables, I would not equate the possibility of building one with such a control circuit ever reaching the same level of precision. I find it strange that I haven’t seen that on the agenda for QC, where instead I see that multiplexing is being used.
> The theoretical limit for this quantum computing platform is, I believe, on the order of a million qubits in a single cryostat.
Shor's algorithm has been known for a while now (apparently since 1994) and is the main reason quantum-resistant cryptography became an important research subject. The article explains it nicely (for someone like me who doesn't know nearly enough physics or maths to fully understand the technical parts), but this bit at the end ruins it a bit:
> Rotate everything that lasts >10 years to pure PQC now
The author suggests switching to Post-Quantum Cryptography which uses relatively new ciphers that haven't been as battle-tested as older ones like RSA and ECC. Back when those were introduced, there weren't any stronger ciphers at the time, so if they were broken, at least people knew they did the best they could to protect their data.
Now, however, we have standardized encryption with (to the general public's knowledge at least) uncrackable algorithms (provided sane key lengths are chosen), so doing anything that could weaken our encryption makes us worse than the baseline. This proposal is theoretically stronger, but it is unknown whether it will stand the test of time, even with today's technology, due to it being relatively new and not widely deployed.
The standard practice of rolling out PQC is using it as an additional layer alongside current encryption standards. This adds redundancy, so that if one is broken the data will stay safe. Using only PQC or only RSA/ECC/whatever makes the system have a single point of failure.
First of all, thanks for the thoughtful comment and link.
You're right that rotating every crypto algo to PQC right away might be a bit too aggressive. The actual best practice (like you said) is hybrid: layer ML-KEM/ML-DSA on top of RSA/ECC for redundancy.
Classical algos aren't dead yet, but Shor's clock is ticking, and for now those NIST-standardized (FIPS203 for ML-KEM, FIPS204 for ML-DSA) PQC algos didn't break for now.
That's why Cloudflare for example uses ML-KEM alongside X25519 for their TLS key exchange (https://cyberpress.org/cloudflare-enhances-security/).
And yeah.. presenting a single algo as the perfect solution. That gives Dual_EC vibes, perfect spot for a backdoor.
It'd be essentially impossible to add a NOBUS backdoor into ML-KEM, there's nowhere to hide a key. The reason not to go all-in on it is simply that it might be broken.
Even 21 was only possible by cheating (optimizing away the difficult part using prior knowledge of the results) [1]. Craig Gidney has a blog post that shows the actual quantum circuit for factoring 21 which is far beyond the capabilities of current quantum computers [2].
Overview of QC factoring records, applied sleight-of-hand tricks, and their replication using a VIC-20 8-bit home computer from 1981, an abacus, and a dog:
*ends as soon as practical quantum computers, something which might never happen, exist.
The author mentions:
> RSA-2048: ~4096 logical qubits, 20-30 million physical qubits
> 256-bit ECC: ~2330 logical qubits, 12-15 million physical qubits
For reference, we are at ~100 physical qubits right now. There is a bit of nuance in the logical to physical correlation though.
Scepticism aside, the author does mention that it might be a while in the future, and it is probably smart to start switching to quantum resistant cryptography for long-running, critical systems, but I'm not a huge fan of the fear-mongering tone.
And no clear quantum Moore law emerging for the yearly increase in qbits (https://arxiv.org/abs/2303.15547)... The quantum panic pushes people to deploy immature solutions, and the remedy sure sometimes looks worse than the illness...
About those sizes: they increase with the size of the key, right? So I would think the author's claim that RSA-8192 is just as vulnerable as RSA-4096 isn't quite as straight-forward. It would require considerably more qbits.
Yeah, I was taking the author's numbers there, and there is a lot of nuance to the logical vs physical qubits relationship. Not super up to date on the latest work there, you got any links?
"How to factor 2048 bit RSA integers with less than a million noisy qubits" (https://arxiv.org/abs/2505.15917) is the most up to date paper here, and uses ~1400 logical and ~900k physical
The fear-mongering tone is likely due to the fact that this was posted (though probably not written) by a company promoting quantum-safe cloud storage.
Anyway, here is what Scott Aaronson recently said about quantum computing progress:
> Indeed, given the current staggering rate of hardware progress, I now think it’s a live possibility that we’ll have a fault-tolerant quantum computer running Shor’s algorithm before the next US presidential election. And I say that not only because of the possibility of the next US presidential election getting cancelled, or preempted by runaway superintelligence! (...)
> To clarify — if, before the 2028 presidential election, a fully fault-tolerant Shor’s algorithm was used even just to factor 15 into 3×5, I would view the “live possibility” here as having come to pass.
> The point is, from that point forward, it seems like mostly a predictable matter of adding more fault-tolerant qubits and scaling up, and I find it hard to understand what the showstopper would be.
I was actually reading his blog again last night (after chatting with a friend about QQ), and he has a follow up post, titled: "Quantum Investment Bros: Have you no shame?"
Relevant quote:
> It’s like this: if you think quantum computers able to break 2048-bit cryptography within 3-5 years are a near-certainty, then I’d say your confidence is unwarranted. If you think such quantum computers, once built, will also quickly revolutionize optimization and machine learning and finance and countless other domains beyond quantum simulation and cryptanalysis—then I’d say that more likely than not, an unscrupulous person has lied to you about our current understanding of quantum algorithms.
And:
> In any case, the main reason I made my remark was just to tee up the wisecrack about whether I’m not sure if there’ll be a 2028 US presidential election.
So I would be careful posting those quotes without context, it makes Scott angry.
It is not the only reason. The best reason imo is to do quantum chemistry calculations, which are inordinately difficult to do accurately (O(n!) if you need to take into account all correlations)
Given some of the comments in this thread, I would like to link this here:
https://gagliardoni.net/#20250714_ludd_grandpas
An excerpt:
> "but then WHAT is a good measure for QC progress?" [...] you should disregard quantum factorization records.
> The thing is: For cryptanalytic quantum algorithms (Shor, Grover, etc) you need logical/noiseless qubits, because otherwise your computation is constrained [...] With these constraints, you can only factorize numbers like 15, even if your QC becomes 1000x "better" under every other objective metric. So, we are in a situation where even if QC gets steadily better over time, you won't see any of these improvements if you only look at the "factorization record" metric: nothing will happen, until you hit a cliff (e.g., logical qubits become available) and then suddenly scaling up factorization power becomes easier. It's a typical example of non-linear progress in technology (a bit like what happened with LLMs in the last few years) and the risk is that everyone will be caught by surprise. Unfortunately, this paradigm is very different from the traditional, "old-style" cryptanalysis handbook, where people used to size keys according to how fast CPU power had been progressing in the last X years. It's a rooted mindset which is very difficult to change, especially among older-generation cryptography/cybersecurity experts. A better measure of progress (valid for cryptanalysis, which is, anyway, a very minor aspect of why QC are interesting IMHO) would be: how far are we from fully error-corrected and interconnected qubits? [...] in the last 10 or more years, all objective indicators in progress that point to that cliff have been steadily improving
I agree that measuring factorisation performance is not a good metric to assess progress in QC at the moment. However, the idea that once logical qubits become available we reach a cliff is simply wishful thinking.
Have you ever wondered what will happen to those coaxial cables seen in every quantum computer setup, which scale approximately linearly with the number of physical qubits? Multiplexing is not really an option when the qubit waiting for its control signal decoheres in the meantime.
Oh, I didn't mean to imply that the "cliff" is for certain. What I'm saying is that articles like Gutmann's fail to acknowledge this possibility.
Regarding the coaxial cables: you seem to be an expert, so tell me if I'm wrong, but it seems to me a limitation of current designs (and in particular of superconducting qubits); I don't think there is any fundamental reason why this could not be replaced by a different tech in the future. Plus, the scaling need not be infinite, right? Even with current "coaxial cable tech", it "only" needs to scale up to the point of reaching one logical qubit.
> I don't think there is any fundamental reason why this could not be replaced by a different tech in the future.
The QC is designed with coaxial cables running from the physical qubits to outside the cryostat because the pulse measurement apparatus is most precise in large, bulky boxes. When you miniaturise it for placement next to the qubits, you lose precision, which increases the error rate.
I am not even sure whether classical logic components work at such low temperatures, since everything becomes superconducting.
> Even with current "coaxial cable tech", it "only" needs to scale up to the point of reaching one logical qubit.
Having a logical qubit sitting in a big box is insufficient. One needs multiple logical qubits that can interact and be put into superposition, for example. Each logical-qubit gate is represented by a chain of gates between pairs of physical qubits, but that chain can't be applied all at once; in effect one has to solve a 15-puzzle in the fewest moves so that the qubits don't decohere in the meantime.
> I am not even sure whether classical logic components work at such low temperatures, since everything becomes superconducting.
Currently finishing a course where the final project is designing a semiconductor (quantum dot) based quantum computer. Obviously not mature tech yet, but it was stressed throughout the course that you can build most of the control and readout circuits to work at cryogenic temps (2-4 K) using SLVT FETs. The theoretical limit for this quantum computing platform is, I believe, on the order of a million qubits in a single cryostat.
> you can build most of the control and readout circuits to work at cryogenic temps (2-4 K) using SLVT FETs
Given the magic that happens inside high-precision control and readout boxes connected to qubits with coaxial cables, I would not equate the possibility of building such a control circuit with it ever reaching the same level of precision. I also find it strange that I haven't seen this on the agenda for QC; instead, I see multiplexing being used.
> The theoretical limit for this quantum computing platform is, I believe, on the order of a million qubits in a single cryostat.
What are the constraints here?
Shor's algorithm has been known for a while now (apparently since 1994) and is the main reason quantum-resistant cryptography became an important research subject. The article explains it nicely (for someone like me who doesn't know nearly enough physics or maths to fully understand the technical parts), but this part at the end ruins it a bit:
> Rotate everything that lasts >10 years to pure PQC now
The author suggests switching to Post-Quantum Cryptography, which uses relatively new algorithms that haven't been as battle-tested as older ones like RSA and ECC. When those older ones were introduced, there weren't any stronger alternatives, so if they were broken, at least people knew they had done the best they could to protect their data.
Now, however, we have standardized encryption with algorithms that are (to the general public's knowledge, at least) uncrackable provided sane key lengths are chosen, so doing anything that could weaken our encryption makes us worse off than the baseline. The PQC proposal is theoretically stronger, but it is unknown whether it will stand the test of time even against today's technology, because it is relatively new and not widely deployed.
The standard practice of rolling out PQC is using it as an additional layer alongside current encryption standards. This adds redundancy, so that if one is broken the data will stay safe. Using only PQC or only RSA/ECC/whatever makes the system have a single point of failure.
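The hybrid layering described above can be sketched as a key combiner: both shared secrets feed into one KDF, so an attacker has to break both exchanges to recover the session key. A minimal illustration (real protocols like TLS use HKDF with much more structure; the function name and inputs here are simplified stand-ins):

```python
import hashlib

def hybrid_session_key(classical_secret: bytes, pqc_secret: bytes,
                       transcript: bytes) -> bytes:
    """Derive one session key from BOTH shared secrets, so compromising
    the session requires breaking the classical AND the PQC exchange."""
    return hashlib.sha256(classical_secret + pqc_secret + transcript).digest()

# Hypothetical secrets standing in for X25519 and ML-KEM outputs:
key = hybrid_session_key(b"\x01" * 32, b"\x02" * 32, b"handshake-transcript")
print(len(key))  # 32-byte session key
```

If either input secret stays unpredictable to the attacker, the combined key does too, which is the redundancy argument in a nutshell.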
FYI, this is exactly what governments want (I'll let you guess why). This related post was on the front page just a few days ago: https://news.ycombinator.com/item?id=46033151
First of all, thanks for the thoughtful comment and link.
You're right that rotating every crypto algo to PQC right away might be a bit too aggressive. The actual best practice (like you said) is hybrid: layer ML-KEM/ML-DSA on top of RSA/ECC for redundancy. Classical algos aren't dead yet, but Shor's clock is ticking, and so far the NIST-standardized PQC algorithms (FIPS 203 for ML-KEM, FIPS 204 for ML-DSA) haven't been broken. That's why Cloudflare, for example, uses ML-KEM alongside X25519 for their TLS key exchange (https://cyberpress.org/cloudflare-enhances-security/).
And yeah... presenting a single algo as the perfect solution gives Dual_EC vibes: a perfect spot for a backdoor.
It'd be essentially impossible to add a NOBUS backdoor into ML-KEM, there's nowhere to hide a key. The reason not to go all-in on it is simply that it might be broken.
The largest number factored by Shor's algorithm is 21.
https://en.wikipedia.org/wiki/Integer_factorization_records
Even 21 was only possible by cheating (optimizing away the difficult part using prior knowledge of the results) [1]. Craig Gidney has a blog post that shows the actual quantum circuit for factoring 21 which is far beyond the capabilities of current quantum computers [2].
[1] https://www.nature.com/articles/nature12290
[2] https://algassert.com/post/2500
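For context on what these records actually involve: the only quantum part of Shor's algorithm is finding the multiplicative order r of a random base a modulo N; everything else is classical number theory. A sketch that does the order finding by brute force (exactly the step that is exponentially hard without a quantum computer, which is why "cheating" demos pre-bake it):

```python
from math import gcd

def shor_classical_part(N, a):
    """Find the order r of a mod N by brute force (the step a quantum
    computer would do via period finding), then derive factors of N."""
    # Order finding: smallest r > 0 with a^r = 1 (mod N).
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2:
        return None           # odd order: retry with a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None           # trivial square root: retry with a different a
    p, q = gcd(y - 1, N), gcd(y + 1, N)
    return (p, q) if p * q == N else None

print(shor_classical_part(15, 7))  # (3, 5)
```

Running this for N=15 or N=21 takes microseconds classically, which is the point of the "abacus and dog" paper: small factorisations demonstrate nothing about the hard part.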
Overview of QC factoring records, applied sleight-of-hand tricks, and their replication using a VIC-20 8-bit home computer from 1981, an abacus, and a dog:
https://eprint.iacr.org/2025/1237.pdf
And it was done in 2012. I admit I’m surprised there hasn’t been more progress since.
Replication of Quantum Factorisation Records with an 8-bit Home Computer, an Abacus, and a Dog
https://eprint.iacr.org/2025/1237.pdf
*ends as soon as practical quantum computers, something which might never happen, exist.
The author mentions:
> RSA-2048: ~4096 logical qubits, 20-30 million physical qubits
> 256-bit ECC: ~2330 logical qubits, 12-15 million physical qubits
For reference, we are at ~100 physical qubits right now. There is a bit of nuance in the logical-to-physical qubit relationship, though.
Scepticism aside, the author does mention that it might be a long way off, and it is probably smart to start switching to quantum-resistant cryptography for long-running, critical systems, but I'm not a huge fan of the fear-mongering tone.
And no clear quantum Moore's law is emerging for the yearly increase in qubits (https://arxiv.org/abs/2303.15547)... The quantum panic pushes people to deploy immature solutions, and the remedy sometimes looks worse than the illness...
Great manuscript to share.
I will highlight to others that while the qubit count is not increasing exponentially, other metrics are.
About those sizes: they increase with the size of the key, right? So the author's claim that RSA-8192 is just as vulnerable as RSA-4096 isn't quite as straightforward; it would require considerably more qubits.
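Right: one commonly cited circuit (Beauregard's construction) runs Shor on an n-bit modulus with about 2n+3 logical qubits, which matches the "~4096 logical qubits for RSA-2048" figure quoted above and roughly doubles each time the key size doubles. A quick sketch (logical qubits only; physical counts depend heavily on the error-correction scheme):

```python
# Rough logical-qubit count for Shor on an n-bit RSA modulus, using the
# commonly cited 2n+3 construction; real estimates vary by circuit.
def shor_logical_qubits(n_bits):
    return 2 * n_bits + 3

for n in (2048, 4096, 8192):
    print(n, shor_logical_qubits(n))  # 2048 -> 4099, 4096 -> 8195, 8192 -> 16387
```

So larger keys do buy time, though linearly in qubit count rather than exponentially.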
You mean it will come right when AGI comes?
Fusion powered AGI!
Fusion powered Quantum AGI! (on the blockchain?) ;-)
In your flying car, no less.
That drives/flies itself
No, no, at that point they'll be busy figuring out an actually quantum-proof blockchain (but powered by AGI)
SOTA for RSA-2048 is <1 million physical qubits
Yeah, I was taking the author's numbers there, and there is a lot of nuance to the logical-vs-physical qubit relationship. Not super up to date on the latest work there; got any links?
"How to factor 2048 bit RSA integers with less than a million noisy qubits" (https://arxiv.org/abs/2505.15917) is the most up to date paper here, and uses ~1400 logical and ~900k physical
Thanks!
The fear-mongering tone is likely due to the fact that this was posted (though probably not written) by a company promoting quantum-safe cloud storage.
Anyway, here is what Scott Aaronson recently said about quantum computing progress:
> Indeed, given the current staggering rate of hardware progress, I now think it’s a live possibility that we’ll have a fault-tolerant quantum computer running Shor’s algorithm before the next US presidential election. And I say that not only because of the possibility of the next US presidential election getting cancelled, or preempted by runaway superintelligence! (...)
> To clarify — if, before the 2028 presidential election, a fully fault-tolerant Shor’s algorithm was used even just to factor 15 into 3×5, I would view the “live possibility” here as having come to pass.
> The point is, from that point forward, it seems like mostly a predictable matter of adding more fault-tolerant qubits and scaling up, and I find it hard to understand what the showstopper would be.
https://scottaaronson.blog/?p=9325
I was actually reading his blog again last night (after chatting with a friend about QC), and he has a follow-up post titled "Quantum Investment Bros: Have you no shame?"
Relevant quote:
> It’s like this: if you think quantum computers able to break 2048-bit cryptography within 3-5 years are a near-certainty, then I’d say your confidence is unwarranted. If you think such quantum computers, once built, will also quickly revolutionize optimization and machine learning and finance and countless other domains beyond quantum simulation and cryptanalysis—then I’d say that more likely than not, an unscrupulous person has lied to you about our current understanding of quantum algorithms.
And:
> In any case, the main reason I made my remark was just to tee up the wisecrack about whether I’m not sure if there’ll be a 2028 US presidential election.
So I would be careful posting those quotes without context, it makes Scott angry.
I don't see what you think is so misleading about those quotes that you think Scott Aaronson would be angry to see them posted.
...and those ~100 qubits are quite useless too, with insane error rates and extremely short decoherence times.
If this is the only reason to build quantum computers then it sounds pretty useless to me. https://xkcd.com/538/
It is not the only reason. The best reason, imo, is quantum chemistry calculations, which are inordinately difficult to do accurately (O(n!) if you need to take all correlations into account).
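To make that scaling concrete: in full configuration interaction (the exact method), the wavefunction is expanded over every Slater determinant of the electrons in the available orbitals, and that count blows up combinatorially. The determinant-count formula below is the standard one; the specific orbital/electron numbers are just illustrative:

```python
from math import comb

# Size of the full-CI space: the number of Slater determinants for
# n_alpha/n_beta electrons distributed over M spatial orbitals.
def fci_determinants(M, n_alpha, n_beta):
    return comb(M, n_alpha) * comb(M, n_beta)

# Half-filled systems of growing size: the space explodes combinatorially.
for M in (10, 20, 30):
    print(M, fci_determinants(M, M // 2, M // 2))
```

Already at 30 orbitals the space has ~10^16 determinants, which is why exact classical treatment is hopeless beyond small molecules and why quantum simulation is the headline application.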
That article is likely LLM generated. It has the typical signs and a Grok-like pseudo casual tone.