I haven't finished reading the paper, but maybe my biggest immediate takeaway is that significantly increasing the number of bits doesn't really help. Using the methods discussed in the paper, 4096 bit integers can be factored in less than 4x the amount of time needed to factor 2048 bit integers. In other words, should this method become practical, cryptographers couldn't solve the problem by using integers we would ordinarily consider much harder to factor.
It's been well known for a long time that doubling the key size doesn't help against quantum attacks on current public-key cryptography, because you end up requiring only (approximately) double the number of qubits and a polynomial increase in time. After all, Shor's algorithm solves the problem in polynomial time, so the worst case is that doubling n causes a fixed constant-factor increase in running time (it turns out to be about 2^2 = 4x in this case). What's interesting here is getting estimates of the difficulty with noisy qubits, rather than the usual estimates for running Shor's algorithm on perfect qubits.
However, symmetric cryptography like AES or ChaCha20 only gets a square-root speed-up from quantum attacks (assuming the best-case quantum attack is a Grover-style brute-force search), so doubling the key size actually does alleviate the quantum threat.
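To make the square-root point concrete, here's a toy sketch (my own illustration, not from the paper): Grover's search over a k-bit keyspace takes on the order of 2^(k/2) evaluations, so the effective security level is roughly halved, and doubling the key size restores it.

```python
# Toy illustration: Grover's search over a k-bit keyspace costs on the
# order of sqrt(2^k) = 2^(k/2) evaluations, so the effective security
# level in bits is roughly halved (assuming brute force is the best attack).

def effective_bits_against_grover(key_bits: int) -> int:
    """Approximate quantum security level, in bits, under Grover's speed-up."""
    return key_bits // 2

for cipher, bits in [("AES-128", 128), ("AES-256", 256), ("ChaCha20", 256)]:
    print(f"{cipher}: {bits}-bit key -> ~{effective_bits_against_grover(bits)}-bit quantum security")
```

So AES-256 still leaves you with roughly 128-bit security against a quantum brute-force attacker, which is why "just double the key" works for symmetric ciphers but not for RSA.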
Yes, that is why djb's post-quantum RSA proposal uses public key sizes on the order of one terabyte (i.e. 8 trillion bits) instead of the more common 4096 bits. I have the slight suspicion that the proposal is somewhat tongue-in-cheek, though.
Yeah, that submission was absolutely a joke. It didn't advance to round 2 of the NIST post-quantum standardization process. Bernstein and Lange intended it as a bit of sardonic humor poking fun at the review panel.
That proposal was meant to be taken as a joke. The other proposals by djb and Lange (or possibly CSIDH -- the closest proposal to a usable post-quantum Diffie-Hellman -- which didn't make the submission deadline) seem far more reasonable.
Yes, the time scales quadratically in the key size while the space scales linearly (times a bit of logarithmic overhead on both). I think of the space as the limiting factor, since an attack that takes months to run is still worrying to cryptographers.
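A quick back-of-the-envelope sketch of that scaling model (the functional form is the one described above: time ~ n^2 log n, space ~ n log n; the absolute constants are made up, so only the ratios between key sizes mean anything):

```python
import math

# Rough scaling model for the attack cost as a function of modulus size n:
# time ~ n^2 * log(n), qubits ~ n * log(n). Constants are arbitrary, so we
# only compute ratios relative to a 2048-bit baseline.

def relative_cost(n_bits: int, baseline: int = 2048):
    """Return (time_ratio, space_ratio) versus a baseline key size."""
    time_ratio = (n_bits**2 * math.log2(n_bits)) / (baseline**2 * math.log2(baseline))
    space_ratio = (n_bits * math.log2(n_bits)) / (baseline * math.log2(baseline))
    return time_ratio, space_ratio

t, s = relative_cost(4096)
print(f"4096-bit vs 2048-bit: ~{t:.1f}x time, ~{s:.1f}x qubits")
```

Under this model, doubling the key size costs you roughly 4x in time and a bit over 2x in qubits, which lines up with the "under 50M qubits vs 20M" figures mentioned elsewhere in the thread.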
Good question. It looks like the rate at which more qubits are needed is slightly less than that for the time requirements. (So still under 50M qubits total, vs 20M for 2048 bit integers.)
They also say later on in the paper that, given certain constraints, it should be possible to use multiple quantum computers with fewer qubits per computer - at the cost of a slight increase in the total number of qubits.
TL;DR: yes, just moving from 2048-bit to 4096-bit keys will not suffice. But if you make a composite key out of many 4096-bit pieces (such that the key file is now ~1 TB), then you can achieve a reasonable level of security. And yes, I do think the paper is (at least in parts) somewhat tongue-in-cheek!
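For scale, here's the arithmetic on that terabyte figure (this is just a sanity check on the numbers mentioned above, not the actual pqRSA construction):

```python
# How many 4096-bit pieces fit in a ~1 TB key file?
# 1 TB = 8 trillion bits, as noted earlier in the thread.
TB_IN_BITS = 8 * 10**12
PIECE_BITS = 4096

pieces = TB_IN_BITS // PIECE_BITS
print(f"A 1 TB key has room for about {pieces:,} pieces of 4096 bits each")
```

That's on the order of two billion pieces, which gives a sense of why a terabyte-scale key is so far outside what anyone would deploy in practice.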
RSA has been known to be breakable by Shor's algorithm since the '90s, as have Diffie-Hellman and elliptic-curve Diffie-Hellman. The only questions are how practical an attack will be, and how much time we have to switch to post-quantum algorithms. NIST has been running a competition for post-quantum cryptosystems since 2017 (roughly 10 years late to the party), and we'll see where we end up.