Did NIST Just Leave a Security Hole?
Don’t bother lengthening your RSA keys, says NIST. Just move straight to quantum-safe algorithms. But does that leave a security gap?
The new rules come from NIST SP 800-131A Rev. 3, which is out for public comment. There’s too much in this document to cover in one post. But I want to zoom in on the rules around key strengths.
In previous documents, NIST set a 2030 deadline for transitioning from 112-bit security to 128-bit security. This meant moving to 3072-bit RSA or 256-bit ECDSA, or stronger.
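For reference, here's a minimal sketch of the comparable-strength key sizes from NIST SP 800-57 Part 1, which is handy if you want to audit which of your keys already clear the 128-bit bar. The table values are the standard NIST estimates; the helper function is purely illustrative.

```python
# Approximate comparable-strength key sizes from NIST SP 800-57 Part 1 (illustrative).
STRENGTH_TO_KEY_SIZE = {
    112: {"RSA": 2048, "ECDSA": 224},
    128: {"RSA": 3072, "ECDSA": 256},
    192: {"RSA": 7680, "ECDSA": 384},
    256: {"RSA": 15360, "ECDSA": 512},
}

def meets_strength(algorithm: str, key_bits: int, target: int = 128) -> bool:
    """True if a key of key_bits for algorithm provides at least target-bit security."""
    return key_bits >= STRENGTH_TO_KEY_SIZE[target][algorithm]

print(meets_strength("RSA", 2048))   # False -- 2048-bit RSA sits at ~112-bit security
print(meets_strength("RSA", 3072))   # True
print(meets_strength("ECDSA", 256))  # True
```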
Since then, post-quantum algorithms have been standardised, and NIST feels it’s unfair to expect two big shifts in cryptography in a short space of time. As a result, they are encouraging organisations to shift directly from 112-bit security to post-quantum.
This is pragmatic and will reduce costs. But it does leave a security hole.
To give organisations time to move to post-quantum, they're removing the 2030 ban on 112-bit security. Instead, it will be merely deprecated. Meanwhile, there isn’t yet a NIST timeline for moving to post-quantum algorithms. I assume the date will be significantly after 2035 since that is the timeline for federal government migrations.
Why is this important? Well, it means that in 2036 (perhaps) a system might still be running on only 2048-bit RSA. Not only is such a system more vulnerable to classical attacks, but 2036 is also right in the range of when quantum computers might break cryptography. And a less powerful quantum computer is needed to break a 2048-bit key than a longer one.
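To make that last point concrete, here's a toy comparison, assuming the textbook scaling of Beauregard's Shor circuit (roughly 2n+3 logical qubits and on the order of n³ gates, ignoring log factors, to factor an n-bit modulus). Real fault-tolerant resource estimates differ a lot, so treat the numbers as relative rather than absolute.

```python
# Back-of-the-envelope Shor's-algorithm resource comparison (illustrative only).
# Assumes Beauregard's circuit: ~2n+3 logical qubits and on the order of n^3 gates
# (ignoring log factors) to factor an n-bit RSA modulus.
def shor_resources(modulus_bits: int) -> dict:
    return {
        "logical_qubits": 2 * modulus_bits + 3,
        "relative_gate_cost": modulus_bits ** 3,
    }

rsa2048 = shor_resources(2048)
rsa3072 = shor_resources(3072)

print(rsa2048)  # {'logical_qubits': 4099, 'relative_gate_cost': 8589934592}
print(rsa3072)  # {'logical_qubits': 6147, 'relative_gate_cost': 28991029248}

# 3072-bit RSA needs ~50% more logical qubits and ~3.4x the gates under this model,
# so a machine capable of breaking 2048-bit keys may still fall short of 3072-bit ones.
print(rsa3072["relative_gate_cost"] / rsa2048["relative_gate_cost"])  # 3.375
```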
I shall be submitting a comment to suggest they rethink this approach. Yes, it will be harder to make two shifts. However, we know from experience that people avoid changing cryptography unless they are forced to do so. And even then, they drag their heels.
Let me know in a reply if you agree or disagree with this.
Cryptanalysis in Short Supply
With the fuss around the new PQC standards, it’s easy to forget NIST is still hunting for new signature algorithms.
Last week, the selection process entered its second round, with 14 candidates surviving from a pool of 40. This round of analysis will last up to eighteen months and will be followed by a third and final round.
Worryingly, NIST noted that “some of the second-round candidates have received little or no published cryptanalysis”.
Perhaps the academic community has grown tired of examining dozens of algorithms. Or maybe they’re focusing on breaking the new standards since that brings far more kudos...
You can read the report from NIST here: https://csrc.nist.gov/pubs/ir/8528/final.
CPU, GPU, … VPU?
Have you ever heard of a verifiable processing unit (VPU)? Me neither. But a startup claims VPUs will revolutionise high-performance cryptography, like GPUs revolutionised AI.
The company, Fabric, argues that high-performance cryptography needs hardware acceleration. However, the options available today are not great. Either you try your best with GPUs, which are not designed for cryptographic operations, or you spend a lot of money to build custom silicon.
They are working on a third option – a brand-new type of chip, with an instruction set designed to accelerate trendy crypto operations, like zero-knowledge proofs and fully homomorphic encryption.
I’ve never met the team at Fabric and have no idea if their technology is sound. But I shall watch with interest to see if the concept of a VPU takes off.
You can read a one-sided and investor-focused overview of their technology here: https://www.blockchaincapital.com/blog/the-dawn-of-a-new-era-in-cryptography-with-fabrics-innovative-vpu-technology.
Separately, this reminds me of the ongoing efforts to build a memory-safe processor architecture, which started as the CHERI project at Cambridge University. The CHERI team has since partnered with ARM and Microsoft to bring this closer to reality: https://newsroom.arm.com/news/morello-research-program-hits-major-milestone-with-hardware-now-available-for-testing.