Welcome to the seventh edition. This week we explore how protocols are harder than they sound. Let’s dive right in…
Squashing Post-Quantum Bugs
Cloudflare just shared some fascinating details about rolling out post-quantum cryptography (PQC) at scale.
As a reminder, Cloudflare servers sit between users and the website servers they are trying to access. Cloudflare protects the website servers from denial-of-service attacks, as well as providing users with cached copies of website data to improve performance.
In this latest step towards quantum resilience, Cloudflare is deploying PQC algorithms to protect traffic on the final hop between their servers and the website servers.
Along the way, they've had to wrestle with some interesting design decisions. For instance, many website servers do not support PQC algorithms yet, so it may not be efficient to pre-compute PQC key shares, as TLS 1.3 allows us to do.
Equally, they have had to handle routing hardware that drops packets because it doesn't anticipate the size of a Kyber key share.
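To get a feel for the packet-size problem, here's a back-of-the-envelope sketch in Python. The X25519 and Kyber768 sizes are the published ones; the figure for the rest of the ClientHello is an assumption for illustration only.

```python
# Back-of-the-envelope: why a post-quantum ClientHello can overflow a packet.
X25519_KEY_SHARE = 32            # bytes (published size)
KYBER768_PUBLIC_KEY = 1184       # bytes (published size)
REST_OF_CLIENTHELLO = 300        # assumed: SNI, ALPN, other extensions, headers

classical_hello = X25519_KEY_SHARE + REST_OF_CLIENTHELLO
hybrid_hello = X25519_KEY_SHARE + KYBER768_PUBLIC_KEY + REST_OF_CLIENTHELLO

TYPICAL_MTU = 1500               # bytes; IP/TCP headers eat into this in practice

print(f"classical: {classical_hello} bytes, hybrid: {hybrid_hello} bytes")
# Hardware that assumes a ClientHello always fits in one packet breaks
# once the hybrid hello spills over the MTU.
```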
I won't repeat all the details here, since Cloudflare writes excellent blogs. I will just note that while some companies deploy PQC as a marketing tool, Cloudflare seems to be genuinely trying to uncover bugs for the benefit of everyone. Kudos to them.
Don’t Judge an Algorithm by Its Name
We need more of this! The UK NCSC publishes new cryptographic modes to address the biggest threat of all...
Bad coding.
Recognising that many cryptographic screw-ups are caused by human error, the NCSC has published two block cipher modes that reduce this risk.
Block cipher modes define how to build a practical cryptographic system out of a building block like the AES algorithm. They describe how to cut up a big message into chunks, which can be encrypted and woven together to form an encrypted message.
Many of the older cipher modes, like ECB or CBC, have fallen out of favour due to security weaknesses. Other modes are secure when used correctly, but prone to errors.
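To see how easily an existing mode can be misused, here's a minimal sketch using the Python cryptography library's AES-GCM interface. It's a stand-in for the kind of mistake the NCSC is worried about, not anything taken from their paper.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

nonce = os.urandom(12)  # must be unique for every message under the same key
ct1 = aesgcm.encrypt(nonce, b"first message", None)

# The classic blunder: reusing the nonce for a second message.
# Nothing errors out, but GCM's confidentiality and integrity
# guarantees quietly collapse.
ct2 = aesgcm.encrypt(nonce, b"second message", None)
```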
By contrast, the two new modes from NCSC are designed to be idiot-proof.
The modes are known as GLEVIAN and VIGORNIAN, which I consider to be reassuringly terrible names. The authors are clearly focused on the art of cryptography, rather than marketing. Which is what we want.
The NCSC is aiming for these modes to be adopted into NIST standards in due course. If you want to learn more, links to their announcement and paper are below.
NCSC announcement: https://www.ncsc.gov.uk/blog-post/building-on-our-history-cryptographic-research.
Direct link to paper: https://eprint.iacr.org/2023/1379.
What The Simpsons Teaches Us about Crypto
This made me laugh (and cry a little).
Somebody created a website that tracks how many days it has been since the last vulnerability caused by JSON Web Tokens (JWTs).
Like the "[X] days without an accident" sign in The Simpsons, I don't think the number ever gets very high.
Many authorisation systems rely on JWTs. Once you log into such a system, you are issued a JWT as a credential. It should be a digitally signed document that attests to your identity, plus the permissions you have been granted.
The auth system can sign a JWT with different algorithms. Because of this, there is an "alg" field in the JWT header that specifies which signing algorithm was used. It might say "alg=HS256", for instance.
So far, so good. However, the specification was designed to be (overly) flexible and support unsigned JWTs too. So it is conformant to say "alg=none".
Perhaps you can now see the problem. Too many systems will blindly trust the contents of the JWT. This includes ignoring the signature validation if the "alg" parameter is "none".
An attacker can create a conformant JWT by filling in whatever permissions they want, specifying "alg=none", and sending it to the server.
Whoops. Unfortunately, allowing so much flexibility in the specification was a recipe for disaster when it came to implementation.
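To make the failure concrete, here's a minimal sketch of a forged token using only the Python standard library. The claims are hypothetical, and a sensible verifier would reject the result by validating the signature against an explicit allow-list of algorithms.

```python
import base64
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded base64url encoding.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

# Hypothetical claims an attacker would like the server to trust.
header = {"alg": "none", "typ": "JWT"}
claims = {"sub": "attacker", "role": "admin"}

# With "alg": "none", the signature section is simply left empty.
forged = ".".join([
    b64url(json.dumps(header).encode()),
    b64url(json.dumps(claims).encode()),
    "",
])
print(forged)  # a spec-conformant, unsigned token
```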
Link to the website: https://www.howmanydayssinceajwtalgnonevuln.com.
A longer write-up on the topic, including suggested mitigations: https://medium.com/@phosmet/forging-jwt-exploiting-the-none-algorithm-a37d670af54f.
Superposition of Opinions
Was the NSA wrong when it criticised quantum key distribution (QKD)?
Back in 2020, the NSA put out some strong statements arguing against the use of QKD. They listed 5 limitations:
QKD is only a partial solution (because it requires classical authentication).
QKD requires special purpose equipment.
QKD increases infrastructure costs and insider threat risks.
Securing and validating QKD is a significant challenge.
QKD increases the risk of denial of service.
Strong stuff.
Yet, earlier this summer, two leading physics researchers published a paper arguing that all of these limitations will be resolved as the technology matures. They further argue that several of the criticisms are unfair, since they apply equally to classical systems.
As usual, when two groups of very bright people say different things, the truth likely lies somewhere in between. I'm optimistic about the future of QKD, but it's not a silver bullet and there are still many challenges to solve before it's deployed at scale.
If you're interested in the details of the rebuttal, check out the introductory section of the paper (link below). It's a short read if you stick to the summary.
Link to the paper: https://arxiv.org/abs/2307.15116.
Link to the NSA statement: https://www.nsa.gov/Cybersecurity/Quantum-Key-Distribution-QKD-and-Quantum-Cryptography-QC/.
Preventing Internet Rust
Here's a brilliant idea for enforcing crypto-agility. It's called GREASE, and I had no idea web browsers were using it.
When your web browser creates a secure TLS session, it advertises the cryptographic algorithms it supports. This takes the form of a list of numbers, which identify different cipher suites.
For instance, the cipher suite "TLS_AES_256_GCM_SHA384" is represented by a number, and it means 256-bit AES encryption in Galois/Counter Mode with SHA-384 hashing.
A server should ignore cipher suite numbers it doesn't recognise. This allows new cipher suites to be introduced without breaking backwards compatibility.
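As a minimal sketch of that "ignore what you don't recognise" rule: the three code points below are the real TLS 1.3 values from RFC 8446, while the selection logic itself is purely illustrative.

```python
# Real TLS 1.3 cipher suite code points (RFC 8446).
KNOWN_SUITES = {
    0x1301: "TLS_AES_128_GCM_SHA256",
    0x1302: "TLS_AES_256_GCM_SHA384",
    0x1303: "TLS_CHACHA20_POLY1305_SHA256",
}

def select_suite(offered: list[int]) -> str | None:
    # Correct behaviour: skip code points we don't recognise
    # rather than aborting the handshake.
    for code in offered:
        if code in KNOWN_SUITES:
            return KNOWN_SUITES[code]
    return None  # only fail when there is genuinely no overlap

# A client offering an unknown (perhaps future) suite first should still connect.
print(select_suite([0xABCD, 0x1302]))  # -> "TLS_AES_256_GCM_SHA384"
```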
However, some networking hardware and servers were getting into the habit of rejecting unknown cipher suites. This was creating a ticking time bomb that would explode when the world tried to deploy a new cipher suite. Much like we are trying to do right now with post-quantum crypto.
The solution was simple and elegant. It's called GREASE, which stands for "Generate Random Extensions And Sustain Extensibility".
The idea behind GREASE is that web browsers send random fake cipher suites when connecting to a server. They do this in every connection, which forces Internet hardware to cope with crypto agility.
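Here's a minimal sketch of how a client might mix a GREASE value into its offer. The sixteen reserved cipher suite values come straight from RFC 8701; the rest is illustrative.

```python
import random

def grease_cipher_suite() -> int:
    # RFC 8701 reserves sixteen cipher suite values of the form
    # 0x0A0A, 0x1A1A, ..., 0xFAFA for GREASE.
    return random.randrange(16) * 0x1010 + 0x0A0A

# Prepend a random GREASE value to the genuine TLS 1.3 offer.
offered = [grease_cipher_suite(), 0x1301, 0x1302, 0x1303]
print([hex(c) for c in offered])
```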
I love clever little ideas like this. And I'm wondering if we can apply these lessons elsewhere in cryptography, as we move towards an age where agility is going to be very important.
You can read more about GREASE in the link to the RFC below.
Link to the RFC that defines GREASE: https://datatracker.ietf.org/doc/html/rfc8701.