8

UUIDs use 128-bit numbers, and it is currently not feasible to randomly produce two UUIDs that are the same. Does that mean 128 bits of entropy is suitable for all cryptographic operations? I would assume that cryptographic algorithms that use keys larger than 128 bits do so because there are mathematical relationships between the plain text, the encrypted text, the private key, and the public key. For example, if you had the plain text and the encrypted text, a weaker algorithm would allow you to attack the key used, by exploiting the mathematical relationships between the two known parameters to get to the third one.

To put it another way: if I seed a cryptographically secure pseudorandom number generator (CSPRNG) with 128 bits of entropy, is it safe to use that CSPRNG to generate any of the following keys? (A rough sketch of what I mean follows the list.)

  1. 4096-bit RSA key?
  2. 256-bit ECC key?
  3. 256-bit AES key?
  4. 256-bit HMAC key?
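Concretely, something like this Python sketch is what I have in mind (SHAKE-256 is just standing in for a proper CSPRNG/KDF here, and the function name `expand_seed` is mine, not from any library or standard):

```python
import hashlib
import os

def expand_seed(seed: bytes, n_bytes: int, label: bytes) -> bytes:
    """Deterministically stretch a 128-bit seed into n_bytes of key material.

    SHAKE-256 is used here only as a convenient extendable-output function;
    a real system would use a vetted DRBG or KDF instead.
    """
    assert len(seed) == 16  # exactly 128 bits of entropy
    return hashlib.shake_256(label + seed).digest(n_bytes)

seed = os.urandom(16)                       # the 128 bits of entropy in question
aes_key  = expand_seed(seed, 32, b"aes")    # "256-bit" AES key
hmac_key = expand_seed(seed, 32, b"hmac")   # "256-bit" HMAC key
# For RSA or ECC, the same stream would be fed into a deterministic key generator.
```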
guissoares
  • 103
  • 2
dodtsair
  • 93
  • 1
  • 3

2 Answers

21

128 bits of entropy are enough. The whole and only point of considering entropy is to make sure that the system can resist brute force attacks: the space of possible values must be so large that any attacker could only try a negligible proportion of the values in non-ludicrous time. There are strong reasons why 128 bits are highly sufficient for that. Realistically, a very powerful organization with lots of cash to spare (including behemoths like Apple or Google, who can muster far more resources than usual scarecrows like the NSA) could hope to perform, say, at most 2^85 elementary computations (like encrypting an AES block) within a year -- it won't be discreet, and that's still less than the millionth part of the millionth part of a 2^128 space.
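For the skeptical, that ratio is easy to check with exact integer arithmetic (taking the 2^85 figure above at face value; it is only a rough upper bound):

```python
# How big a fraction of a 2^128 key space is 2^85 trials per year?
trials_per_year = 2**85
key_space = 2**128

print(key_space // trials_per_year)   # 2^43 = 8796093022208, i.e. about 8.8e12
print(trials_per_year / key_space)    # about 1.1e-13, less than one millionth
                                      # of one millionth (1e-12) of the space
```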

In your list of algorithms, if you generate a 256-bit AES key from a 128-bit seed and a strong PRNG, then you do not really have a 256-bit key. From the attacker's point of view, this is a 128-bit key. Mind you, that's still more than enough to deter the attacker. But claiming "256 bits" here verges on the fraudulent (and, in the case of AES, implies a +40% runtime cost for no actual gain). The same can be said about HMAC, and, actually, all algorithms that are part of "symmetric cryptography".
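To see why the derived key is only as strong as its seed, here is a toy illustration with a deliberately tiny 16-bit seed so the attack finishes instantly (SHA-256 is merely standing in for the seeded PRNG; with a real 128-bit seed the same loop would simply take about 2^128 iterations):

```python
import hashlib
import os

def derive_256bit_key(seed: bytes) -> bytes:
    # Stand-in for "strong PRNG seeded with `seed`, outputting 256 bits".
    return hashlib.sha256(seed).digest()

secret_seed = os.urandom(2)                   # toy seed: only 16 bits
target_key = derive_256bit_key(secret_seed)   # looks like a 256-bit key...

# ...but the attacker only has to enumerate the 2^16 possible seeds:
for guess in range(2**16):
    if derive_256bit_key(guess.to_bytes(2, "big")) == target_key:
        print("recovered seed:", guess.to_bytes(2, "big").hex())
        break
```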

For asymmetric algorithms, one can try to make estimates of strength and to seek equivalences with symmetric keys. This is very hard to do, because attacks on (say) RSA and AES do not use the same kind of technology at all (to make things simple: for RSA you need a lot of very, very fast RAM; for AES you need no RAM at all, and should use relatively slow circuits to make them really cheap to build and operate). See this site for what several groups of smart people have come up with. The general consensus is that a 256-bit elliptic curve (at least, a "normal" curve) ought to be somewhat equivalent to a 128-bit symmetric key in strength; a 4096-bit RSA key might be a bit beyond, assuming that this assertion makes sense, which it does not (there is nothing stronger than "cannot break it", so comparing strengths beyond 100 bits or so is kinda meaningless anyway).


There is a relatively recent idea floating around about quantum computers, which would somehow require doubling the number of bits. The theoretical reason is that Grover's search algorithm can explore a space of 2^n inputs to a function (implemented on a quantum computer) by invoking the function only 2^(n/2) times. Under that theory, you would need a 256-bit key to achieve the magical "128-bit security" (the raw numbers behind this rule of thumb are worked out right after the list below). However, there are two principal flaws in that theory:

  • The flaw that everybody points out is that quantum computers do not exist yet (well, almost everybody; some people purport to have quantum computers on sale). Decoherence is apparently very hard to surmount, and we do not really know how to do it except by making the whole machine very, very cold, and then praying that the Universe won't catch on to what we are trying to do for a few more milliseconds.

  • The flaw that nobody points out, but which is at least as important as the other one, is that there is no indication that an elementary computation on a quantum computer would consume an amount of energy similar to that of an elementary computation on a classical computer. This is, however, a crucial point. Since the invention of the transistor, computers have steadily become more efficient; the trend was theorized as Moore's law. There are also increasingly clear indications that Moore's law is about to stop (Gordon Moore himself said so), mostly because we are close to the minimal size for electronic gates (if we make them any smaller then they leak madly through quantum tunnelling) and we are out of fresh ideas: the increase in power since the 1970s fed on a pool of ideas that had all been conceived and enunciated in the 1970s, and now we are at our wits' end.

    So we will soon reach the limits of classical computing and thus will actually know, with some appreciable reliability, just how much of a key space can be explored with classical computers. However, it took about 70 years to reach that point.

    For quantum computers, since they don't really exist yet, we are at day one. It takes an awful lot of wishful thinking to assert that, 70 years from now, we will have reached the maximal optimization of quantum computers and that this optimization will bring the cost of elementary quantum operations within the same order of magnitude as the cost of elementary classical operations. In fact nobody knows anything about that yet.
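For reference, here are the raw query counts behind that "double the key size" rule of thumb, under the (very charitable, as argued above) assumption that one Grover iteration costs about as much as one classical trial:

```python
# Grover's search needs ~2^(n/2) function evaluations where a classical
# brute force needs ~2^n trials.
for n in (128, 256):
    print(f"{n}-bit key: classical ~2^{n} trials, Grover ~2^{n // 2} evaluations")

# Under that accounting a 256-bit key still leaves ~2^128 quantum work,
# which is where the "double the number of bits" advice comes from.
```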

As an illustration of the difficulty of conceiving a clear notion of what technology will be able to achieve in the future, consider the following anecdote (which I recently read in that book): back around 1810, more than 25 years after the historic first free flight of Pilâtre de Rozier in an untethered balloon, the greatest thinkers of the time had conceived what the new invention could bring to mankind. In their view, the summit of the usefulness of a balloon was to tether it to your carriage, thereby making it lighter and allowing it to be drawn about with fewer horses. Think about that whenever you feel visionary.

Quantum computer efficiency being doubly speculative at this point, it is not really rational to include it in any thinking about key length. A much more salient point is that when quantum computers work, even if they do so abysmally slowly, then RSA, Diffie-Hellman and elliptic curves are toast. That is worth investigating, and it is what studies on post-quantum cryptography concentrate on.

Thomas Pornin
  • 322,884
  • 58
  • 787
  • 955
  • 1
    Brilliant answer. It could be worth pointing out that you can generate RSA and EC key pairs from 128 bits of entropy *as long as the algorithm is fully deterministic and static* (and preferably known, so you can copy it elsewhere). This is not standard practice and fraught with dangers, but it *can* be done and *is* secure. OK, I'm off, I have to get flying horse carriages (w. horses) out of my head :) – Maarten Bodewes Oct 15 '15 at 16:20
  • If there are `3.28 * 10^80` particles in the observable universe wouldn't `log2(3.28 * 10^80) = 267.4` bits be better than 128 bits? – Brandon Feb 10 '22 at 06:42
0

For some uses, 128 bits isn't actually as strong as it sounds.

For example, a cryptographically secure hash algorithm may only be about half as secure against collision attacks as you would expect. Consider a digital signature (which actually signs just a hash of the data, rather than running an expensive asymmetric cryptographic algorithm over the full data); if the attacker can choose the valid input, then a birthday attack can be used to make the same signature also trusted for a malicious input. Finding such a pair of valid and malicious inputs only requires testing about 2^(N/2) pairs (where N is the hash size in bits). A 128-bit hash would thus require only about 2^64 pairs, which is extremely expensive for modern hardware but approaching feasibility.
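As a toy illustration of that birthday bound, the sketch below searches for a collision on a hash truncated to 32 bits, so roughly 2^16 random inputs suffice and the script finishes in moments; the principle is identical for a full-size hash, just with about 2^(N/2) work:

```python
import hashlib
import os

def truncated_hash(data: bytes, n_bytes: int = 4) -> bytes:
    # A 32-bit "hash", so a collision shows up after roughly 2^16 inputs.
    return hashlib.sha256(data).digest()[:n_bytes]

seen = {}          # maps truncated hash -> one message that produced it
attempts = 0
while True:
    msg = os.urandom(16)
    attempts += 1
    h = truncated_hash(msg)
    if h in seen and seen[h] != msg:
        print(f"collision after {attempts} attempts:",
              seen[h].hex(), "and", msg.hex(), "->", h.hex())
        break
    seen[h] = msg
```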

The other major concern, aside from birthday attacks, is quantum computing (not to be confused with quantum cryptography, which is quite different). Quantum computers can, in theory, cut the effective entropy of a cryptographic key in half. Thus, if you want 128 bits of effective entropy (and the key needs to last long enough for quantum computers to become able to operate on keys of significant length), you should use 256 bits of random data.

CBHacking
  • 42,359
  • 3
  • 76
  • 107
  • 2
    A cryptographically-secure hash algorithm does *not* produce output indistinguishable from random. Its only properties are collision resistance and pre-image resistance, but the bits that it produces should *not* be considered as bits of entropy. – puzzlepalace Oct 07 '15 at 23:56
  • I don't believe I stated anything to the contrary, @puzzlepalace. Still, the question was about "bits of entropy", so the part about birthday attacks on hash functions is arguably off-topic for the reason you mention. – CBHacking Oct 08 '15 at 01:20