
When you think about typical SSL interception in firewalls, depending on the number of TLS connections, large numbers of certificates get forged "on the fly" in a very time-critical environment.

In my experience running a VPN service based on OpenVPN, you always want your entropy pool high enough to be sure that there is always enough entropy for creating "good" new certificates.

So, considering the massive number of generated certificates, how do typical firewalls ensure they have enough entropy? Do they even do that? And if not, is this just a theoretical security risk, or are there known attacks based on the possibly weak entropy?

architekt
  • I am not sure I understand exactly what you are asking, but entropy is ensured by the RNG (random number generator). If it is FIPS-certified, you can be sure the entropy is OK. – Fis May 31 '17 at 07:24
  • It will vary by appliance. Typically, the larger the firewall, the more CPU and memory it will have; it can thus generate more certificates and larger pools of entropy, because larger appliances generally make use of hardware random number generators (HRNGs). – ISMSDEV May 31 '17 at 07:24
  • You only need to seed a PRNG once with sufficient entropy (~200 bits). It can then output unlimited amounts of cryptographically secure pseudo-random data that's indistinguishable from true random data (see the sketch below). – CodesInChaos May 31 '17 at 08:29
  • Of course, it is better to use an HRNG, e.g. one based on p-n junction noise (i.e. a transistor or diode). – Fis May 31 '17 at 13:09
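A minimal sketch of CodesInChaos's point above, assuming nothing about any particular product (the hash-counter construction here is a toy for illustration, not a production DRBG such as those specified in NIST SP 800-90A):

    # Toy hash-counter generator: seeded exactly once with 32 bytes
    # (256 bits) of OS entropy, then able to emit arbitrary amounts of
    # deterministic output. Illustrative only, NOT a production DRBG.
    import hashlib
    import os

    class ToyDRBG:
        def __init__(self, seed: bytes):
            self.seed = seed        # seeded once, never reseeded
            self.counter = 0

        def read(self, n: int) -> bytes:
            out = b""
            while len(out) < n:
                block = hashlib.sha256(
                    self.seed + self.counter.to_bytes(8, "big")
                ).digest()
                out += block
                self.counter += 1
            return out[:n]

    drbg = ToyDRBG(os.urandom(32))  # all the "real" entropy, drawn once
    print(drbg.read(64).hex())      # unlimited further output on demand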

2 Answers


Yes, there is a risk here, though it is almost entirely limited in scope to the network/users of such proxy/firewall devices. Low entropy could also have a very minor effect on the client nonce in the TLS handshake, affecting outbound connections from the proxy/firewall device.

Low entropy might mean guessable primes, which means guessable private keys, which can lead to viable MITM attacks. If there's a low-entropy issue on such a system, and the CA key is the first thing generated, then you might have a real problem.

(There's an obvious beartrap with such systems if using a shared or vendor provided key/CA, but that's quite a different problem.)

Greater risk is generally assigned to issues of proper chain and revocation verification, and to poor configuration leading to the use of insecure legacy SSL versions or ciphers; see US-CERT TA17-075A, HTTPS Interception Weakens TLS Security. (To this I would add insufficient protection of private keys: at least one commercial product supports HSMs, though I suspect that only works for the internal CA rather than for the on-demand certificates.)

The potential issue here is that if the entropy is low, the primes may be predictable. This alone doesn't make the modulus easier to factorise (to determine the private key), but it does make a brute-force attack viable: brute-forcing the PRNG input (the attacker knows the PRNG) to attempt to discover a prime (and hence a key) that's in use. If you happen to be an evil administrator with access to such a proxy, from which you can harvest keys and certificates, then you have the deck stacked in your favour when attacking someone else's proxy. As an evil user you might be able to generate enough traffic to find collisions, but this is less interesting.
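To illustrate the brute-force idea, here is a minimal sketch with deliberately toy parameters (64-bit primes, a 12-bit seed space, and Python's non-cryptographic Mersenne Twister standing in for a weakly seeded PRNG; real keys differ only in scale):

    # If a device derives its RSA primes from a PRNG with a guessable
    # seed, an attacker who knows the algorithm can enumerate the seeds,
    # replay the prime generation, and test each candidate against the
    # public modulus. No factoring is involved.
    import random

    def is_prime(n: int) -> bool:
        # Deterministic Miller-Rabin, valid for n < 3.3e24 with these bases.
        if n < 2:
            return False
        for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
            if n % p == 0:
                return n == p
        d, s = n - 1, 0
        while d % 2 == 0:
            d, s = d // 2, s + 1
        for a in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False
        return True

    def gen_prime(rng: random.Random, bits: int) -> int:
        # Naive generate/test/discard loop, as a weak device might run it.
        while True:
            cand = rng.getrandbits(bits) | (1 << (bits - 1)) | 1
            if is_prime(cand):
                return cand

    # "Device" side: low-entropy seed (only 12 bits here, for speed).
    secret_seed = random.randrange(2 ** 12)
    rng = random.Random(secret_seed)
    p, q = gen_prime(rng, 64), gen_prime(rng, 64)
    modulus = p * q                 # the public value an attacker sees

    # "Attacker" side: enumerate the seed space and replay the generation.
    for seed in range(2 ** 12):
        candidate = gen_prime(random.Random(seed), 64)
        if modulus % candidate == 0:
            print(f"seed {seed} recovered, p = {candidate}")
            break

The attacker never factors anything; knowing the generation algorithm reduces the search from the keyspace to the (small) seed space.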

In practice (certainly in the case of OpenSSL) a PRNG is used to generate the key source material (to search for primes), so very little "real" entropy is required. A proxy is generally a good candidate for harvesting entropy from timing and hardware events (a potentially busy network and disk, with many random connections).

I estimate that roughly 90-100 kiB of PRNG output would generally be required for a 2048-bit key generation (2× 1024-bit primes, with a naïve generate/test/discard algorithm), requiring only a handful of non-PRNG bytes of "real" entropy as a seed (32 bytes, by observation). The delay in key generation is actually the primality testing.
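This estimate can be checked roughly by wrapping the system RNG in a byte counter and running a naïve generate/test/discard search for two 1024-bit primes. The sketch below is my own, not OpenSSL's actual algorithm (which sieves candidates and so draws somewhat fewer bytes):

    # Count the raw bytes a naive 2048-bit RSA key generation draws
    # from the RNG: two 1024-bit primes, 128 bytes per candidate.
    import os
    import random

    class CountingRNG:
        def __init__(self):
            self.bytes_drawn = 0

        def getrandbits(self, bits: int) -> int:
            n = (bits + 7) // 8
            self.bytes_drawn += n
            return int.from_bytes(os.urandom(n), "big") >> (n * 8 - bits)

    SMALL_PRIMES = [p for p in range(3, 1000)
                    if all(p % q for q in range(2, int(p ** 0.5) + 1))]

    def probably_prime(n: int, rounds: int = 5) -> bool:
        for p in SMALL_PRIMES:          # cheap trial division first
            if n % p == 0:
                return n == p
        d, s = n - 1, 0
        while d % 2 == 0:
            d, s = d // 2, s + 1
        for _ in range(rounds):         # Miller-Rabin on the survivors
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False
        return True

    rng = CountingRNG()
    for _ in range(2):                  # two 1024-bit primes -> 2048-bit key
        while True:
            cand = rng.getrandbits(1024) | (1 << 1023) | 1
            if probably_prime(cand):
                break
    print(f"PRNG bytes consumed: {rng.bytes_drawn}")

With a prime density of about 1 in 355 for odd 1024-bit candidates, the expected draw is around 45 kiB per prime, in line with the 90-100 kiB figure above (individual runs vary widely).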

For the implications of low-entropy keys you can read the (wittily titled) paper Mining Your Ps and Qs: Detection of Widespread Weak Keys in Network Devices (PDF).

There are two useful papers which discuss security issues around the use of such proxies:

These raise many other security concerns, but randomness/entropy is only referenced briefly in the second one:

Entropy during generation. It is possible that the entropy used during the generation of a new public/private key pair in install-time generated certificates is inadequate. In practice, since most products we analyzed generate a root certificate with RSA keys using OpenSSL, the generation process is expected to call certain known functions, e.g., RAND_seed(), RAND_event(), RSA_generate_key_ex(); we found calls to the last function in many cases. However, we did not investigate further the key generation algorithm in CCAs.

See also:

mr.spuratic

Just search the MITRE CVE database for entropy/randomness issues; you will find some.

e.g.:

Besides OpenVPN-based services, the problem may also stem from chipset firmware vulnerabilities.

mootmoot
  • None of these CVEs applies to the creation of keys for dynamically generated certificates when doing SSL interception. Thus it does not answer the question. – Steffen Ullrich May 31 '17 at 19:19
  • @SteffenUllrich: I interpret the OP's question as "how do typical firewalls ensure they have enough entropy?" – mootmoot Jun 01 '17 at 11:27