3

It's my understanding that the purpose of `.htpasswd` files is to restrict access to some files on the server's filesystem. If an attacker gains access to one, he probably has access to everything else too, so there seems to be little point in using a slow hash in this situation (it would only increase the server load). However, I've just seen that Apache 2.4 introduced support for bcrypt in these files:

*) htpasswd, htdbm: Add support for bcrypt algorithm (requires apr-util 1.5 or higher). PR 49288. [Stefan Fritsch]

The feature request does not offer any explanation of why using a simpler hash makes "passwords stored in those hash functions vulnerable to cracking". Is this just paranoia, or is there a good reason for using a stronger hash in this scenario?

mgibsonbr

2 Answers

3

There are two main reasons why the passwords in a .htpasswd file need secure password hashing:

  1. Though an attacker who can read the .htpasswd file can probably read the other files in the directory, the password may also allow for write access. There are a number of situations where an attacker can read private files but not write to them; for instance, when the attacker just found a discarded hard disk or backup tape in a dumpster. Similarly, passwords may have a lifetime longer than the data that they protect: an old .htpasswd could contain a password which is still valid now, to access data files which did not exist at all then.

  2. Password hashing does not ultimately protect the data, but the password itself. Passwords are sensitive data in their own right, because:

    • Users reuse passwords. A password for a user on a given server may be also valid on other systems where the user has an account. I am personally enough of a maniac to generate different random passwords for all servers on which I have accounts, but I know that most people don't do things that way.

    • Users don't make random passwords. In particular, when forced to change their passwords (as is unfortunately mandated on a regular basis by poorly thought out policies), users rely on sequences: if their current password is Lilongwe37, then chances are that their next password will be Lilongwe38. There again, an old password, even after what looks like a "password reset", will be valuable to an attacker.

For these reasons, correct password hashing in .htpasswd files is important.

Thomas Pornin
1

Password cracking, whether by brute-force, dictionary, or rainbow-table attacks (or any combination of these), works by guessing the password.

On a computing system, guessing passwords takes time, and herein lies the advantage of a slow (i.e. computationally more expensive) hash function. The longer a hash function takes to compute, the longer it takes to successfully guess (crack) a password.

This is why slow hash functions are superior to fast ones with regard to the level of security they provide.
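To make this concrete, here is a minimal Python sketch (not from the answer; it uses PBKDF2 from the standard library instead of bcrypt, purely as an illustration of a hash with a tunable work factor):

```python
import hashlib
import os
import time

def slow_hash(password: bytes, salt: bytes, iterations: int) -> bytes:
    # PBKDF2-HMAC-SHA256: per-hash cost scales linearly with the iteration count.
    return hashlib.pbkdf2_hmac("sha256", password, salt, iterations)

salt = os.urandom(16)

t0 = time.perf_counter()
fast = slow_hash(b"Lilongwe37", salt, 1)
t_fast = time.perf_counter() - t0

t0 = time.perf_counter()
slow = slow_hash(b"Lilongwe37", salt, 200_000)
t_slow = time.perf_counter() - t0

# An attacker pays the same per-guess cost as the defender pays per login,
# so raising the work factor multiplies the cost of every cracking attempt.
print(f"1 iteration:        {t_fast:.6f}s")
print(f"200,000 iterations: {t_slow:.6f}s")
```

The defender computes the hash once per login, while the attacker must compute it once per guess; that asymmetry is why a larger work factor hurts the attacker far more than the legitimate user.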

Indeed, if a hacker can get (root) access to the web server, he or she can steal the .htpasswd file (or rather, its contents) and attempt to crack the passwords offline. The reason the passwords are stored in a (slow-to-compute) hashed format is to provide an additional layer of security: the credentials stored in the file might also be valid on other computing systems or services.

Steven Volckaert
  • "The credentials stored in the file might be used on other computing systems or services" yes, that might be a valid reason indeed. Otherwise, what would be the point of cracking the passwords? The files this password protects are just *there* - sitting alongside the `.htpasswd` file - the attacker can just grab them already! But if the same password also protects *something else* (the likelihood of which I can't guess), then I can see the benefit of properly protecting it. – mgibsonbr Apr 28 '14 at 08:52
  • Won't a slow hash make generating a rainbow table slower, rather than slow its use? – Jay Apr 28 '14 at 11:39
  • @Jay On a given processor architecture, a slow hash function will always take longer to compute than a faster (computationally less expensive) one. Whether the result is stored in a rainbow table or not is unrelated to the hash function's performance. – Steven Volckaert Apr 28 '14 at 11:45
  • @Steven, yes I understand that. My point was more that once the rainbow table has been generated (which will take longer for a slower hash), the speed of use will be the same as any other rainbow table, so in the case of rainbow table attacks, a slow hash will not give increased security. – Jay Apr 28 '14 at 11:51
  • 1
    @Jay That's correct. It seems strange to me that the entries in the `.htpasswd` file don't contain salt values, which would provide protection against rainbow attacks. – Steven Volckaert Apr 28 '14 at 12:23
  • 1
    The `.htpasswd` files _do_ contain salt values. For instance, in this value: `blah:$apr1$FVk/v.Hx$s6bxoKMoIVK8RO26rR.2w1` the salt is the part between the second and third '$' signs (`FVk/v.Hx`). Salts defeat precomputed tables (be they rainbow or not). – Thomas Pornin Apr 28 '14 at 12:55