To strengthen a password hash, you need to do two things to the hashing process:
- to make it unique;
- to make it slow.
"Unique" means that each password should be hashed with its own hash function; which hash function is used can be some public information (i.e. stored along the hash value), but you want to make it different for each password you hash. This is what a salt does: the salt defines the hash procedure to use, among a wide family. Uniqueness defeats rainbow tables: rainbow tables, like all other kinds of precomputed tables, relies on the idea that it is worthwhile to spend some time hashing a lot of passwords and storing the hash values (possibly in a smart way which allows for some extreme compression, like rainbow tables), because the resulting table can be used to attack several passwords, with a very small per-password marginal cost. With salts, each password has its own function, so the table would be good for at most one password only, which destroys its advantages.
Hashing "1000 (or more) times" the password does not include a salt, and, as such, is vulnerable to precomputed tables. For instance, assuming SHA-256 as hash function, here is your hashed password:
f3f19029aa4ef4bde723f49b4e90a7ad51473c54a03589af6fef706bf50d7894
That's the SHA-256 hash of 1000 'a' characters. I could precompute that value because once you have said "1000" you have said it all; no salt, hence no surprise for the attacker.
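The attacker's precomputation is trivial; assuming Python's `hashlib`, it is a one-liner:

```python
import hashlib

# Without a salt, the whole hashing procedure is fixed in advance:
# anyone who knows the scheme ("SHA-256 of 1000 'a' characters") can
# compute the result once and store it in a table, long before
# attacking any particular server.
digest = hashlib.sha256(b"a" * 1000).hexdigest()
print(digest)  # same value every time, on every machine
```

Every machine, every time, gets the same output, which is exactly what makes the value worth precomputing and storing.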
Slowness is about making each password guess as expensive as possible for the attacker. Even with good salting, an individual hashed password can be vulnerable to brute force, aka "dictionary attack" (trying potential passwords), because humans are not nearly as imaginative as computers are powerful. A PC with a GPU can compute a hash function a billion times per second or so. We want a hashing procedure which takes more time to compute -- not too much, because our honest server will also hash passwords when a user logs in, and we do not have infinite CPU either; but we just need to hash, at most, a dozen passwords per second, so we can tolerate a substantially slow hash function.
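Choosing "how slow" is a measurement problem: you pick a per-login time budget and see how many hash iterations your hardware fits in it. A rough sketch (the batch size and 0.05 s budget are illustrative assumptions; a real deployment would calibrate its chosen KDF, such as PBKDF2 or bcrypt, the same way):

```python
import hashlib
import time

def calibrate_iterations(target_seconds=0.05):
    """Estimate how many nested SHA-256 iterations fit in the
    target time budget on this machine."""
    batch = 100_000  # fixed batch used to measure per-hash cost
    h = b"\x00" * 32
    start = time.perf_counter()
    for _ in range(batch):
        h = hashlib.sha256(h).digest()
    elapsed = time.perf_counter() - start
    per_hash = elapsed / batch
    return max(1, int(target_seconds / per_hash))

# At ~0.05 s per hash, the honest server handles about 20 logins
# per second per core -- tolerable for it, painful for an attacker
# who must pay the same cost for every single guess.
print(calibrate_iterations())
```

The point of the calibration is that the iteration count is a tunable parameter: as hardware gets faster, you raise it.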
Usually, slowness is obtained through nested hashing: we hash the password, then hash the resulting hash value, then hash that again, and so on. There are a few tricky details about how and where the salt is inserted. Hashing the concatenation of 1000 copies of the password (or of the password and the salt) -- actually, with the figures above, 1 million copies would be better -- could serve the same purpose, but it is a bit delicate in practice: we want to configure the number of repetitions so that the procedure is tolerably slow, yet with such a system, hashing a 40-character password takes 40 times as long as hashing a 1-character password; if the latter must be slow, the former will be 40 times as slow, which soon becomes intolerable. With nested hashing, hashing time is constant regardless of password length, which makes configuration easier.
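The nested construction looks like this in Python. This is an illustration of the idea only, not something to deploy: where the salt goes is one simplistic choice among many, and exactly such details are where homemade schemes go wrong.

```python
import hashlib
import os

def nested_hash(password, salt, iterations):
    """Simplistic nested hashing: illustrates the idea, NOT a
    vetted construction -- use PBKDF2 or bcrypt instead."""
    h = hashlib.sha256(salt + password).digest()
    for _ in range(iterations - 1):
        h = hashlib.sha256(h).digest()  # each step hashes a fixed-size value
    return h

salt = os.urandom(16)
# Cost is set by the iteration count, not the password length:
# after the first step, every iteration hashes a 32-byte value,
# so a 1-character and a 40-character password take the same time.
h_short = nested_hash(b"a", salt, 100_000)
h_long = nested_hash(b"a" * 40, salt, 100_000)
assert len(h_short) == len(h_long) == 32
```

Because every iteration after the first operates on a fixed-size digest, the total cost depends only on the iteration count, which is exactly the knob you want to configure.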
And, of course, homemade is Bad. Inserting a password and a salt in a hash function, iterated and/or nested, is subtle; there are pitfalls, and, worst of all, you cannot know whether you did it wrong or not. Security cannot be reliably tested. So, you should stick to published and widely deployed standards of good repute, such as bcrypt and PBKDF2. If only because it means that any mishap will not be your fault.
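Sticking to the standard is no extra work: Python's standard library exposes PBKDF2 directly as `hashlib.pbkdf2_hmac`. A minimal sketch (the iteration count here is an illustrative figure -- tune it to your own server's time budget):

```python
import hashlib
import hmac
import os

password = b"correct horse battery staple"
salt = os.urandom(16)   # random per-password salt, stored with the hash
iterations = 600_000    # illustrative; calibrate for your hardware

# PBKDF2 with HMAC-SHA256: salted, iterated, and standardized,
# instead of a homemade nesting scheme.
derived = hashlib.pbkdf2_hmac("sha256", password, salt, iterations)

# Verification at login: recompute with the stored salt and compare
# in constant time to avoid timing leaks.
candidate = hashlib.pbkdf2_hmac("sha256", password, salt, iterations)
assert hmac.compare_digest(derived, candidate)
```

The server stores the salt, the iteration count, and the derived value; when the count needs raising later, new logins can be re-hashed transparently.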