Even unimportant sites should use a reasonably strong password-hashing scheme IMO. People often use the same password for many sites, so a security breach on even an unimportant site can hurt a lot of people.
This. Sadly, password reuse is a problem and sites shouldn't pretend it isn't. Humans also generally have a problem coming up with high entropy passwords. If someone has used a particular password even once, the odds are someone else on the planet has also used it. Without salt, precomputation against known/compromised passwords becomes trivially easy.
At a minimum (a rough sketch follows this list):
a) a modern cryptographically secure hashing algorithm with no known preimage attacks (second-generation RIPEMD, SHA-2, SHA-3, bcrypt, scrypt, Whirlpool, etc.) *
b) 64 bit or greater salt.**
c) hash length of at least 128 bits
d) enforce a minimum password length of 8 characters ***
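To make that baseline concrete, here is a minimal sketch in Python using only the standard library; the function names are just illustrative, and the KDF-based approach below is what you'd actually want for anything new.

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Baseline scheme: per-user 64-bit random salt + SHA-256 digest."""
    if len(password) < 8:                    # item d: minimum length
        raise ValueError("password must be at least 8 characters")
    salt = os.urandom(8)                     # item b: 64-bit random salt
    digest = hashlib.sha256(salt + password.encode("utf-8")).digest()  # items a, c: 256-bit SHA-2 digest
    return salt, digest                      # store both alongside the user record

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    candidate = hashlib.sha256(salt + password.encode("utf-8")).digest()
    return hmac.compare_digest(candidate, stored)   # constant-time comparison
```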
An even stronger solution is:
a) use a key derivation function designed to slow down brute force attacks (key stretching). Examples include bcrypt, scrypt, and PBKDF2 ***
b) enforce a minimum password length; 8 characters is acceptable, but for higher security applications adding even a single character (9 characters) provides a significant security margin ***
c) check the user's password against lists of known compromised passwords and reject any matches.
For example, using bcrypt, requiring a minimum of 9 characters, and ensuring the password isn't on any compromised-password dictionary list makes the probability of brute forcing the password negligible, even using botnets, cloud computing, or dedicated (non-existent) ASICs. It is also likely to remain negligible even considering the advancements in computing power over the next couple of decades. For a more exotic solution which provides the site plausible deniability and puts all the security requirements on the user, one could use public key signing (a Bitcoin address or PGP) as the method of authenticating (logging on) users.
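For what it's worth, the stronger recipe might look something like this in Python, assuming the third-party bcrypt package and some local file of known-compromised passwords (the file name here is just a placeholder):

```python
import bcrypt  # third-party package: pip install bcrypt

# Hypothetical dictionary of known-compromised passwords, one per line.
with open("compromised_passwords.txt", encoding="utf-8") as f:
    COMPROMISED = {line.strip() for line in f}

def register(password: str) -> bytes:
    if len(password) < 9:
        raise ValueError("password must be at least 9 characters")
    if password in COMPROMISED:
        raise ValueError("password appears on a compromised-password list")
    # bcrypt generates and embeds its own salt; the cost factor controls key stretching.
    return bcrypt.hashpw(password.encode("utf-8"), bcrypt.gensalt(rounds=12))

def login(password: str, stored_hash: bytes) -> bool:
    return bcrypt.checkpw(password.encode("utf-8"), stored_hash)
```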
For those who want an appeal to authority, this is what NIST recommends as a minimum (sketched in code after the list):
a) Key Derivation Function: PBKDF2 using SHA-2 (SHA-3 may eventually qualify, but it postdates this document)
b) Min salt length: 128 bits
c) Min digest (hash) size: 112 bits
d) Min number of iterations: 1,000 for time sensitive applications (for high security situations that are not time sensitive, a much higher iteration count based on available computing power should be used, potentially up to 1,000,000 iterations)
e) Min password length: 10 characters, which should consist of mixed symbols, numbers, upper case, and lower case (e.g. "D&Twtf?123")
f) Min passphrase length: 30 characters, which can be case-insensitive alphabetical only (e.g. "my name is death and taxes and death and taxes is my name")
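Those minimums translate fairly directly into Python's standard library (hashlib.pbkdf2_hmac); a rough sketch, with the caveat that 1,000 iterations really is a floor and real deployments should use far more:

```python
import hashlib
import hmac
import os

ITERATIONS = 1_000     # NIST floor for time-sensitive applications; use far more if you can
SALT_BYTES = 16        # 128-bit salt
DIGEST_BYTES = 32      # 256-bit output, comfortably above the 112-bit minimum

def derive(password: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt,
                               ITERATIONS, dklen=DIGEST_BYTES)

def hash_password(password: str) -> tuple[bytes, bytes]:
    if len(password) < 10:                  # NIST minimum password length
        raise ValueError("password must be at least 10 characters")
    salt = os.urandom(SALT_BYTES)
    return salt, derive(password, salt)

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    return hmac.compare_digest(derive(password, salt), stored)
```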
Understand that NIST is a US government agency, so their exclusion of an algorithm doesn't mean the algorithm is insecure; it just means that governments like everything in nice neat packages. Still, there is nothing wrong with following the NIST requirements, they are just a little restrictive.
Reference NIST publication 800-132 (Dec 2010)
http://csrc.nist.gov/publications/nistpubs/800-132/nist-sp800-132.pdf

Another potential source for "how to do it right" is the Bitcoin wallet source code. The Bitcoin wallet doesn't store passwords, but it does derive the encryption key from the user-supplied password.
It uses PBKDF2 with SHA-2, a 256 bit key, and tens of thousands of iterations (the exact number depends on the computing power of the machine running the wallet).
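I haven't reproduced the wallet's exact logic here, but the interesting part, tuning the iteration count to whatever machine it is running on, can be sketched along these lines (the probe size and timing target are illustrative, not the wallet's actual values):

```python
import hashlib
import os
import time

def calibrate_iterations(target_seconds: float = 0.1) -> int:
    """Estimate how many PBKDF2 iterations this machine can run in roughly
    target_seconds; illustrative numbers, not the wallet's real parameters."""
    salt = os.urandom(16)
    probe = 25_000
    start = time.perf_counter()
    hashlib.pbkdf2_hmac("sha512", b"timing probe", salt, probe, dklen=32)
    elapsed = time.perf_counter() - start
    return max(probe, int(probe * target_seconds / elapsed))

def derive_wallet_key(passphrase: str, salt: bytes, iterations: int) -> bytes:
    # 256-bit key derived from the user-supplied passphrase; never stored.
    return hashlib.pbkdf2_hmac("sha512", passphrase.encode("utf-8"), salt,
                               iterations, dklen=32)
```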
Notes:
* The entire MD series of cryptographic hashes and SHA-0 are horribly insecure at this point and no new system should even consider them. Legacy systems should have implemented hashing algorithm upgrades roughly a decade ago. SHA-1 is cryptographically weakened, but preimage attacks against the hash that beat brute force are still likely more expensive than brute forcing the passphrase for all but the strongest passphrases. Still, given the number of secure alternatives, no new project should deploy SHA-1 at this point.
** NIST recommends 128 bits, although that is likely future proofing. As long as the salt is reasonably random and used on a per-user basis, even a 32 bit salt will prevent the attacker from performing any precomputation or parallel attacks.
*** One problem with SHA-2 and similar algorithms is that they are designed to be very fast. A single high end GPU can perform a billion hashes a second (remember, in Bitcoin "1 GH/s" is 2 billion SHA-256 hashes). This is useful in applications like HMAC where you need to sign every packet individually, which may mean millions (or potentially hundreds of millions) of packets a second. On the other hand, this speed works against password security. Unless your website needs to log in millions of users per second, every second, until the end of time, that high speed offers no advantage, but it does let the attacker attempt a massive number of potential passwords each second. Strong key derivation functions provide a mechanism for increasing the amount of computing resources necessary to compute a single hash. If you make a hash take 1000x as long, it has a negligible impact on a webserver, but it cuts the throughput of an attacker by 1000x. Imagine an attacker with a given set of resources could break a particular passphrase in 9 hours; at 1000x the cost, that becomes roughly a year.
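If you want to see the effect on your own hardware, a quick single-threaded comparison of raw SHA-256 against PBKDF2 with a large iteration count looks like this (exact numbers will vary wildly by machine, and a GPU attacker does far better than one CPU core):

```python
import hashlib
import os
import time

salt = os.urandom(16)
password = b"correct horse battery staple"

# Raw SHA-256: how many candidate passwords per second can one CPU core try?
n = 200_000
start = time.perf_counter()
for i in range(n):
    hashlib.sha256(salt + password + str(i).encode()).digest()
fast_rate = n / (time.perf_counter() - start)

# Key-stretched: each guess now costs 100,000 SHA-256 invocations via PBKDF2.
m = 20
start = time.perf_counter()
for i in range(m):
    hashlib.pbkdf2_hmac("sha256", password + str(i).encode(), salt, 100_000)
slow_rate = m / (time.perf_counter() - start)

print(f"raw SHA-256 guesses/sec: {fast_rate:,.0f}")
print(f"stretched guesses/sec:   {slow_rate:,.2f}")
print(f"slowdown factor:         {fast_rate / slow_rate:,.0f}x")
```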