When a password manager or security tool tells you a password is "weak" or "strong", it's usually measuring entropy. Here's what entropy means, how to calculate it, and why it matters more than complexity requirements.
What is password entropy?
Entropy, in this context, measures the unpredictability of a password — specifically, how much information an attacker would need to guess it. It's measured in bits.
The formula:
entropy = log₂(pool_size) × password_length
Where pool_size is the number of possible characters.
For an 8-character password using only lowercase letters (26 characters):
entropy = log₂(26) × 8 = 4.7 × 8 ≈ 37.6 bits
For an 8-character password using lowercase + uppercase + digits + symbols (95 characters):
entropy = log₂(95) × 8 = 6.57 × 8 ≈ 52.5 bits
Higher entropy = more guesses needed to crack = stronger password.
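The two calculations above are one line of code. A minimal sketch (`passwordEntropy` is a hypothetical helper name, not a standard API):

```javascript
// entropy = log2(poolSize) × length, assuming the password is drawn
// uniformly at random from the full character pool.
function passwordEntropy(poolSize, length) {
  return Math.log2(poolSize) * length;
}

console.log(passwordEntropy(26, 8).toFixed(1)); // lowercase only → "37.6"
console.log(passwordEntropy(95, 8).toFixed(1)); // full 95-character pool → "52.6"
```

(52.6 here versus 52.5 above: the article rounds log₂(95) to 6.57 before multiplying.)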
Why length beats complexity
This is where most password policies get it wrong. Mandatory complexity rules (must include uppercase, lowercase, number, symbol) sound strict but often produce predictable patterns: Password1!, Summer2024!, Company@123.
Compare two passwords:
- P@ssw0rd!: 9 characters, meets most complexity requirements, yet only ~35 bits of effective entropy, because crackers try this exact substitution pattern early
- correct-horse-battery-staple: 28 characters, only lowercase letters and hyphens, ~44 bits of effective entropy as four randomly chosen common words (the original XKCD estimate), and well over 100 bits if an attacker has to treat it as 28 random characters
A random 12-character password using all character types has about 79 bits of entropy. A random 4-word passphrase from a 7,776-word dictionary (like Diceware) has:
log₂(7776) × 4 = 12.9 × 4 ≈ 51.7 bits
...but it's far more memorable.
The cracking time perspective
Entropy translates to guessing time. Modern hardware can test about 100 billion MD5 hashes per second. With 37.6 bits of entropy:
2^37.6 ≈ 200 billion combinations
200 billion / 100 billion per second = 2 seconds
A 37-bit password can be cracked in seconds offline. With 52 bits:
2^52 ≈ 4.5 quadrillion combinations
4.5 quadrillion / 100 billion per second ≈ 12.5 hours
With 80 bits (roughly what a good random 12-character password with full character set gives you):
2^80 ≈ 1.2 × 10^24 combinations
1.2 × 10^24 / 10^11 per second ≈ 1.2 × 10^13 seconds ≈ 380,000 years
At 80 bits, cracking the password by brute force is computationally infeasible with any foreseeable technology.
The practical threshold: aim for at least 80 bits of entropy for passwords you want to be secure long-term. For low-stakes passwords, 50–60 bits is usually fine.
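The arithmetic in this section is the same calculation repeated at different entropies, so it is worth wrapping in a helper (`crackSeconds` is a name I made up for this sketch):

```javascript
// Seconds to exhaust a search space of 2^entropyBits at a given guess rate.
// 1e11/s matches the fast-MD5 figure above; real rates vary by hash and hardware.
function crackSeconds(entropyBits, guessesPerSecond = 1e11) {
  return 2 ** entropyBits / guessesPerSecond;
}

const SECONDS_PER_YEAR = 3.156e7;
console.log(crackSeconds(37.6).toFixed(1), 'seconds');        // ~2 seconds
console.log((crackSeconds(52) / 3600).toFixed(1), 'hours');   // ~12.5 hours
console.log(Math.round(crackSeconds(80) / SECONDS_PER_YEAR)); // ~383,000 years
```

Note these are worst-case figures; on average an attacker finds the password after searching half the space.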
The difference between theoretical and actual entropy
The formula assumes the password is chosen uniformly at random from the full pool. Human-chosen passwords have much lower actual entropy:
- Substitutions are predictable: users replace 'a' with '@', 'o' with '0', 'e' with '3'
- Appending numbers is predictable: password1, password123
- Capitalizing the first letter is the most common pattern
- Adding ! at the end is the second most common appended character
Password cracking tools use these patterns first. A password that looks like it has 60 bits of theoretical entropy may have only 20 bits of practical entropy if it follows predictable rules.
This is why a truly random password generated by a computer (not chosen by a human) is far stronger than a human-chosen password that "looks" complex.
How our password generator measures entropy
The Password Generator shows the entropy of each generated password and a strength indicator:
- Under 50 bits: weak (offline cracking possible)
- 50–60 bits: okay for low-risk accounts
- 60–80 bits: good
- 80+ bits: very strong
The generator uses crypto.getRandomValues() — the Web Cryptography API's cryptographically secure random number generator. This ensures the passwords have the theoretical maximum entropy for their length and character set.
Bcrypt, scrypt, Argon2: slowing down the attacker
Even with a strong password, the cracking speed depends on how the password is stored. With MD5 or SHA-1 (common legacy choices), attackers can test 100 billion guesses per second. With bcrypt (cost factor 12), that drops to about 1,000 guesses per second.
This makes the entropy math dramatically better. At bcrypt cost 12:
- 50 bits of entropy → 2^50 / 1,000 ≈ 1.1 × 10^12 seconds ≈ 36,000 years
Never store passwords with MD5, SHA-1, or plain SHA-256. Use bcrypt, scrypt, or Argon2. These are specifically designed to be slow, resisting bulk cracking even when the hash database is stolen.
Password entropy in practice
For users:
- Use a password manager (LastPass, 1Password, Bitwarden) — it generates and stores truly random passwords
- Aim for at least 16 characters for any important account
- Each password must be unique — reuse compounds the damage from any single breach
For developers:
- Store passwords with bcrypt (cost ≥ 10) or Argon2id
- Don't impose short maximum password lengths (they prevent long passphrases); if you must cap, allow at least 64 characters
- Enforce minimum length (12+ characters) rather than complexity rules
- Check against known breach databases at login (e.g., HaveIBeenPwned API)
Password security is ultimately about making the attacker's job computationally expensive. Entropy tells you how expensive. The combination of high-entropy passwords (generated randomly, not chosen by humans) and slow hash functions (bcrypt, Argon2) is what makes credential cracking economically infeasible, even after a database is stolen.