11

Entropy is a term used often in relation to password security and brute-force attacks, but it is a topic that can get complicated quickly. What is the best way to describe password entropy (what it is and how it's calculated) in terms a layman can understand?

jrdioko
  • 13,071
  • 7
  • 30
  • 38
  • You could take a cue from a previous answer of mine: [Should passwords be truly random?](http://security.stackexchange.com/questions/6497/6504#6504) – M'vy Aug 29 '11 at 23:26

4 Answers

22

Not sure if this can be of any help to you, but I once managed to describe entropy to a child.

After I said (to a group of people) that entropy is a measure of the chaos in a system, a boy of about 12 said he didn't understand me. I replied with: "Well, when your room is untidy, entropy is high. But when you clean your room, entropy is low - everything is in order then. So, when a thief comes into your room trying to steal your homework, if the room is clean and entropy is low, he will easily find it - it's usually on your desk or in your school bag. On the other hand, when your room is untidy and your homework is lying around somewhere, the thief doesn't know exactly where it is and can't find it quickly. If entropy is high enough (let's say the roof collapsed), it's almost impossible to find a piece of paper at all."

High entropy - finding a needle in a haystack.

In the game "Guess Who", at the beginning entropy is very high, but after a few questions it gets lower and lower, until someone has enough information to guess who the person is (or, in security, until enough trials (and errors) reveal what the password is).
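
As a small sketch of that last point (my own addition, assuming a standard board of 24 faces): each ideal yes/no question halves the remaining candidates, so the number of questions you need is about log2 of the starting count - which is the entropy of the position, in bits.

```python
import math

# Toy illustration: each perfect yes/no question halves the field of
# candidates, so the questions needed are roughly log2(candidates).
candidates = 24   # assumed board size for the guessing game above
questions = 0
while candidates > 1:
    candidates = math.ceil(candidates / 2)  # an ideal question halves the field
    questions += 1

print(f"{questions} questions ≈ {math.log2(24):.1f} bits of starting entropy")
# -> 5 questions ≈ 4.6 bits of starting entropy
```

The same arithmetic applies to passwords: every extra bit of entropy doubles the number of guesses an attacker needs.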

StupidOne
  • 2,812
  • 22
  • 35
  • 11
    Huh, you were arguing to a 12-year old that cleaning the room shouldn't be done? – Paŭlo Ebermann Aug 29 '11 at 20:09
  • 1
    Well, when you want to explain some complex concept to a kid, you have to use something (s)he is familiar with. At that time, I couldn't think of any better example. But anyway, it was that or Shannon... I took my chances with cleaning the room. – StupidOne Aug 29 '11 at 20:16
  • 10
    I just would have left out the thief, and used "when you want to find your homework". That way low entropy is good :-p – Paŭlo Ebermann Aug 29 '11 at 20:19
  • 1
    I think the argument is that if you keep your room tidy, it is more plausible that your homework was, in fact, stolen. – tdammers Oct 16 '11 at 13:27
  • I would like to add to this analogy that entropy says something about how probable a state is. ("Wahrscheinlich" captures it better, but there is no exact English equivalent.) When cleaned, the room has low entropy, so if the said 12-year-old takes a book, reads it, and puts it down next to him, the probability that he puts it back in the correct place is extremely small. There are more places where it doesn't belong than where it belongs. Thus the entropy of the place will keep increasing through normal day-to-day living, until you put some energy into it to lower it again. – Edgar Klerks Nov 23 '15 at 12:32
7

I would skip the idea of entropy entirely and just talk about how hard it would be to guess the password. Basically, tell him that password entropy is a measure of roughly how many passwords a person would have to guess before they hit on yours.

A password like "password" or "123456" would practically be someone's first guess. A password like "6love" might be in their first few thousand guesses. A password like "1974!jhT" would take billions of guesses.
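
To put rough numbers on that (the guess counts below are illustrative assumptions, not measurements): entropy in bits is simply log2 of how many guesses are needed, so each extra bit doubles the work.

```python
import math

# Illustrative guess counts (assumed, not measured) for the example passwords;
# entropy in bits is log2 of the number of guesses an attacker needs.
guesses_needed = {
    "password": 10,           # practically a first guess
    "6love": 5_000,           # within the first few thousand guesses
    "1974!jhT": 5 * 10**12,   # billions upon billions of guesses
}

for pw, guesses in guesses_needed.items():
    print(f"{pw!r}: ~{math.log2(guesses):.1f} bits")
# 'password': ~3.3 bits, '6love': ~12.3 bits, '1974!jhT': ~42.2 bits
```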

David Schwartz
  • 4,233
  • 24
  • 21
7

The layman's part comes later, but first, let's get scientific.

I struggled to understand the concept of mathematical entropy, but, luckily for me, I work with an engineer. When I asked him to explain it, he directed me towards the graphs of two distributions: the uniform distribution and the normal distribution.

Knowing that the Y-axis describes the probability of guessing the password, and the X-axis describes what the password's value is... in a uniform distribution (see diagram), the entropy is high, because no matter what the password is, the probability of guessing it is the same (statistically the same chance). In a normal distribution (see diagram), the probability of guessing the password changes... for instance, roughly 68% of the values fall within one standard deviation of the mean, so a guesser who concentrates on that region is likely to succeed (sort of).

For the layman.... Where's Waldo/Wally?

High entropy:

  • You're trying to find Waldo by identifying what he's wearing.
  • You are given 100 people.
  • 100 of these people are dressed like Waldo, including Waldo.
  • Where is Waldo?

∴ You have very little chance of identifying Waldo.

Low entropy:

  • You're trying to find Waldo by identifying what he's wearing.
  • You are given 100 people.
  • 33 of these people are dressed like Mark, 33 are dressed like John, 33 are dressed like Tom, and 1 is dressed like Waldo (and is Waldo).
  • Where is Waldo?

∴ You have a very high chance of identifying Waldo.
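
As a sketch of the two crowds above (my own addition, using the standard Shannon formula):

```python
import math

# Shannon entropy of "which of the 100 people is Waldo?":
#   H = sum(p * log2(1/p)) over the candidates' probabilities.
def shannon_entropy(probabilities):
    return sum(p * math.log2(1 / p) for p in probabilities if p > 0)

# High entropy: everyone is dressed like Waldo, so all 100 are equally likely.
high = shannon_entropy([1 / 100] * 100)

# Low entropy: the outfit clue singles Waldo out with certainty.
low = shannon_entropy([1.0])

print(f"high-entropy crowd: {high:.2f} bits")  # 6.64 bits of uncertainty
print(f"low-entropy crowd:  {low:.2f} bits")   # 0.00 bits -- the clue gives him away
```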

brandeded
  • 333
  • 2
  • 9
5

For a password, "entropy" is related to "guessability". Low entropy is easier to guess, high entropy is harder to guess.

The primary aspect is randomness. A non-random password, like "secret", is easily guessed. A random password, like "8jh$#F", is harder to guess.

Length plays a part, too. "secret of a lifetime man" is not much more random than "secret", but the additional length means it is harder for a brute-force attack to crack, and therefore it has better "entropy".

So, to a layman: A good password has high entropy, which means it is hard to guess, uses multiple character types (upper-lower-number-symbol), and the longer the better.
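
If you want an actual number, a common rule-of-thumb estimate (my sketch here, not how real cracking tools model passwords) is length times log2 of the apparent character pool - it rewards both extra character types and extra length:

```python
import math
import string

# Rule-of-thumb estimate (an upper bound, assuming characters were chosen at
# random): entropy ≈ length * log2(size of the character pool used).
def naive_entropy_estimate(password: str) -> float:
    pool = 0
    if any(c.islower() for c in password):
        pool += 26
    if any(c.isupper() for c in password):
        pool += 26
    if any(c.isdigit() for c in password):
        pool += 10
    if any(c in string.punctuation + " " for c in password):
        pool += 33   # symbols and spaces
    return len(password) * math.log2(pool)

for pw in ["secret", "8jh$#F", "secret of a lifetime man"]:
    print(f"{pw!r}: ~{naive_entropy_estimate(pw):.0f} bits")
# 'secret': ~28 bits, '8jh$#F': ~39 bits, 'secret of a lifetime man': ~141 bits
```

The long passphrase scores highest under this naive model even though its words are not very random; a dictionary-aware attacker would value it much lower, which is why the estimate is only an upper bound.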

gowenfawr
  • 72,355
  • 17
  • 162
  • 199