25

Quoting from the CompTIA Security+ guide:

The first factor of authentication (something you know, such as a password or PIN) is the weakest factor.

Why? It makes sense to say that humans/users are the weakest factor in any system from a security point of view, since we humans forget, make mistakes, and break easily. But it makes no sense (to me at least) that getting kidnapped and tortured (in order to give up my password) is more likely to happen than me losing a smart card or a key fob?

[Screenshot of the guide excerpt; the boxout goes on to discuss how to make passwords stronger.]

Ulkoma
  • 8,793
  • 16
  • 66
  • 95
  • Can you link to the guide? – RoraΖ Oct 08 '14 at 15:18
  • http://www.amazon.co.uk/CompTIA-Security-Certified-Ahead-SY0-301/dp/1463762364 – Ulkoma Oct 08 '14 at 15:18
  • Maybe by weakest they mean volatile? – RoraΖ Oct 08 '14 at 15:24
  • I think you just technically said you'd give up your "something you know" before being kidnapped and tortured. So, in your case, it would be the weakest factor. – sup Oct 08 '14 at 17:19
  • Not true, I would not reveal the information till you pull a knife on me – Ulkoma Oct 08 '14 at 18:45
  • 1
    ...which is likely to happen /before/ being kidnapped and tortured. – user2338816 Oct 08 '14 at 23:45
  • It's weakest because people regularly **choose** weak passwords. The image shows continuation that discusses how to reduce that weakness (increase strength). – user2338816 Oct 08 '14 at 23:48
  • Because it can be "captured" and replayed without your knowledge, whereas a good NFC-based token isn't replayable (you can't copy it, you need to physically steal it just like a smartcard). –  Oct 09 '14 at 08:21
  • 1
    I think the last sentence of the boxout is bunkum - see http://security.stackexchange.com/a/4705/9829 – naught101 Oct 09 '14 at 10:03
  • "Users should be forced..." This does not sound wise. Also, the first point isn't great advice either. (http://xkcd.com/936/:) – Ajasja Oct 10 '14 at 20:05
  • The second point is really, really, REALLY bad. – Malavos Aug 20 '15 at 18:36

5 Answers

43

In the typical case, something you are and something you have can only be true for one person at a time. If you lose your token, you know you have lost it.

Something you know can be copied by someone without your knowledge. If someone has your password, you may not be able to tell that they are actively exploiting that knowledge.

That is one reason to change your password regularly. It shortens the window where a password breach could be exploited.

Gene Gotimer
  • 1,455
  • 11
  • 11
  • 15
    Something you know is also the easiest to guess or otherwise obtain. Users can still make weak passwords and reuse them or variations on them. – Paraplastic2 Oct 08 '14 at 15:38
  • 3
    Something you know can be attacked at any time by anyone anywhere in the world. The attacker may or may not need to involve you directly when they obtain this information. On the other hand, something you own can be attacked only from one location on the entire planet, and you would need to make a significant mistake or suffer tremendous misfortune for such an attack to be successful. And of course, combining both forms multiplies the two probabilities, arriving at an even smaller overall probability of victimization. – Keen Oct 08 '14 at 16:52
  • 4
    Also, "something you know" can't be taken away. When you want to revoke access to someone, you can take away a key, but you can't force someone to forget the hardcoded master password. – Philipp Oct 08 '14 at 17:50
  • 2
    It shouldn't be forgotten, though, that it is possible to reproduce fingerprints (something you have/are). In the case of the iPhone fingerprint reader it has been proven that a weak implementation (a weak fingerprint reader) is not really stronger than a password on its own. In my book, passwords are the best solution for humans to use and should be complemented by a second factor whenever possible. Ranking different factors against each other seems beside the point to me... –  Oct 08 '14 at 19:32
  • 1
    The key point is "without your knowledge". You know if someone steals your RSA keyfob or cellphone or one-time-use-pad — you don't know if someone steals your password or PIN. – Greenstone Walker Oct 08 '14 at 20:18
  • "it is possible to reproduce fingerprints" @SebastianB. that's only the problem with implementation. Weak fingerprint reader is like a password box that only cares for first N characters (yes, I encountered this once...). Good reader also measures blood flow and temperature to make sure it's a living finger, and performs few more tests. – Mołot Oct 09 '14 at 07:31
  • The tests you speak of are easily bypassed by placing the fake fingerprint on a real finger, which conveniently for the attacker is a lot easier to do than making a fake finger. – Bruno Rohée Oct 09 '14 at 10:47
  • +1 for "Something you know can be copied by someone without your knowledge.". Information is volatile. @Mołot Amazon had that once. Only checked the first ten characters or so. – dom0 Oct 09 '14 at 22:58
  • @GreenstoneWalker what about stealing token seeds? Either when the tokens (seeds) are still in transit to the customer, or when, say, malware on your smartphone lets me steal your token seed from your OTP "app". That would have the same effect as if I got my hands on your passwords. You wouldn't notice it. –  Oct 10 '14 at 08:09
25

Passwords, or more generally something you know, are often relatively weak, because users cannot remember high-entropy secrets. As a result, passwords (or anything you need to memorize) usually end up being low-entropy secrets, which enables random guessing, offline dictionary search, and other attacks. While it's possible to create and remember a pretty good password, experience shows that users don't -- and that it is probably unreasonable to expect users to do so.
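
To put rough numbers on this, here is an illustrative and deliberately simplified comparison; the entropy figures and the guessing rate are assumptions, not measurements:

```python
import math

# Illustrative comparison (the figures are assumptions, not measurements):
# a randomly generated password versus a typical human-chosen one, against
# an offline guessing attack.
def entropy_bits(alphabet_size: int, length: int) -> float:
    """Entropy of a secret chosen uniformly at random."""
    return length * math.log2(alphabet_size)

random_pw_bits = entropy_bits(95, 10)   # 10 random printable ASCII chars, ~66 bits
human_pw_bits = math.log2(1_000_000)    # drawn from ~1M common choices, ~20 bits

guesses_per_second = 10e9               # assumed offline rate against a fast hash
seconds_per_year = 3600 * 24 * 365

print(f"random:  ~{2 ** random_pw_bits / guesses_per_second / seconds_per_year:,.0f} years to exhaust")
print(f"typical: ~{2 ** human_pw_bits / guesses_per_second:.4f} seconds to exhaust")
```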

There is a tremendous amount of academic research and practical experience that backs up this statement. Here are some example references:

In addition, any secret you know can potentially be phished (i.e., someone might be able to social-engineer you into revealing it).

Remember the classic statement:

Humans are incapable of securely storing high-quality cryptographic keys, and they have unacceptably slow speed and accuracy when performing cryptographic operations. (They are also large, expensive to maintain, difficult to manage, and they pollute the environment. It is astonishing that these devices continue to be manufactured and deployed. But they are sufficiently pervasive that we must design our protocols around their limitations.)

Charlie Kaufman, Radia Perlman, Mike Speciner, Network Security: Private Communication in a Public World.

At this point you might be wondering: given that passwords have so many issues, why do we still use them? If so, I recommend you take a look at this question: Why do we even use passwords / passphrases next to biometrics?

D.W.
  • 98,860
  • 33
  • 271
  • 588
  • 1
    "It is astonishing that these devices continue to be manufactured and deployed." They're purpose generators. We can't get any utility out of our other machines without these components. But yes, they *are* terribly inefficient. I hope someone is working on a replacement for this outdated technology. – Keen Oct 09 '14 at 16:43
  • Strange that these fallible "humans" tend to produce devices that are so unlike themselves. Then they struggle to work with them! Perhaps they should have stayed with producing only other humans. But they are problematic to themselves also... –  Oct 13 '14 at 18:19
8

It is possible to have a "something you know" which you cannot be forced to disclose. I read about a secure login system that presents a grid of 12 to 15 photos of faces, and you have about 3 seconds to touch the 3 or 4 that you have seen before. For this to work, there must be a database of many thousands of photos, and you train on hundreds of them. The system knows which ones you have trained on. (You give a user name first, which is assumed to be "public" - not secure.)

You cannot possibly convey this info to another person, and success in one login conveys nothing useful for another. For "something you know" to be effective, we should simply use aspects of human nature that work well. Most people can recognize previously seen face photos - it is built in.
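
As a back-of-the-envelope sketch of the raw guessing resistance, taking the upper end of the figures above (a 15-face grid from which 4 trained faces must be selected; the parameters and attempt rates are assumptions, not a real product's values):

```python
from math import comb

# Rough guessing resistance of the face-grid scheme, assuming a grid of
# 15 faces of which exactly 4 trained faces must be selected. These
# parameters and attempt rates are illustrative assumptions.
grid_size, required = 15, 4
selections = comb(grid_size, required)   # 1365 possible 4-face choices
p_per_attempt = 1 / selections

def p_success(attempts: int) -> float:
    """Chance that purely random guessing succeeds at least once."""
    return 1 - (1 - p_per_attempt) ** attempts

print(selections)          # 1365
print(p_success(3))        # ~0.002  (three tries before a lockout)
print(p_success(365))      # ~0.23   (one careful try per day for a year)
```

This ignores the much stronger attacker discussed in the comments, who collects or memorizes the displayed faces over repeated attempts.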

  • What do you mean you can't convey this info? The attacker just needs to point at a face and ask "is this one of your faces?" I think you mean it would be difficult to verbally describe a particular face with enough detail that an attacker could identify your choices. – PwdRsch Oct 08 '14 at 18:22
  • 1
    @PwdRsch: You would need to train the attacker so that they could answer the challenge in 3 seconds when it was presented. For that, the attacker would need the entire database of faces. New faces could constantly be added with no difficulty, confounding any efforts to "teach" it to someone else. Similarly, faces that you know could be removed. If you "disappeared" (presumably you are actually performing work if you have such a high-security login) then your entire user ID and set of faces could be marked as compromised, aiding the search for you. It is simply impractical. –  Oct 08 '14 at 18:39
  • 3
    It seems interesting, but I don't understand. The database has many thousands of faces and I train on hundreds of them. So I am supposed to remember several hundred faces and if I forget a face I can't login (or at least I have to retry). – emory Oct 09 '14 at 00:35
  • 3
    Assuming you are presented with 15 faces and must choose the 4 you know, that the subject logs in almost every day, and that the system locks up after 3 failed attempts: a patient attacker could reasonably expect to make the first unauthorized login through daily random tries in about 4 years. If the attacker is collecting the faces, then each login attempt (whether failed or successful) provides the attacker with useful information for the next login attempt. It is viable if there is a lockout for repeated failure, and new faces would need to be added at least annually. – emory Oct 09 '14 at 00:58
  • 1
    Curious concept, though as @emory stated, all an attacker needs to do is remember the faces presented, and after a while the most commonly occurring ones become candidates for a very successful attack – Tobias Kienzler Oct 09 '14 at 09:27
  • One built-in limitation of the system is that after one has seen a face in a login attempt, it might begin to seem familiar (people are not always good at discerning context of face memory - eg: "photo I trained on" vs face simply seen repeatedly some other way) so the knowledge "degrades". My point was only that a system HAS been built where the "something you know" is not really steal-able or discloseable as such. If one such system is possible, so are others. I recognized a restaurant I had not been to in 30 years from the smell of a piece of food. People can recognize loved-ones by smell. –  Oct 09 '14 at 13:03
  • 2
    @NoComprende I think a critical problem is that it is hard for me to validate what you know without me also knowing it. With the faces, the database knows which faces you have trained on. When the attacker manages to steal the database, the system is critically compromised. In classical password authentication, the system does not know your password. – emory Oct 09 '14 at 13:24
  • 2
    Maybe we should add a security category: Something Nobody Knows : ) Totally airtight. –  Oct 09 '14 at 13:31
8

This is the result of the excellent marketing done by biometric authentication vendors.

"Something you are" is sometimes very easy for an attacker to reproduce; fingerprints and voice are especially easy to obtain, and people have no credible strategy for avoiding that (wearing gloves at all times and not speaking in public is not practical).

Most of us likely leave dozens of exploitable fingerprints every day; I for one don't say my password out loud nearly that often.

"Something you have" is not without faults either, and requires a great deal of user education to be used properly. E.g. in any company deploying RSA SecurID, a tour of the office will reveal many tokens lying on desks with the code visible. I have even seen people carrying them around their necks. Also, the disappearance of an authentication token may not be noticed until it is needed.

Bruno Rohée
  • 5,351
  • 28
  • 39
  • "something you are" is however more complex to reproduce than a mere password / PIN, and depending on its nature potentially volatile in existence. And there is of course also the "something you have", a difficult-to-clone piece of hardware – Tobias Kienzler Oct 09 '14 at 09:28
  • In the case of fingerprints they are trivial to reproduce, as in, kid workshop trivial. The gummy bear paper (http://cryptome.org/gummy.htm) is now 12 years old and should be common knowledge by now. Most other biometric technologies can be fooled too. – Bruno Rohée Oct 09 '14 at 10:24
  • 1
    At a certain point it might however be easier to simply bypass the scanning mechanism itself. Nonetheless I agree that biometric isn't something one should rely on (at least not exclusively) – Tobias Kienzler Oct 09 '14 at 10:36
  • 7
    Biometrics are identification, not authentication. That the conjunction of good marketing and "innocence" of some decision makers led to biometrics being used to authenticate people is really a failure of our industry. – Bruno Rohée Oct 09 '14 at 10:43
  • 100% agreed upon – Tobias Kienzler Oct 09 '14 at 10:45
  • The "something you are" is often secure if one is trying to authenticate oneself directly. Unfortunately, in most real-world applications it's not possible for the entity that knows the authorized person to examine a person who is seeking access; instead, it's necessary for the entity to rely upon third-party descriptions of the person seeking access. – supercat Oct 09 '14 at 19:41
1

The biggest (and only) problem of password-based authentication is that its strength is set by the user, and there is a strength-usability tradeoff. In theory, passwords have lots of great properties for a user: they are hard to obtain from third parties by force, you don't leak them everywhere like biometric data, they can be stored as hashes unlike biometric data, users can stay anonymous to the service (if you share your mobile phone number for a second factor or give them your fingerprints, you lose that), and users can tell the password to other people and share access. Try that with biometrics, or key fobs. So passwords offer more control to the user, which can be an advantage, but most of the time is a disadvantage.
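
As a minimal sketch of the "stored as hashes" point (the salt size, iteration count and hash choice here are illustrative assumptions, not a recommendation), a service only ever needs to keep a salted, slow hash of the password:

```python
import hashlib, hmac, os

# Minimal sketch of salted password hashing: the service stores only
# (salt, digest), never the plaintext. Salt size, iteration count and
# hash choice are illustrative assumptions.
def store(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = store("hunter2")
print(verify("hunter2", salt, digest))   # True
print(verify("hunter3", salt, digest))   # False
```

Biometric readings, by contrast, never match exactly, so they cannot be reduced to a simple hash comparison like this.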

Then there are problems in implementation. First, you need to set a completely distinct password for every place where you have to use a password, because most of the time the other party gets your password in plaintext. When you use a strong authentication scheme like SCRAM-SHA-1, you never transmit the password to any third party, and you can use a scheme like "strong password" + "website name" for your password. In practice, however, it is very hard for a user to tell these two kinds of password usage apart. You can build a small device that users enter their passwords into, which guarantees that the password never leaves it, but the computer can still spoof the user into thinking they have to enter the password on the computer itself. There it is again: the strength relies on the user.
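
A minimal sketch of the "strong password" + "website name" idea, assuming the combination is run through a slow key-derivation function so that each site sees a distinct derived password and never the master secret (the label, iteration count and output length are arbitrary illustrative choices):

```python
import base64
import hashlib

# One possible way to realize the idea above: derive a distinct per-site
# password from a single strong master secret. The label, iteration count
# and output length are arbitrary illustrative choices.
def site_password(master_secret: str, site: str) -> str:
    raw = hashlib.pbkdf2_hmac(
        "sha256",
        master_secret.encode("utf-8"),
        ("site-password:" + site).encode("utf-8"),  # the site name acts as the salt
        100_000,
        dklen=18,
    )
    return base64.urlsafe_b64encode(raw).decode("ascii")

# Each site sees a different derived password; none of them sees the master secret.
print(site_password("correct horse battery staple", "example.com"))
print(site_password("correct horse battery staple", "example.org"))
```

A compromise of one site's derived password then tells the attacker nothing useful about the master secret or the passwords used elsewhere, as long as the master secret itself is strong.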

user10008
  • 4,355
  • 21
  • 33
  • 1
    Anonymity wouldn't have to be any worse with a key fob than with passwords. If a user can select a desired initialization vector for whatever sort of hashing algorithm the fob is using, the same fob could serve any number of different accounts without the generated codes being recognizable as having come from a single fob. Further, suitably-designed key fobs could offer more security with regard to passwords, since it would be possible to have one key fob give another key fob information it could use to say "I am Y. Here is proof that X said Y should have access to X's account until May 3". – supercat Oct 10 '14 at 17:52
  • I wish more systems would provide a means by which a user "fred" could easily ask for the creation of login ids "fred.1", "fred.2", etc. which would have some specified subsets of abilities and independent passwords, and which could easily be enumerated by, and individually revoked by, the owner of account "fred". – supercat Oct 10 '14 at 17:55