
In security news, I came across a new term related to hash functions: it is reported that the "in-house hash function" used in the IOTA platform (the Curl-P hash function) is broken. You can find the complete paper introducing the vulnerability here.

But I do not understand: does the term "in-house" refer to a specific type of hash function? And in general, what does "in-house" mean here?

Anders
  • I believe the definition of "in-house hash function" is "Don't use our product, we have no idea what we're doing". (see Steffen's answer for a less cheeky response) – Mike Ounsworth Nov 21 '18 at 14:22
  • It means the same thing that "homeowner wiring" means to electricians. – Eric Lippert Nov 21 '18 at 20:23
  • @EricLippert Be fair. There's less of a chance of a catastrophic fire and loss of life with homeowner wiring. – Nic Nov 21 '18 at 22:03
  • “in-house” is a term which, if heard several times at an interview, causes one to lose the desire to be offered the job. Generally speaking, “in-house” can be seen as the polar opposite of “industry standard”. – Mawg says reinstate Monica Nov 22 '18 at 13:03

3 Answers


From the definition of in-house in the Cambridge Dictionary: "Something that is done in-house is done within an organization or business by its employees rather than by other people".

Here it means developing your own hash algorithm instead of using a public one. Usually that means it is developed by only a few people with limited expertise in the problem area and without any public input. Thus it is very likely that the self-developed algorithm eventually gets broken once experts in cryptography take a closer look at it.

See also Why shouldn't we roll our own? and How valuable is secrecy of an algorithm?.
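To make "use a public one" concrete, here is a minimal sketch using Python's standard hashlib; SHA-256 is chosen only as an arbitrary example of a publicly vetted algorithm, not as a claim about what IOTA should have used:

```python
import hashlib

# Sketch: rely on a publicly reviewed algorithm (SHA-256 here) from the
# standard library instead of writing a hash function yourself.
def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

print(digest(b"hello"))
# 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```

The point is that the vetted option is typically a one-liner, so rolling your own saves no effort.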

Steffen Ullrich
  • Hearing "in-house" with anything security related is always a HUGE red flag. – Marie Nov 21 '18 at 16:35
  • And hearing "in-house" with anything crypto related is an even larger red flag even more often. – Maya Nov 21 '18 at 21:47
  • Also see [Is my developer's home-brew password security right or wrong, and why?](https://security.stackexchange.com/q/25585/46979), although I suspect Dave's algorithm was *much, much* worse than IOTA's. – jpmc26 Nov 21 '18 at 21:49
  • In short, in this context, "in-house" means "trouble"! – Muzer Nov 22 '18 at 09:53

In the context of cryptography "in-house" is a synonym for "questionable origin and unverified strength".

It specifically means that they developed their own hashing function (or in other cases encryption, key-exchange scheme, etc.).

This, in cryptography, is a Bad Idea with capital letters. While developing your own library of common functions or your own webservice framework can have a perfectly good use case, cryptography is one of the fields where a tiny mistake can make the whole thing incredibly fragile in a way that you will never find out. If you build your own webserver ("our high-performance in-house webserver...") and there's a problem, you have a good chance of finding out sooner rather than later, because it crashes, sends the wrong files, or performs badly. But if your crypto algorithm has a flaw that destroys its cryptographic strength, you have to be very lucky that someone who breaks it actually tells you. The people who try to break it are almost certain to be attackers, because very few cryptographers waste their time on some in-house crypto hack; they stick with public algorithms, where finding a weakness actually matters to more than one company.
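To illustrate the "tiny mistake you never find out about" point, here is a deliberately weak toy hash (a hypothetical example, not the actual Curl-P flaw): it produces fixed-size output and passes casual inspection, yet collisions are trivial to construct.

```python
import hashlib

# Hypothetical in-house "hash": XOR all bytes together. At a glance it
# looks plausible (fixed-size output that changes with the input)...
def inhouse_hash(data: bytes) -> int:
    h = 0
    for b in data:
        h ^= b
    return h

# ...but it collides trivially: byte order is ignored, and any repeated
# byte cancels itself out.
assert inhouse_hash(b"ab") == inhouse_hash(b"ba")
assert inhouse_hash(b"aa") == inhouse_hash(b"")

# A public, reviewed algorithm exposes no such obvious structure:
assert hashlib.sha256(b"ab").digest() != hashlib.sha256(b"ba").digest()
```

Nothing in normal operation would ever reveal these collisions; only someone actively trying to break the function would notice, which is exactly the answer's point.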

Tom

I agree with the answer given an hour before this one about in-house meaning, "non-standard and probably not very sophisticated or rugged." There may still be one argument in favor of using an in-house hash. That is, it may be different enough from the standard ones out there that a hacker may decide it is too much work to figure out how to reverse engineer it. Even if you accept this argument, this sort of do-it-yourself approach should only ever be used to protect very low-value data.

  • This line of reasoning - relying on an attacker not knowing the implementation details and hoping that they won't find out - is named "Security by Obscurity". Since this is only a slowdown, rarely a real barrier, it is generally frowned upon and strongly recommended against. One area where it still does make sense is if your assets drop rapidly in worth in a short time (days to months) and an attack after one year is much less of an impact. Most game and movie DRM / copy protection schemes fall under this category. – Zefiro Nov 21 '18 at 18:21
  • It's almost a trope at this point that whenever a "why is X bad?" question is posted, there will be an answer that says, paraphrasing, "well it's not *that* bad, because at least the attackers won't know how your version works". In other words, an appeal to Security by Obscurity. And yes, I get that this answer doesn't reject the premise, but others have done. – Tom W Nov 21 '18 at 19:04
  • @Zefiro: There is a difference between "hoping that an attacker doesn't find out", and considering the cost/benefit ratios for possible attacks. To be sure, the risk of a popular encryption or hashing scheme being defeated may be slight, despite the level of research into attacks on them, but the probability of a scheme no prospective attackers would care about being defeated might be even lower. – supercat Nov 21 '18 at 19:52
  • @Zefiro This reasoning is not about an attacker not knowing the implementation details. I think you need to read it again. It's about an attacker not having sufficient motivation to bother trying to break the algorithm. (It's still wrong, but not because it's security by obscurity, because it's not security at all. The problem is that you're very likely to drastically overestimate how much security you actually have, and that's exactly what happened here.) – David Schwartz Nov 21 '18 at 20:14
  • There is one argument in favor of using an in-house hash function, and it has nothing to do with stopping hackers: if you're trying to protect against random data corruption, you can tune it to your specific needs (eg. detecting paired bit-flips, or fitting into the 37 unused bits of your data file, or whatever). Since your adversary in this situation is unintelligent, it's not hard to be smarter than them. – Mark Nov 21 '18 at 21:36
  • @supercat The problem is that normally, it's _just_ security by obscurity. If you, say, shuffle an otherwise secure hash before storing (and unshuffle on retrieval and comparison), sure, that does extremely little for security, but it doesn't _harm_ it. If _all_ that's done is the shuffle, that's... very bad. And it's the latter case that's normally done, not the former. So people say "don't ever use it", because that's a good rule of thumb, and once you're experienced enough to know the difference, you're experienced enough to know when to ignore rules of thumb. – Nic Nov 21 '18 at 22:06
  • While I agree that you don't always have to follow recommendations like dogma, for this case it would be silly to spend development time to create a custom hash when there are so many (exotic) hash functions out there, like Scrypt or Whirlpool. It would be like a carpenter making their own hammer instead of buying one. – Chloe Nov 22 '18 at 00:44
  • @Chloe yes, can you imagine this being put in a real-world scenario? An electrician who comes to fix your wiring and has made his very own wires and uses some other units instead of watts, amps and volts? Or a building company that uses their own proprietary bolts, fixings, and concrete substitute with unknown properties. All you know is that they've been researched and developed by the construction workers. I'm now amused by the thought of a plumber who has custom tools that only work with his own custom pipes. – VLAZ Nov 22 '18 at 09:37
  • Consider this article: https://www.schneier.com/essays/archives/1999/11/dvd_encryption_broke.html The DVD industry surely had good general-purpose software engineers. They rolled their own crypto, and it was broken within a day by 2 independent groups. That crypto scheme was in hardware, unchanging, which is why ripped DVDs are all over the internet and probably always will be, or at least as long as some people still use DVDs. Never roll your own if security matters to you. – WDS Nov 22 '18 at 16:42
  • @NicHartley Except, even if harmless in the abstract, in reality there are many risks such a "harmless" addition adds. There is the possibility of bugs, exploits, or side-channels in the additional code. It also increases the computational cost which can increase the ease of denial of service. Then there are costs to developing this extra code as well as verifying that it actually is harmless even if the code is completely correct. – Derek Elkins left SE Nov 22 '18 at 19:04
  • In this case, there's a direct financial incentive to work on the problem (it's a cryptocurrency, so that's where the money is), which means it probably won't be "too much work" for someone to spend time trying to attack it. Security through obscurity isn't inherently the worst possible thing in the world (that's a far cry from being good) if something is truly obscure and will always be obscure, but a system that can straight-up pay you money if you compromise it isn't going to be obscure for long. – Zach Lipton Nov 22 '18 at 20:36