
I read today about the CCleaner hack and how code was injected into their binary. People were able to download and install the compromised software before the company had noticed.

Isn't this what digital signatures are for? Would signing the binary or providing a checksum have done anything to prevent this?

To add to the confusion, in this Reuters news article a researcher claims they did have a digital signature:

“There is nothing a user could have noticed,” Williams said, noting that the optimization software had a proper digital certificate, which means that other computers automatically trust the program.

How could the OS agree to install software with an invalid signature? Or can an attacker change the binary and forge the signature?

Stevoisiak
    I think you answered your own question in your quote "*had a proper digital certificate*" - it wasn't invalid. – Chenmunka Sep 18 '17 at 13:54
    @Chenmunka But how does tampering with the binary not cause it to become invalid? Could the hacker have made a new signature? –  Sep 18 '17 at 13:56
    @blackbird It depends on what the actual attack vector was. If they managed to get the malware into the code repository, it is (kind of) understandable that no one noticed immediately. – glglgl Sep 18 '17 at 14:37

2 Answers


Based on the incomplete details that have been released so far, the malicious code was inserted before compilation and signing (e.g. on a developer's machine, or on a build server). As a result, the compromised version was signed by exactly the same processes as would be used by the uncompromised version. The flaw was introduced before the signing of the binary took place.

Similarly, a checksum would have been calculated based on the results of the compilation, by which point, the malicious code was already present.

This is a weak point in all signing architectures - if the process before the signature is compromised, there is no real way to detect it. That doesn't mean signatures are unhelpful - if the attackers hadn't gained access to the systems until after the signature had been applied, the tampering would have been detected easily, since the signature wouldn't have matched.
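As a rough illustration of why the timing matters (a toy sketch using a checksum stand-in for a signature, not CCleaner's actual build process), note that a hash only covers the bytes it was computed over - tampering after hashing is caught, tampering before is invisible:

```python
import hashlib

def sha256(data: bytes) -> str:
    """Return the hex SHA-256 digest of the given bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical "clean" build output.
clean_binary = b"legitimate program code"

# Case 1: attacker modifies the binary AFTER the checksum is published.
published_hash = sha256(clean_binary)
tampered_after = clean_binary + b" + malware"
print(sha256(tampered_after) == published_hash)   # False: mismatch detected

# Case 2: attacker injects code BEFORE the build is hashed
# (as apparently happened here): the published hash covers the malware too.
compromised_binary = b"legitimate program code + malware"
published_hash = sha256(compromised_binary)
print(sha256(compromised_binary) == published_hash)  # True: nothing to detect
```

A real code signature works the same way in this respect: it attests to whatever bytes were presented at signing time, so malicious code inserted before that point is signed along with everything else.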

Matthew
    This is not a weak point of the signing architecture; the signing architecture is good and works as intended: it tells you that the binary comes from a defined company. – Guillaume Sep 18 '17 at 14:17
    @Guillaume Perhaps better to say a "limitation of all signing architectures" – Ben Aaronson Sep 18 '17 at 14:54
    For a signature to do much good, you must be able to trust the signer completely. In this case, the signer itself was compromised. +1 – jpmc26 Sep 18 '17 at 20:49
  • Are there any good cross-platform build systems which are designed to ensure that identical source files will yield identical executables, regardless of the platform used for the build? I would think that building a program twice on two independent platforms, and having the machine that signs the code receive a copy built on one machine, and having the machine that distributes the code receive a copy built on the other, would greatly reduce the risks of undetected infection on a build machine. – supercat Sep 19 '17 at 14:46
    @Guillaume I think the terminology is appropriate. Yes, the signing architecture is good and works as intended; nevertheless, there are attack vectors that it can't protect against. Those are, by definition, weak points of the signing architecture. – David Z Sep 19 '17 at 15:34
  • "Limitation" may or may not be slightly better wording, now that it's actually been mentioned; but "weak point" is still pretty valid, since that only really means "weakness", not "fault" or "error". – Panzercrisis Sep 19 '17 at 18:07
  • @supercat Comments are not really the appropriate place for a followup question, instead ask a new question (if you can get it focused enough & on-topic here). But getting reproducible builds on identical toolchains is hard enough, different (let alone independent) toolchains would be nearly impossible. – derobert Sep 20 '17 at 07:41
  • @StekDobbel This isn't about "the end user's options", or "blame", it's about "can we design a system that protects against certain attacks". The signing architecture is intended to protect against insertion of malicious code, but has a weakness that you have to trust the signing server and its input. Since this is an inevitable weak point of the architecture, acknowledging it actually *reduces* blame on the end user: this was not a case of improperly implementing that security measure (although they may well have improperly implemented something else which let the attacker modify the build.) – IMSoP Sep 20 '17 at 10:59

Being signed by a trusted cert and having a public hash/checksum of the code are different things.

The cert will (should) tell you the software is from a trusted source, but that is it.

A hash or checksum will let you verify that the binary matches the originally computed hash. In this case, however, the hash was computed while the malicious code was already present in the source code, rendering this particular safeguard useless; if anything, it adds a false sense of security.

When offering a public hash or checksum (assuming the source code has not been compromised), you should take measures to have that hash served from a 3rd party domain. Imagine a situation where an attacker compromises your site, swaps a genuine binary for a malicious one, then changes the public hash you are displaying on your site. At least if it is served from a 3rd party the attacker would also have to compromise that 3rd party to change the hash.
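The check described above can be sketched as follows (a minimal illustration with made-up values; in practice the hash would be fetched over HTTPS from the independent domain):

```python
import hashlib

def verify_download(binary: bytes, expected_hash: str) -> bool:
    """Compare the SHA-256 of a downloaded binary against a hash
    obtained from an independent source (e.g. a third-party domain)."""
    return hashlib.sha256(binary).hexdigest() == expected_hash

# Hypothetical downloaded installer and its independently published hash.
download = b"installer bytes"
third_party_hash = hashlib.sha256(b"installer bytes").hexdigest()

print(verify_download(download, third_party_hash))         # True: genuine
print(verify_download(download + b"!", third_party_hash))  # False: swapped binary
```

The value of the third-party copy is only that an attacker who controls your site cannot silently update both the binary and its published hash in one place; it does nothing if the hash was computed over an already-compromised build.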

Remember, though, that this particular attack was apparently the consequence of a hacked developer machine, so none of these hash/checksum or code-signing controls would have been any use anyway.

TrickyDupes
    This helpful question describes hashes and why they don't usually provide any security: https://security.stackexchange.com/questions/33154/what-security-purpose-do-hashes-of-files-serve – Cody P Sep 18 '17 at 21:40
    The compromise happened *before* the software was published. Thus, even an entirely intact hash of the software as it existed at time of publication would have done no good at all. – Charles Duffy Sep 18 '17 at 21:43
    The distinction between a hash and a signature isn't really relevant here - both are a record of some party (the holder of the private key, or the host of the site listing the hash) receiving a copy of the installer which they believed to be "correct". The only real difference is that with a hash you trust the medium by which the hash is transmitted to you (which probably in turn relies on trusting a particular SSL certificate), and with a certificate, you trust the medium by which some public key is transmitted to you. Both are useless if the signer was tricked into signing compromised code. – IMSoP Sep 19 '17 at 10:26
  • Hence this comment "This particular attack was apparently the consequence of a hacked developer machine, so none of these technical controls would have been any use anyway." – TrickyDupes Sep 19 '17 at 10:27
  • But in that case, what's the point of the answer? How does an attack that wouldn't have been prevented by any kind of hash or signature "highlight the importance of offering a proper hash sig"? – IMSoP Sep 19 '17 at 10:36
  • That is valid. I will edit the context. ;-) – TrickyDupes Sep 19 '17 at 10:42
    "_The cert will (should) tell you the software is from a trusted source but that is it_" It also tells you that it hasn't been tampered with since signing (although in this case it appears the tampering happened _before_ signing). – TripeHound Sep 19 '17 at 11:27