Teaching Sec503, our intrusion detection class, last week, we yet again wrote a signature for a CVS exploit from a few years back. Sure, it is old news by now, but I think it is very timely if you are concerned about the integrity of your software. If you are not familiar with it: CVS is software used to manage source code repositories. A compromise of your CVS server means that you can no longer trust the software maintained by that server. Hardly anybody installs software from a CD anymore; most software today is downloaded and installed. But how do you know the download has not been tampered with?
There are two main methods to verify file (and with that, software) integrity: cryptographic hashes and digital signatures. Cryptographic hashes are essentially fancy checksums. Unlike with a simple checksum, it is very hard to find two documents with the same hash. An attacker will not be able to, for example, add a backdoor to your code and then add comments or dead code to make the hash match. This may no longer be fully true for MD5, but even MD5 is still "pretty good", and exploits at this point only work if certain conditions are met.
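To see why a good hash makes tampering obvious, here is a short sketch using Python's standard hashlib module. The "file contents" are made up for illustration; the point is that even a one-byte change produces a completely unrelated digest:

```python
import hashlib

original = b"#!/bin/sh\necho 'installing...'\n"
tampered = b"#!/bin/sh\necho 'installing...' \n"  # one extra byte appended

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(tampered).hexdigest()

print(h1)
print(h2)
# The two digests share no obvious relationship even though the inputs
# differ by a single byte -- the "avalanche effect" of a good hash.
```

This is exactly the property that makes "pad the backdoored file until the checksum matches" tricks infeasible against a strong hash.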
But there is a fundamental weakness that comes with using a hash to ensure file integrity: How do you make sure that the hash didn't get changed? In many cases, the hash is stored in the same directory, with the same permissions, as the original code. An attacker could just replace the hash along with the code, using the same attack.
This problem is solved in part by using digital signatures. A digital signature starts out with a cryptographic hash of the software, but this hash is now signed using a private key. Nobody but the signer needs to know this private key, and it should be kept tucked away and out of reach. An attacker will now no longer be able to modify the hash undetected. However, in order to verify the signature, a user will need a copy of the public key. How do we make sure the public key is correct? This may be easy if you already have the public key, or are able to validate it out of band. But in many cases, you obtain the public key at the same time you obtain the signature and the code.
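To make the sign-and-verify flow concrete, here is a textbook-RSA sketch in pure Python. The tiny hard-coded primes and helper names are purely illustrative; real code signing uses vetted crypto libraries and much larger keys:

```python
import hashlib

# Toy RSA parameters (illustration only -- real keys are 2048+ bits).
p, q = 61, 53
n = p * q                  # public modulus
e = 17                     # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)        # private exponent (modular inverse, Python 3.8+)

def toy_sign(data: bytes) -> int:
    """Hash the data, then transform the hash with the private key."""
    h = int.from_bytes(hashlib.sha256(data).digest(), "big") % n
    return pow(h, d, n)

def toy_verify(data: bytes, sig: int) -> bool:
    """Recompute the hash and check it against the signature,
    using only the public key (e, n)."""
    h = int.from_bytes(hashlib.sha256(data).digest(), "big") % n
    return pow(sig, e, n) == h

msg = b"release-1.2.tar.gz contents"
sig = toy_sign(msg)
print(toy_verify(msg, sig))  # True
# With a real-sized modulus, a tampered file would fail verification,
# because its hash would no longer match the value recovered from sig.
```

Note how verification never needs the private key: anyone holding (e, n) can check the signature, but only the holder of d can produce it.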
One solution to this problem is the hierarchical system implemented by SSL-style code signatures. In this case, the code is signed by a public key which has in turn been signed by a trusted entity. Your operating system will usually trust a number of these certificate authorities by default, and then trust every key signed by one of them. This works very well, as long as these certificate authorities are careful in how they hand out signed certificates. Sadly, bad certificates have been handed out in the past.
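Conceptually, validating such a chain looks like the following sketch. The certificate structure and the stubbed-out signature check are my own simplifications for illustration, not a real X.509 implementation:

```python
# Hierarchical (chained) trust, with the actual cryptography stubbed out.
# All names here are made up for illustration.

TRUSTED_ROOTS = {"ExampleRootCA"}  # shipped with the OS or browser

def signature_ok(cert: dict, issuer_name: str) -> bool:
    # Stand-in for a real signature check against the issuer's public key.
    return cert["issuer"] == issuer_name

def chain_trusted(chain: list) -> bool:
    """Walk from the leaf certificate upward; trust it only if every link
    is signed by the next one and the top of the chain is a trusted root."""
    for cert, issuer in zip(chain, chain[1:]):
        if not signature_ok(cert, issuer["subject"]):
            return False
    return chain[-1]["subject"] in TRUSTED_ROOTS

leaf = {"subject": "downloads.example.com", "issuer": "ExampleIntermediate"}
mid = {"subject": "ExampleIntermediate", "issuer": "ExampleRootCA"}
root = {"subject": "ExampleRootCA", "issuer": "ExampleRootCA"}
print(chain_trusted([leaf, mid, root]))  # True
```

The whole scheme stands or falls with that last line: a rogue certificate signed by any trusted root validates just as cleanly as a legitimate one.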
So we got our software, verified that the signature is correct, and installed it. But we are not in the clear yet. The next thing you will want to do is download additional components or updates. Updates in particular are typically verified by the application itself. There are a number of pitfalls that can cause problems:
- the application validates the signature, but does not ensure that the signature was created using a valid certificate.
- the application allows downgrades. An attacker can offer an older version (for which the attacker of course has a valid signature) and have the application downgrade itself so an old vulnerability can be exploited.
- a badly implemented update mechanism can lead to a denial-of-service issue if the upgrade fails halfway and does not provide a simple "undo".
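The downgrade pitfall in particular is easy to guard against. A minimal sketch, assuming simple dotted-integer version strings (real version schemes are messier):

```python
def parse_version(v: str) -> tuple:
    """Turn '1.4.2' into (1, 4, 2) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def accept_update(current: str, offered: str) -> bool:
    """Reject anything that is not strictly newer: an attacker replaying
    an old, validly signed release must not be able to downgrade us."""
    return parse_version(offered) > parse_version(current)

print(accept_update("2.1.0", "2.2.0"))  # True: genuine upgrade
print(accept_update("2.1.0", "1.9.8"))  # False: replayed old release
print(accept_update("2.1.0", "2.1.0"))  # False: reinstalling same version
```

The key point is that the version check happens in addition to the signature check, not instead of it: the old release's signature is perfectly valid, which is exactly why the signature alone is not enough.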
So in short, here is a quick checklist of what to look for:
- if you use plain hashes, store them on a different system than the original software, and secure them well
- try not to use MD5. SHA256 is probably the best algorithm to use at this point. Offer multiple hashes if you can.
- if at all possible, use proper code signing certificates.
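The last two hash-related items on the checklist can be combined: publish several digests, with SHA256 (or stronger) as the primary one. A minimal sketch using Python's hashlib; the file contents are made up:

```python
import hashlib

def release_digests(data: bytes) -> dict:
    """Compute several hashes so users can verify with whichever tool
    they have. Publish these on a separate, well-secured system --
    not in the same directory as the file itself."""
    return {
        "sha256": hashlib.sha256(data).hexdigest(),
        "sha512": hashlib.sha512(data).hexdigest(),
        "md5": hashlib.md5(data).hexdigest(),  # legacy only; never rely on it alone
    }

digests = release_digests(b"pretend this is the release tarball")
for name, value in digests.items():
    print(f"{name}  {value}")
```

Offering multiple digests costs nothing, and it means a user stuck with an older verification tool can still check something better than nothing.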