Computer security is strongest when everyone knows how it works. This seems counterintuitive at first blush: if you're sending some information to another computer or storing it in a file and you want to encrypt it, wouldn't it be better if nobody knew anything about your encryption algorithm? If it's secret and nobody knows how it works, they can't break it, right? Well, this is true for the most part, but there's a big problem with this so-called "security through obscurity": once someone does figure out how it works, your secret is out and you've lost at least some (and usually most) of the security that it provides. On the other hand, if your algorithm is freely available and fully documented (e.g. TLS, HTTPS, AES), then the open source community can look it over and find any vulnerabilities in it. If thousands of hackers know exactly how an algorithm works and can still pound on it for years without breaking it, that's a good indication that it's pretty strong.
This was demonstrated recently with some Adobe products after the release of the TrueCrypt 5.0 full disk encryption software. TrueCrypt rewrites the first sector of your disk with its own boot loader, so that it can ask you for your encryption passphrase before the operating system boots. Some users found that after they did this, their Adobe Dreamweaver installation, which had worked perfectly until then, suddenly decided that it was no longer licensed. This is certainly inconvenient, but if they relicensed their Dreamweaver software and rebooted, the machine would refuse to boot, which is beyond inconvenient. TrueCrypt forces users to create a recovery CD before it encrypts the disk, which is lucky, since that recovery CD is the only way to recover from this.
It turns out that Adobe is saving some of their licensing information on the first sector of the disk, which is not accessible through normal channels (i.e. you can search your C: drive to your heart's content and you will never find their licensing data). That also explains the boot failure: relicensing writes to the very sector that TrueCrypt needs for its boot loader. I'm sure their developers thought this was quite the clever little solution: since nobody could find this information, nobody could modify it, thereby making their software more difficult to pirate. Rather than simply encrypting the licensing information with a secure algorithm, they chose security through obscurity. Now that their secret has been revealed, everyone knows where they store their license information, and so they've completely lost the security that this provided. If they had simply used something like an AES-encrypted file, their licensing scheme would only be broken if the AES algorithm itself were broken, which is rather unlikely.
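To make that suggestion concrete, here is a minimal sketch of the "just use an encrypted file" approach. This is not Adobe's actual scheme; it assumes Python with the third-party "cryptography" package, and the file name and license fields are made up for illustration:

    # Hypothetical sketch: store license data as an ordinary AES-GCM-encrypted
    # file instead of hiding it in an undocumented spot on the disk.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def write_license(path: str, key: bytes, license_info: bytes) -> None:
        nonce = os.urandom(12)                  # unique per encryption
        ciphertext = AESGCM(key).encrypt(nonce, license_info, None)
        with open(path, "wb") as f:
            f.write(nonce + ciphertext)         # safe to leave in plain sight

    def read_license(path: str, key: bytes) -> bytes:
        with open(path, "rb") as f:
            blob = f.read()
        nonce, ciphertext = blob[:12], blob[12:]
        # Raises InvalidTag if the file was tampered with or the key is wrong.
        return AESGCM(key).decrypt(nonce, ciphertext, None)

    # Example usage (where to keep the key is the hard part, discussed below):
    key = AESGCM.generate_key(bit_length=256)
    write_license("license.dat", key, b"product=ExampleApp;licensed-to=Example User")
    print(read_license("license.dat", key))

The file can sit in plain view: anyone can read it, but without the key nobody can read or forge the license data, which is exactly the property the hidden-sector trick was pretending to have.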
However, adding an encrypted file introduces a key management issue: where do you store the encryption key that unlocks the license file? If it's hard-coded, you're back to security through obscurity: if someone figures out where in the executable the key is stored and posts that information on the internet, your entire security scheme is useless. The solution to this problem is non-trivial, so I'll leave it up to the software manufacturers. They may decide that a hard-coded key is fine, or that generating a random key and mixing its bytes in with the encrypted data itself (in a reversible way) is good enough for their purposes. Storing encrypted data along with its encryption key is inherently insecure, like keeping your bank card PIN on a Post-It note attached to your bank card, but it's only licensing information we're talking about here, not the US nuclear launch codes or a file containing credit card numbers or something.
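For what it's worth, here is one way that "key mixed in with the encrypted data" idea could look. This is purely an invented sketch, not anything any vendor actually does; the key positions are derived from the lengths alone, so anyone who reads the executable can reverse it, which is the Post-It note problem all over again:

    # Hypothetical sketch: hide a random key inside the encrypted blob itself,
    # in a reversible (and therefore only obscurity-grade) way.
    import os

    def mix(key: bytes, ciphertext: bytes) -> bytes:
        """Insert one key byte after every `step` ciphertext bytes."""
        step = max(1, len(ciphertext) // len(key))
        out, ki = bytearray(), 0
        for i, b in enumerate(ciphertext):
            out.append(b)
            if ki < len(key) and (i + 1) % step == 0:
                out.append(key[ki])
                ki += 1
        out.extend(key[ki:])                    # leftover key bytes go at the end
        return bytes(out)

    def unmix(blob: bytes, key_len: int = 32) -> tuple[bytes, bytes]:
        """Reverse mix(): recover (key, ciphertext) from the combined blob."""
        ct_len = len(blob) - key_len
        step = max(1, ct_len // key_len)
        key, ct, j, ki = bytearray(), bytearray(), 0, 0
        for i in range(ct_len):
            ct.append(blob[j])
            j += 1
            if ki < key_len and (i + 1) % step == 0:
                key.append(blob[j])
                j += 1
                ki += 1
        key.extend(blob[j:])
        return bytes(key), bytes(ct)

    # Example round trip with dummy bytes:
    key, ciphertext = os.urandom(32), os.urandom(80)
    assert unmix(mix(key, ciphertext), key_len=32) == (key, ciphertext)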
Bottom line: if any part of your security system uses the phrase "Nobody'd ever think of looking here!" or "Nobody will ever figure out how we did this!", it's not secure.