Security through obscurity

Luther Martin asks whether secrecy in security deals is a good thing

Martin: Security through secrecy does not help anyone

In the 1883 Journal des Sciences Militaires, Dutch linguist and cryptographer Auguste Kerckhoffs outlined six principles for a secure communication system. Remarkably, these principles are still valid – and deserve revisiting.

Kerckhoffs stated his second principle as – roughly translated – the system must not require secrecy, so that it can fall into enemy hands without inconvenience.

Even if hackers know everything about your system, if they don't have the cryptographic keys you use to encrypt, they can’t unscramble any encrypted information they harvest.

It should be possible for a hacker to know everything about your security systems and remain unable to defeat them unless he or she knows the secrets you use to identify authorised users.

Those secrets could be encryption keys or other secret information, such as a password. So a good security architecture should always be created under the assumption that attackers will know everything about the architecture.
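Kerckhoffs' point can be sketched in a few lines of Python. The example below uses HMAC-SHA256 to authenticate messages: the algorithm is entirely public and anyone may read the code, yet an attacker without the key can neither forge a valid message nor verify a guess. (This is an illustrative sketch of the principle, not a description of any product discussed here.)

```python
import hashlib
import hmac
import secrets

# The algorithm (HMAC-SHA256) is completely public -- Kerckhoffs' principle.
# Only this key needs to stay secret.
key = secrets.token_bytes(32)

def tag(message: bytes) -> str:
    """Produce an authentication tag for a message using the secret key."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(message: bytes, received_tag: str) -> bool:
    """Check a tag in constant time to avoid leaking information via timing."""
    return hmac.compare_digest(tag(message), received_tag)

t = tag(b"transfer 100 GBP")
assert verify(b"transfer 100 GBP", t)      # genuine message accepted
assert not verify(b"transfer 999 GBP", t)  # forgery rejected without the key
```

Publishing the algorithm costs the defender nothing; all the security is concentrated in the 32-byte key, which is the only thing that must be protected.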

Assume your attackers know what they're attacking. Security through obscurity is bad, and has been known to be bad for 136 years.

This premise seems to have been forgotten by many corporate IT departments, which all too often require security vendors to agree to draconian terms of confidentiality when selling them security products.

In the information security market, this secrecy benefits neither security vendors nor their customers.

Many security products are what economists call ‘experience goods’, which means you can’t easily judge their quality before you buy them.

Some are even ‘credence goods’, which means you generally can never judge their quality, even after they're used. For example, it is very difficult to tell strong encryption from very weak encryption.

Yet knowing whether products are any good before you buy them would surely help the customer.

In 2001, George Akerlof was awarded the Nobel Prize in Economics for his analysis of what are popularly called ‘lemon markets’: how the inability to tell a good product from a poor one means high-quality products can be driven from the market, leaving only low-quality products at relatively high prices.

Withholding information about which security products are used, and users’ experiences of them, is probably a step towards just such a lemon market. If that information flowed freely, it would be easier for users to avoid buying rubbish.

Exposing the weaknesses in products would also push vendors to work harder on creating more robust products and fixing existing problems.

Furthermore, enterprise security products are more expensive than they need to be, partly because of the long sales cycle. Better product information would shorten that cycle, and so the cost of sales could be much lower.

I don't believe that this secrecy offers any additional security. And restricting the flow of information about the quality of security products simply makes things worse for the producers and consumers of security technologies.

Luther Martin is solution architect at Voltage Security