The fact that 3 is controversial is telling of the sad state of security knowledge among techies generally. The most that most people seem to be able to do is cargo-cult / parrot, misunderstand, and misappropriate quips like “security by obscurity bad!” when obscurity, all else being equal, is a perfectly reasonable and often useful additive measure to take if it’s available to you.


A knee-jerk aversion to anything halfway adjacent to "security by obscurity" is flawed, but this reaction to that aversion is also flawed.

Instead of trying to suggest "security by obscurity is fine, actually, and don't worry about it", it's time for us to just stop being pithy and start being precise: your cryptosystem should be secure even if your adversaries understand everything about it. If that is true, then you can (and, in the real world, almost certainly should) add defense in depth by adding layers of obscurity, but not before.
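
To make "secure even if your adversaries understand everything about it" concrete, here's a minimal Python sketch. The choice of the third-party `cryptography` package and its Fernet recipe is my own illustration, not something from this thread; any modern authenticated-encryption API would make the same point:

    # Illustrative sketch (assumes the third-party `cryptography` package).
    # Fernet's construction (AES-128-CBC + HMAC-SHA256) is fully documented;
    # the security rests entirely in the randomly generated key.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # the only secret in the system
    token = Fernet(key).encrypt(b"attack at dawn")

    # An adversary who knows everything about Fernet but not `key`
    # can neither decrypt nor forge tokens.
    assert Fernet(key).decrypt(token) == b"attack at dawn"

Once security rests only on the key, hiding which algorithm, port, or endpoint you use costs the defender nothing and the attacker something: that's the "layers of obscurity on top" order of operations.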


While “security by obscurity” may be good for some spy agency as an additional layer over a system that would remain secure even if it were published, most people are right to say that “security by obscurity bad!”, based on the known history of such systems.

The reason is that, without exception, every time a system relying on “security by obscurity” has been reverse engineered, whether it was used for police communications, mobile phone communications, or supposedly secure CPUs, it turned out that the system had been designed by incompetent amateurs, or perhaps by competent but malevolent professionals, so that it could easily be broken by anyone who knew how it worked.

“Security by obscurity” is fine for secret organizations, but for any commercial device that incorporates functions that must be secure, it is stupid for a buyer to accept any kind of “security by obscurity”, because that is pretty much guaranteed to be a scam, regardless of how big and important the seller company is.

Obscurity is OK only when it is added by the owner of the devices, on top of a system that is well known and has been analyzed publicly.


> The most that most people seem to be able to do is cargo-cult / parrot, misunderstand, and misappropriate quips like “security by obscurity bad!”

That is the point. It is a good rule of thumb for people who don't know much about security: anything they create in an attempt to add more security to their system is more likely to do the opposite.

If you think you know better, feel free to ignore it. Just be aware that you wouldn't be the first who thought they knew what they were doing, or even the first who actually did, yet still messed up.


This misunderstands how "security by obscurity" came about, because there are good and bad types of obscurity. Back in the 1800s, lockmakers were selling shoddy locks that were easy to pick, and they were upset that people were disclosing lock-picking methods: https://www.okta.com/identity-101/security-through-obscurity...

History repeated itself later with shoddy cryptography whose makers didn't want anyone to know how it worked, most of which got broken in embarrassing ways. This sort of obscurity was actively harmful: it let vendors sell defective products that people relied upon to their detriment.

Meanwhile, there are good types of obscurity, too. For example, the information disclosure CWEs tell product developers not to expose version numbers, stack traces, etc. to users, and this sort of "obscurity" is perfectly reasonable and widely accepted.
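
A minimal sketch of that second kind, assuming a Flask/Werkzeug app (the framework choice and the replacement header value are illustrative, not from the thread):

    # Suppress the information-disclosure vectors the CWEs warn about:
    # stack traces in responses and version strings in headers.
    from flask import Flask

    app = Flask(__name__)

    @app.errorhandler(Exception)
    def opaque_error(exc):
        app.logger.exception(exc)        # keep the full traceback server-side
        return "Internal error", 500     # clients get nothing to fingerprint

    @app.after_request
    def strip_version_header(resp):
        # Werkzeug's default Server header advertises its version;
        # overriding it here removes that hint.
        resp.headers["Server"] = "server"
        return resp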

So it's not the case that all things that might be termed "obscurity" are bad.


You can even poke holes in it using their own terminology. Obscurity is equivalent to minimizing attack surface area: the less adversaries know about your system, the smaller a target it presents.


I think there's overlap between surface area and obscurity, but they're not equivalent. To use the most pedestrian example, moving SSH off of port 22 makes it more obscure, but the total surface area hasn't gotten smaller.


Anecdata: my log files of failed login attempts became far smaller after moving off port 22.
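
If you want to check this on your own box, here's a rough sketch (assuming a Debian-style /var/log/auth.log and sshd's usual "Failed password" message; both the path and the format vary by distro):

    # Tally failed SSH logins per source IP from a syslog-style auth log.
    from collections import Counter

    failures = Counter()
    with open("/var/log/auth.log") as log:
        for line in log:
            if "Failed password" in line:
                parts = line.split()
                # On these lines the source IP is the token after "from".
                failures[parts[parts.index("from") + 1]] += 1

    for ip, count in failures.most_common(10):
        print(f"{count:6d}  {ip}")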


Yeah. I think this comes from conflating theoretical cryptography with practical IT security. Kerckhoffs's principle is true in the theoretical domain, and it's certainly important that the designers of standardised crypto algorithms adhere to it, but it doesn't follow that it's pointless to change your SSH port.



