
We can devise the best technical solution to a security problem, but that solution still needs to be applied by people, adopted by a business, rolled out across an organization, and often in spite of objections.

The barriers people have to implementing security interest me greatly because they affect me directly. I know others have asked about "how to get buy-in" and "good ways to educate", but my question is about the other side of the equation.

I want to hear the specific reasons you have heard from a user, a manager, or a company for not wanting to implement a more secure policy/procedure/product. The more specific the better.

Maybe if we better understand the objections to what we do, we can be better at working with those we seek to serve.

schroeder

7 Answers


I discovered a vulnerability in one client's infrastructure that could cause an outage plus data loss. When pressed for a one-off cost calculation, the client agreed it could cost them upwards of £1 million each time, so I proposed a remediation plan that would have cost under £100k.

They turned it down, as £1 million was below their lower risk threshold - they had bigger risks to set their teams on.

But a valid risk decision by the business is a correct answer to go back to a security team with, so this was fine.
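
To make the arithmetic behind that decision concrete, here is a rough annualised-loss sketch in PHP. The £1 million impact and sub-£100k remediation cost come from the case above; the incident likelihood is a made-up assumption for illustration.

    <?php
    // Rough annualised-loss sketch. The £1m impact and sub-£100k remediation
    // figures come from the case above; the incident likelihood is purely an
    // illustrative assumption.

    $singleLossExpectancy   = 1000000; // cost per outage-plus-data-loss incident (£)
    $annualRateOfOccurrence = 0.2;     // assume roughly one such incident every five years
    $remediationCost        = 100000;  // proposed fix (£)

    $annualisedLossExpectancy = $singleLossExpectancy * $annualRateOfOccurrence;

    printf("Expected loss per year: £%d\n", $annualisedLossExpectancy);                       // 200000
    printf("Avoided loss vs. fix cost: £%d\n", $annualisedLossExpectancy - $remediationCost); // 100000

On those figures the fix looks worthwhile in isolation, yet the client's call can still be rational: the same teams had larger risks competing for their attention, which is exactly the decision described above.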

Rory Alsop
  • It becomes another story when the vulnerability was your fault to begin with. Or were you planting bugs for job security, hmm... :) In all seriousness though, cost/risk is certainly a valid reason to not implement additional security. – Wesley Murch Jan 26 '12 at 19:42
  • I completely agree with the business decision as a valid risk calculation and we can't overlook or underestimate that! Thanks for highlighting that idea. – schroeder Jan 26 '12 at 21:03

By far the most common excuse I get is the age-old "We'll get around to it; for now, just get it working".

Various reasons for this mentality include:

  • Urgent project: a barely polished prototype is all they need to get operations running; the rest is perceived as 'non-revenue generating' and thus takes a back seat. The hustle and bustle of the workplace means this crucial work gets pushed further and further down the queue until it's simply forgotten about.
  • Low funds: Similar to the above; time is money and the firm may be short on cash to pay developers.
  • Ignorance: It still amazes me to stumble across PHP developers who have been in the industry for years and don't know about concepts such as CSRF attacks, password salts, prepared statements, or hashing algorithms beyond MD5 (see the sketch after this list). It's not that they're lazy; they're actually rather brilliant. They simply don't know, and it's not until you're breached that you ever come across the need for it on your own.
  • Arrogance: Most people also tend to take an "I'm just a small company, why would someone hack us?" mentality, believing they're too small to be worth a hacker's time. This perception of hackers as 'all-mighty, all-knowing gods of computing' leads people to believe they're far too busy hacking the likes of Sony and the White House to be bothered with Bob's Discount Car Parts. Truth is, Bob's Discount Car Parts might be a cinch to take down; it might fall to automated attack tools or even a rogue competitor (trust me, I've seen it happen with a similar business!). Behind the scenes are all sorts of tasty goodies: CC numbers, email addresses, personal information, free product orders...
  • Lack of audits: Most developers (myself included!) think they're hotshots at security, but we simply can't tell until something falls over. One way to mitigate this is through security audits by independent contractors, but they're expensive, time-consuming, 'unnecessary'... Those people know how to think like hackers, whereas your job is to think like a software developer.
  • Developer Pride/Ego: Don't laugh, I've seen it happen. Some developers are so high and mighty about themselves that they refuse to implement security, either because they believe they can deal with attackers through other means ("Dude, I'll go to their house and smash them if they hack me"), or because they refuse to have someone come along and potentially tell them they're anything less than God's gift to the world of IT (related to 'Lack of audits' above).
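
On the 'Ignorance' point, here is a minimal PHP sketch of the two techniques most often missing: prepared statements and modern password hashing. The DSN, credentials, and table/column names are placeholders, and password_hash() assumes PHP 5.5 or later.

    <?php
    // Minimal sketch: a prepared statement plus salted password hashing.
    // DSN, credentials, and table/column names are placeholders.

    $pdo = new PDO('mysql:host=localhost;dbname=app', 'app_user', 'app_password');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    // Prepared statement: user input is bound as data, never concatenated into SQL.
    $stmt = $pdo->prepare('SELECT id, password_hash FROM users WHERE email = :email');
    $stmt->execute([':email' => $_POST['email']]);
    $user = $stmt->fetch(PDO::FETCH_ASSOC);

    // password_hash() picks a strong algorithm and generates the salt for you,
    // so there is no hand-rolled MD5-plus-salt scheme to get wrong.
    $newHash = password_hash($_POST['password'], PASSWORD_DEFAULT);

    // Verification at login time.
    if ($user !== false && password_verify($_POST['password'], $user['password_hash'])) {
        // credentials accepted
    }

None of this is exotic; the point of that bullet is that plenty of otherwise capable developers have simply never been shown it.
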
TC Fox
  • Great response. You mention the need for education and an external perspective as 2 things that people who need security actually need. Any ideas on how to offer that to others? – schroeder Jan 26 '12 at 21:16

I want to hear the specific reasons you have heard from a user, a manager, or a company for not wanting to implement a more secure policy/procedure/product.

This is what I always get:

We've never had a problem so we don't need it.

I have a hard time providing an argument against that, as ignorant as it may be.

I guess the only thing you can do is prove that the system is vulnerable, which can be difficult. Sometimes, though, they're right: the system is relatively secure, and additional measures aren't necessary or, more importantly (to them), worth the cost of implementing. In that case you'd have to prove that it's worth their dollar, which, once again, can be difficult.

Is this the "best" reason? Maybe not, but it's certainly common.

Wesley Murch
  • Challenge to go back with: that's what Sony, RSA, etc. said, and look how much it cost them! – Rory Alsop Jan 26 '12 at 19:11
  • Still it comes down to money. The cost of additional security must be weighed against the cost *and* probability of a successful attack. Like the other answer says, the data may not be all that valuable, therefore the probability of an attack may be slim. – Wesley Murch Jan 26 '12 at 19:32
  • Aye - see my answer for an example. – Rory Alsop Jan 26 '12 at 19:37
  • This and variants of this are common for me, too. Helping them with risk calculations is important, but rarely fruitful in my experience. – schroeder Jan 26 '12 at 21:04

Let's take the specific example of passwords. Security best practice would have us use passwords that meet a minimum length, contain letters, numbers, special characters, etc., and are not easy to guess. Further, they should be unique to each site and changed every 30 or 90 days. Finally, if a password is ever divulged or compromised, that should be communicated to the security officer and the password changed immediately. Oh, and don't ever write your password down.

That's great advice from a security point of view, but virtually unworkable in the real world. Can you remember that many passwords? I literally have hundreds of them.

Further, have you ever been distracted or in a hurry and typed the password for site X into the login box for site Y? I bet if people were honest you would find this happens all the time. Did you immediately change the passwords on both sites?
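
For reference, here is roughly what enforcing just the composition part of that policy looks like (a PHP sketch; the 12-character minimum and the particular character classes are assumptions, since the exact rules vary):

    <?php
    // Sketch of a typical "complexity" check for the policy described above.
    // The 12-character minimum is an assumption; the check covers only length
    // and character classes, and says nothing about reuse, rotation or memorability.

    function meetsPolicy(string $password): bool
    {
        return strlen($password) >= 12                        // minimum length
            && preg_match('/[a-z]/', $password) === 1         // lower-case letter
            && preg_match('/[A-Z]/', $password) === 1         // upper-case letter
            && preg_match('/[0-9]/', $password) === 1         // digit
            && preg_match('/[^a-zA-Z0-9]/', $password) === 1; // special character
    }

    var_dump(meetsPolicy('summer holiday plans')); // false: no upper-case letter or digit
    var_dump(meetsPolicy('C0rrect-Horse-42'));     // true, but now repeat this across hundreds of sites

The check itself is trivial to write; the burden described above comes from humans having to satisfy it, uniquely, across hundreds of sites, which is the problem the password managers mentioned in the comments are meant to solve.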

JonnyBoats
  • This is why KeePass and PassSafe have become so popular - I have hundreds of long passwords in mine. – Rory Alsop Jan 26 '12 at 19:12
  • Rory: This may be fine for low security, but I have worked in secure locations where bringing in a thumb drive (or even a smartphone) was a security violation. – JonnyBoats Jan 26 '12 at 19:15
  • @RoryAlsop I use Lastpass for the same reason (as a paying customer). I also believe that this makes me _more_ secure as I can use very secure passwords without having to remember them. – Mei Jan 26 '12 at 19:18
  • @JonnyBoats - usually in those environments we get specific rules on how to remember the passwords, whether they are written, stored in pieces, in safes or whatever – Rory Alsop Jan 26 '12 at 19:33

The system or data may not have much value, and the cost of a breach could be low enough to justify a relative lack of security. For example, I don't always use strong passwords for web-based forums, because the site doesn't store my name or any personal information and I don't have any reputation to tarnish.

Jonathan
  • However, one system can be used to get into another - either through trust relationships or through other hacks. As far as a reputation goes, that may be true now - but later... – Mei Jan 26 '12 at 19:16

Most folks resist change of all types and accordingly will come up with endless logical fallacies to fight off the implementation of something new.

  • that auditor/inspector/certifier is a nitpicker
  • nothing has happened so far
  • ask anyone, this is overblown

Focusing on specific rationalizations (the focus of your question) misses the real issue: the natural tendency to avoid change. In pushing back against logical fallacies you are dealing with emotions and belief systems. Therefore, the best approach is not to attempt to reason with people presenting logical rationalizations against your proposal, but rather to sell the idea on its benefits.

As with any good sales situation, you should be presenting something of value to a person who needs it. So consider who directly benefits most from the new security paradigm and sell that person first. There are probably some folks not sleeping so well because of the very issue you are trying to address. Get to them, and let them get to others with a proposal based on validated operational value. If you can't sell it to the folks who are on the hook for the consequences of an exploited vulnerability, then the organization has spoken and it's a no-go.

zedman9991
  • The 'real issue' IS the whole point of my question :) I am not sure if I have a clear view on what the real issue is, and I'm not sure that it's change aversion. By collecting the experiences of others we might be surprised by the common threads we see. For instance, there seems to be a common thread of 'risk calculation' shaping up. As for emotions and beliefs, we MOST DEFINITELY need to address them, or else the benefits will be overshadowed. – schroeder Jan 26 '12 at 21:10
  • "ask anyone, this is overblown" is my fav – schroeder Jan 26 '12 at 21:18

1/ Seen as too restrictive: In web applications, security can often be perceived as restricting interoperability. This often occurs in systems where third-party plugins/addons are allowed to interact with content management systems (osCommerce, Joomla and WordPress, to name a few). With each shortcut taken they have paid heavily, and Joomla and WordPress continue to pay the price.

2/ Good security can go out the window when code is stored in supposedly secure directories. For example, depending on .htaccess basic authentication to protect directories can cause developers to drop their security standards. The same applies when systems are developed to work only on a local area network: there is a false belief that security is not as critical, so standards can be more lax.

Taipo