Security Annoyances

Stuart King has blogged a list of his top 5 information security annoyances:

  1. Security awareness programs
  2. Compliance = security
  3. Risk modelling
  4. Where are all the analysts?
  5. It’s not my fault

By and large I agree with his list, not least because he seems to have annoyed a few of those who are overconfident about risk trivia and business school quadrant diagrams.

I’d like to add to the list two of my own favorite annoyances:

  1. Just make it hard – for the legitimate users and uses of a system. Attempts to improve information security often make it harder for a system’s users – for an organization’s employees – to do their legitimate jobs. Sometimes this is unavoidable; we all know there is often a tradeoff between usability and security. But the tradeoff turns into a fallacy when the primary impact of a security measure is reduced usability while actual security remains more or less the same.
  2. Alice’n’Bob thinking. Academic researchers may be particularly prone to this: thinking and arguing in the Alice’n’Bob world of security textbooks as if it were a suitable model of the real world and its real security issues. It isn’t, which we sometimes forget when we name abstract entities after humans.

6 thoughts on “Security Annoyances”

  1. My annoyance with King’s post was that he painted security awareness programs and risk models with a broad brush as ineffective – without offering even an idea of recourse. As to your assertion that I am overconfident about ‘risk trivia and business school quadrant diagrams’ – call it what you will. I do not consider risk analysis a trivial subject (especially given our current economic climate), nor are the simple models I posted obscure – they are used extensively. Like King, you also have not offered any solutions to the annoyances he listed, nor to your own.

    1. Risk analysis isn’t trivial – that’s the point – but our techniques are. This is not the fault of those applying them, provided they know and accept the limitations of their tools. And there are many limitations, the two most crucial being a lack of useful data and a lack of understanding of how risk factors are interrelated. If we are honest, we’ll have to admit we are making educated guesses: we don’t know to any serious degree of precision whether and how our security measures really modify the risk landscape.

      I would be happy to be proven wrong, by the way.

  2. @ Chris: Reread the text.

    #1 He is not saying that all security awareness is bad – only the programs that don’t work, though I assume that’s the majority. Most concepts are copied and depend on reading and clicking rather than thinking and applying. My proposal: make security awareness fun. Start developing security-awareness games – and I don’t mean “Shoot the hacker”.

    #3 It’s not about “my risk-modelling scheme is better than yours”. It’s that, yes, there is a lot of bogus done in the name of risk modelling where a little bit of common sense wouldn’t hurt. But I would not condemn all risk modelling; there are situations that are too complex for the human mind. It’s just that I sometimes encounter situations where risk modelling is useless because the enterprise really has other problems.

    #4 I don’t know. I feel there are a lot of good guys out there, but that’s just me.

    #5 That’s not security specific.

    As to #6&7, I think ease of use really is king. At least the opposite puts me off, most of the time.

  3. “Risk trivia” – that’s good. Say, I’m working with a Fortune 100 right now, and they have 66 projects that they’re thinking of doing. Sadly, they can only do between 20 and 40 of them.

    I was wondering, without using “risk trivia” or “business school quadrant diagrams” do you have some sort of process you could suggest they use to prioritize those projects?

    1. I was wondering, without using “risk trivia” or “business school quadrant diagrams” do you have some sort of process you could suggest they use to prioritize those projects?

      That’s a valid question, and I’m afraid I don’t have an answer to it. Nor does anybody else, I believe. I respect any attempt to base such decisions upon a sound understanding of the problem space and a systematic approach of whichever kind. My impression, however, is that we haven’t found methods so far that are much better than using your gut and common sense. Do we really need quadrants telling us it would be a good idea to avoid frequent risks with high damages? My personal litmus test for risk analyses is this: can they provide me with justifications for not doing anything where I don’t have to do anything?

  4. “Nor does anybody else” – based on your litmus test, and qualitative requirements expressed – I think you’ve got a small sample size/lack of exposure.

Comments are closed.