One of the key things I feel is holding back the adoption of awareness training is the lack of viable metrics demonstrating that it works.  When you patch a computer, you can see that it stops an exploit. When you deploy a firewall, you can see it blocking unauthorized traffic.  However, when you train employees, it can be hard to demonstrate the value of that training.  This is why I feel metrics are so important to developing awareness.  In this post I want to focus on what I consider good security awareness metrics.

In general, I feel there are two types of awareness metrics: those that measure progress and those that measure impact.  When people discuss metrics for security awareness, they usually focus on the progress of their program: how many employees took the training, how many passed the quiz, how many read the monthly newsletter, and so on.  These metrics are great for compliance purposes; they document that people are taking the training.  They are also the easiest metrics to obtain. However, they do not demonstrate what type of impact your training is having: are we making a difference?  That is the purpose of the second category, metrics that measure impact, and it is what I want to focus on.

Now that we have determined our goal is metrics that measure impact, what defines a good metric?  When asked such questions, I always turn to Andrew Jaquith's book Security Metrics, the bible for anyone trying to measure the impact of security.  He brings up several excellent points on what makes a good metric.

  • The metric has to be measured in a consistent way. Different people should be able to apply the same method to the same data set and come up with equivalent answers.  We cannot depend on subjective judgments or opinions.  To be honest, this is why the metric known as 'risk' is often such a bad one: it is too open to interpretation by whoever is measuring it.
  • We want a value that is a number or percentage.  Ratings such as high, medium, and low are vague and open to interpretation.  We need something like 33% or 105 employees per month.
  • The metric has to be easy and cheap to gather.  Can your team easily measure the metric every week?  A metric has little value if it requires too much work or costs too much to repeat.
  • The metric has to be relevant; it has to be something you can act upon.  The classic example of an irrelevant metric is the top-ten list of countries that host spam.  Wonderful, the United States is the number one country for spam.  So what do you do, block every IP address in the country?  We want metrics we can do something about.
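To make the properties above concrete, here is a minimal, purely hypothetical sketch of an impact metric: the percentage of employees who clicked a simulated phishing link during a monthly assessment. The data and field names are my own invention, not from any particular tool, but the metric itself checks every box on the list: it is an objective count anyone can reproduce from the same data, it yields a percentage rather than a high/medium/low rating, it is cheap to recompute each month, and it is something you can act upon (retrain the people who clicked).

```python
# Hypothetical sketch: computing a phishing click rate from simulated
# phishing assessment results. Field names and data are illustrative only.

def phishing_click_rate(results):
    """Return the percentage of employees who clicked a simulated
    phishing link. `results` maps employee IDs to True (clicked)
    or False (did not click)."""
    if not results:
        return 0.0
    clicked = sum(1 for clicked_link in results.values() if clicked_link)
    return 100.0 * clicked / len(results)

# Example month: 3 of 10 employees clicked the simulated phish.
monthly_results = {f"emp{i}": i < 3 for i in range(10)}
print(f"Click rate: {phishing_click_rate(monthly_results):.1f}%")
```

Tracked month over month, a number like this shows whether training is actually changing behavior, which is exactly the kind of impact measurement progress metrics cannot provide.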

In my next post I'll give examples of good metrics based on the definition above, along with how you can use them.