
Editor's Note: This is a guest blog post from Dan deBeaubien. Below is a description of his upcoming talk, "Measuring Human Risk - What is Your Security Score," at the Security Awareness Summit, 10 Sep in Dallas.

Assume that we know what to do in a given cyber security circumstance - install a firewall, conduct an audit, train our staff - and assume, too, that plenty of resources exist to address these situations as they arise. The emergent issue is often where to start. We can't do everything, everywhere - we need to know where to begin and where to go next. In my role at Michigan Tech, working closely with Ashley Sudderth, the University's Chief Information Compliance Officer, we set about building a plan that would continuously and methodically close one of the largest security gaps we saw: "the Human Problem." Let me take a step back. It isn't that we didn't have plenty of technology problems - far from it. Cyber security spending was still at an all-time high, and we continued to be challenged by everything from DoS to APT. It was a balance issue: we were effectively playing whack-a-mole with cyber security technology and holding our own, but, meanwhile, the senior staff became keenly aware that the largest threat was clearly "human" in nature.

Whether the particular concern was a well-placed phishing attack, a lost cell phone, or a mistyped email address, understanding our end users' information security behavior became the first order of business. Another point worth making: we already had a successful human-security training and response program - broad based, content rich, and widely taken. So what's the problem? There are a few:

 • Over Training: Too much per-user content, leading to "training fatigue".
 • Unfocused Training: Training in areas not relevant to our people's data access or position.
 • Insufficient Response: We tended to treat many situations the same, regardless of risk.
 • Rumor Based Actions: When we did focus, we did so with little data and few facts. Ironically, we often focused training efforts on groups (like credit card handlers) who were already the best trained, often because we had a sort of breach-fear about that type of data.

We set out to establish new goals for the program. We clearly needed data to drive our actions, and we wanted that data tied to actual, real risk; the data we had only told us who finished training. We started gathering actual end-user behavior data by conducting simple annual surveys designed to profile information access by our employees: what do they handle, where is it kept, and so on. With an average completion time under three minutes, we began using the data to create scores for our employees, departments, and divisions. In turn, the scores can inform responses, from training topics to audits to process change. Trending this data over a few training cycles also produces insights into how our security awareness program, audits, and in-person training change our risk profile. It addresses many of the training concerns above by allowing us to eliminate unnecessary training and tailor our response to the situation "on the ground"; our actions are now guided by risk, not rumor.

If you want to know more about this process, take a peek at the data, or see the multi-year trends, check out my talk at the SANS Security Awareness Summit in Dallas this September. I'll be presenting with the program's co-designer, and my good friend, Ashley Sudderth, Michigan Tech's Chief Information Compliance Officer.

BIO: Dan deBeaubien is a 25-year veteran of Information Technology and former CTO of Michigan Technological University. He has held a variety of posts throughout his career, including Sr. Systems Administrator, Sr. Telecommunications Engineer, and Director of Information Technology Services and Security. Before joining the SANS team, Dan created Michigan Tech's Information Security Office and the positions of Chief Information Security Officer and, most recently, Chief Information Compliance Officer.