Table of Contents
- What is a Security Thought Leader - Updated November 18th, 2009
- Framework for Security Thought Leader Interview - August 26th, 2009
- Daniel B. Cid, Sucuri - November 21st, 2013
- Dominique Karg, AlienVault - November 20th, 2013
- Lance Spitzner, Securing The Human, founder - Updated November 29th, 2012
- Bill Pfeifer, Juniper Networks - March 4th, 2011
- Chris Pogue, Senior Security Analyst - July 8th, 2010
- John Kanen Flowers - May 26th, 2010
- Kees Leune, Leune Consultancy, LLC - February 13th, 2010
- Joel Yonts, CISO - February 12th, 2010
- Maury Shenk, TMT Advisor, Steptoe & Johnson - January 31st, 2010
- Chris Wysopal, CTO, Veracode - January 27th, 2010
- Amir Ben-Efraim, CEO, Altor Networks - November 25th, 2009
- Ed Hammersla, COO, Trusted Computer Solutions - Updated November 19th, 2009
- Amit Klein, CTO, Trusteer - September 27th, 2009
- An Interview with Ron Gula from Tenable about the role of a vulnerability scanner in protecting sensitive information - Updated August 13th, 2009
- A. N. Ananth, CEO, Prism Microsystems, Inc. - August 7th, 2009
- Jeremiah Grossman, Founder and CTO of WhiteHat Security - Updated April 24th, 2009
- Mike Yaffe, Director of Product Marketing, Core Security Technologies - April 15th, 2009
- Chris Petersen, Chief Technology Officer, LogRhythm - March 13th, 2009
- John Pirc, IBM, ISS Product Line & Services Executive: Security and Intelligent Network - February 17th, 2009
- Leigh Purdie, InterSect Alliance, co-founder of Snare: Evolution of log analysis - January 28th, 2009
- Bill Worley, Chief Technology Officer, Secure64 Software Corporation - December 9th, 2008
- Doug Brown, former Manager of Security Resources, University of North Carolina at Chapel Hill - October 30th, 2008
- Amrit Williams, Chief Technology Officer, BigFix - June 30th, 2008
- Andrew Hay, Q1 Labs - May 13th, 2008
- Gene Schultz, CTO of High Tower - April 4th, 2008
- Tomasz Kojm, original author of ClamAV - April 3rd, 2008
- Bill Johnson, CEO TDI - April 2nd, 2008
- Gene Kim, Tripwire - March 14th, 2008
- Kevin Kenan, Managing Director, K2 Digital Defense - March 14th, 2008
- Leigh Purdie, InterSect Alliance, co-founder of Snare - March 7th, 2008
- Marty Roesch, Sourcefire CEO and Snort creator - February 26th, 2008
- Dr. Anton Chuvakin, Chief Logging Evangelist with LogLogic - January 28th, 2008
- Kishore Kumar, CEO of Pari Networks - Updated January 28th, 2008
- Interview with Dr. Robert Arn, CTO of Itiva - November 1st, 2007
- Interview with Charles Edge - September 15th, 2007
- Ivan Arce, CTO of Core Security Technologies - Updated May 6th, 2009
- Mike Weider, CTO for Watchfire - Updated July 23rd, 2007
- Interview with authors of The Art of Software Security Assessment - Updated July 9th, 2007
- Ryan Barnett, Director of Application Security Training at Breach Security, Inc. - June 29th, 2007
- Dinis Cruz, Director of Advanced Technology, Ounce Labs - June 11th, 2007
- Brian Chess, Chief Scientist for Fortify Software - June 9th, 2007
- Caleb Sima, CTO for SPI Dynamics - Updated May 29th, 2007
- An Interview with David Hoelzer, author of DAD, a log aggregator - May 1st, 2007
Gene Kim, Tripwire
Stephen Northcutt - March 14th, 2008
Gene Kim is one of the original authors of Tripwire, a software product used to manage configurations and change. Gene is willing to share his thoughts on virtualization with the Security Laboratory thought leadership series, and we certainly thank him for his time!
Gene, let's start with change, how important is change management?
Stephen, every day, as information security practitioners, we live with the reality that we are a single change away from a security breach that could result in front-page news, brand damage, or regulatory fines. These issues are clearly not confined to security; they impact business at the highest level. Consequently, security practitioners strive to implement IT controls to mitigate the risk of fraud, loss of confidential customer information, disruption of critical business services, loss of data integrity, inaccurate financial reporting, and the list goes on.
Change also creates risk from an operational perspective - every IT organization lives with the daily reality that they’re always one change away from an outage, a catastrophic episode of unplanned work, or something that causes audit or security issues - all of which jeopardize the completion of planned work, which is what they’re supposed to be working on.
But certainly we can't stop change, Gene.
So true - in fact, it seems like it’s just part of the human condition: change happens, and the pace always seems to get faster and faster.
This has some serious security implications, though. The need to respond quickly to urgent business needs makes it more and more difficult to effectively balance risk and controls. Most business functions now require IT in order to conduct operations. In fact, almost every business decision requires at least one change by IT - a trend that continues to grow.
So how does virtualization factor into this discussion? It is clearly hot, according to an article on SecurityFocus, "Intel and AMD are building support for virtualization into their CPUs to make the technology easier to implement and faster to run." What is the driver for it, being green?
That’s definitely part of the reason, Stephen. Other reasons are the need for increased agility and the ever increasing cost and complexity of IT. All of these have contributed to the rapid adoption of virtualization technologies.
Virtualization makes it possible to build and deploy IT releases and changes into production faster and more economically than ever before.
So it is a brave new world; what does it mean for security? Is virtualization going to be an agent for or against security? An InfoWorld article by Tom Yager points out, "Multiple virtual machines sharing one physical system are likely to use a sequential range of IP addresses, and they often have identical local administrator passwords. Crack one, and you’ve cracked all servers with similar characteristics." What is your sense on the security of virtual machines?
Some virtualization experts claim that virtualized computing environments are fundamentally no less secure than physical computing environments. Others claim that virtualization can enable better security. Both of these claims can be correct, but only under certain conditions.
The reality is that when information security controls are improperly implemented or neglected in virtualized environments, real security risks and exposures are created faster than ever.
So this is one of those Age of Speed situations?
Haha. I think the fear is that virtualization can create a scenario of Unsafe at Any Speed: The Designed-In Dangers of the American Automobile by Ralph Nader. Of course, I’m exaggerating to make a point. Virtualization can be very secure or very insecure. But what is definitely true is that what was safe at 60 miles per hour may not be safe at 200 miles per hour, which is the faster pace that virtualization enables. And this is the potential dark side of virtualization: the information security controls that adequately controlled risks before virtualization may no longer suffice.
Virtualization enables rapid deployment, potentially allowing insecure IT infrastructure to be deployed throughout the organization faster than ever. The unfortunate truth is that the people who deploy this infrastructure often circumvent existing security and compliance controls when doing so. Too often, the risk these deployments introduce is only discovered when a security breach occurs, an audit finding is made, or the organization loses confidential data or critical functionality.
How popular is virtualization, how many organizations are already using it?
For better or for worse, virtualization is here. Tripwire surveyed 219 IT organizations and found that 85 percent were already using virtualization, with half of the remaining organizations planning to use virtualization in the near future. Furthermore, VMware found that 85 percent of their customers are using virtualization for mission-critical production services. In other words, inadequate information security controls may already be jeopardizing critical IT services with risk introduced by virtualization.
There seem to be two keys to information assurance, to configure systems properly in the first place and to detect anomalous traffic. How important is configuration in the virtual world?
Most information security practitioners now attribute the majority of security failures to misconfiguration resulting from human error. According to Gartner, "the security issues related to vulnerability and configuration management get worse, not better, when virtualized." Also, according to Gartner, "Like their physical counterparts, most security vulnerabilities will be introduced through misconfiguration and mismanagement."
Why? Among other reasons, insecure virtual server images can be replicated far more easily than before, and once deployed, require great effort to discover and bring back to a known and trusted state. Analysts have published some startling predictions on these information security implications: Gartner predicts that "Through 2009, 60 percent of production VMs will be less secure than their physical counterparts" and that "30 percent of deployments [will be associated] with a VM-related security incident."
The good news is that it doesn’t have to be this way.
Where do people make their big mistakes?
The security risks occur primarily at two levels: at the virtual machine manager (VMM) layer where the host OS resides, and at the virtual machine instance layer where the guest OSes reside. Misconfigurations in either layer can leave security risks uncontrolled and unmitigated.
The fact is that when done manually, setting configurations properly is tedious, slow and error-prone. As information security practitioners, our goal should be to ensure that all configuration settings at the VMM and guest OS layers are properly defined, implemented and verified. There’s already lots of great guidance on how to do this from respected third parties and vendors, including the Center for Internet Security, VMware, and others.
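To make the idea of verifying guest-layer settings concrete, here is a minimal sketch of an automated configuration audit. It checks a VMware .vmx file against a tiny baseline of guest-isolation settings of the kind that hardening guides recommend; the specific keys and the sample file are illustrative examples, not a complete benchmark.

```python
# Minimal sketch: audit a VMware .vmx file against a small hardening
# baseline. The three settings below are illustrative examples of the
# guest-isolation options hardening guidance covers; a real audit
# would use the full vendor or CIS benchmark.

EXPECTED = {
    "isolation.tools.copy.disable": "TRUE",   # block guest<->host copy
    "isolation.tools.paste.disable": "TRUE",  # block guest<->host paste
    "isolation.tools.diskShrink.disable": "TRUE",
}

def parse_vmx(text):
    """Parse key = "value" lines of a .vmx file into a dict."""
    settings = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            settings[key.strip()] = value.strip().strip('"')
    return settings

def audit(settings):
    """Return (key, found, expected) tuples for every mismatch."""
    findings = []
    for key, expected in EXPECTED.items():
        found = settings.get(key, "<unset>")
        if found.upper() != expected:
            findings.append((key, found, expected))
    return findings

sample = '''
displayName = "web01"
isolation.tools.copy.disable = "TRUE"
isolation.tools.paste.disable = "FALSE"
'''

for key, found, expected in audit(parse_vmx(sample)):
    print(f"MISCONFIGURED {key}: found {found}, expected {expected}")
```

Run against the sample above, the audit flags the paste setting that is explicitly wrong and the disk-shrink setting that was never defined - the same two failure modes (bad value, missing value) a full benchmark scan looks for.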
And of course, because we are appropriately paranoid, we must "trust, but verify." This is where you need automated tools to help achieve and maintain known and trusted states, so you can find variance and quickly fix it. Information security will own parts of these settings, but where they don’t, they need to hold the relevant parties accountable for ensuring that their portions of the infrastructure are locked down (e.g., VMM manager, servers, networks, databases, applications, etc.).
Incidentally, information security can’t do any of this if they’re not aware that virtualization is being used. This requires some situational awareness, so some sleuthing around may be required to even find out where virtualization is being used, and by whom.
Thanks for taking the time to share with us and contributing to the thought leadership series on the Security Laboratory, Gene, and congratulations on the birth of your first son, Reid. He looks just like you, though a tad smaller! I can't wait to see how long it takes until you give him his first composition book!
5. Gartner, Inc. Security Considerations and Best Practices for Securing Virtual Machines by Neil MacDonald, March 2007.
6. Same as above.