Friday, March 4, 2016

Differing Cybersecurity Philosophies: Culture versus Control

Originally published at TheCybersecurityPlace.com


In my travels, interacting with security professionals across different organizations and operating environments, I have noticed two “camps” when it comes to improving security posture. Both are centered on the premise that humans are an organization’s biggest vulnerability.

The first camp (for the purposes of this article, we can call them the “nurturers”) insists that the only real solution to the human problem is to increase user understanding of threats, risks, best practices, and policies. The idea is that educated users will come to care enough about their organization’s cybersecurity program to make better security decisions.

On the other side of the coin, the folks we will affectionately call the “engineers” champion the stance that the best way to address the human problem is to develop and implement technical controls that reduce the likelihood of users compromising organizational information or capabilities. These are the cybersecurity professionals who place the majority of their focus on mandating and measuring compliance requirements. After all, isn’t the best solution one where nothing happens when a user clicks on a link or opens an email attachment?

When technical controls are gradually piled onto a system, it becomes more complex, and the more complex the system becomes, the less resilient it is to unexpected incidents. When a software program, system, or network is developed, it is purpose-made, with just the parts and functions necessary for it to operate. As more technical security controls are implemented, more moving parts must work together for the same functionality to be achieved. Just as in machines of the past, the more moving parts a gadget has, the more opportunities it has to malfunction. The same goes for information systems and their associated network assets.

That isn’t to say that user training and cybersecurity culture efforts don’t need serious improvement of their own. A standard annual security awareness training course is not enough to have a lasting effect on the organization’s culture. A multi-pronged approach to culture improvement might include that awareness training, specialized classes on how to protect home systems and family information, strong and frequent management support of the cybersecurity program, and creative ways to get users talking about security. One idea is a simple security question that pops up as soon as a user logs in at the beginning of the work day; a rough sketch of such a prompt follows. Even if users somehow cheat to get the answer, it can create a topic of discussion in the office for at least a few minutes each day.
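
For illustration, here is a rough Python sketch of that daily log-in prompt. The script name, the questions file, and its JSON format are assumptions made for this example, not a prescription; a real deployment would hook into the organization’s log-in process rather than run as a standalone script.

# daily_security_question.py -- illustrative sketch of a log-in security question
# Assumes a file "security_questions.json" containing a list of
# objects like {"question": "...", "answer": "..."} (hypothetical format).

import datetime
import json

QUESTION_FILE = "security_questions.json"

def pick_daily_question(questions):
    # Seed the choice with today's date so everyone sees the same question
    # on the same day, which is what sparks the hallway conversation.
    day = datetime.date.today().toordinal()
    return questions[day % len(questions)]

def main():
    with open(QUESTION_FILE) as handle:
        questions = json.load(handle)

    chosen = pick_daily_question(questions)
    response = input("Security question of the day: " + chosen["question"] + "\nYour answer: ")

    if response.strip().lower() == chosen["answer"].strip().lower():
        print("Correct -- thanks for thinking about security today.")
    else:
        print("Not quite. Today's answer: " + chosen["answer"])

if __name__ == "__main__":
    main()

The date-based pick is the design point: because the whole office gets the same question each day, the prompt becomes a shared topic of conversation rather than a private quiz.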

So now that the different philosophies have been identified and analyzed, the question becomes, “which is right?” The answer is probably just a reluctant “yes”. Technical controls do a fairly good job of protecting against known threats and vulnerabilities, but they cannot guard against unexpected incidents such as zero-day exploits, social engineering, and malicious acts committed by authorized users. It often comes down to how much trust is placed in educated users versus in the technology itself. It is going to take a serious focus on a combined approach to get the best of both worlds.

Steve P. Higdon has been working in the information security field for over ten years, providing support and consultancy to several public and private sector organizations. Steve holds several industry certifications and can be reached via email at infosec@stephenhigdon.com and on Twitter at @SteveHigdon.