Do you create stupid users?

Most security awareness failings are actually failings of security programs.

A week doesn't go by without reading about some attack precipitated by bad user actions. On a superficial level, it appears to be yet another case of a stupid user doing something they have been told time and time again not to do. Clicking on phishing messages is one example.

However, when I hear the stories, I always ask myself what could have been done differently on the part of the organization. Why was the user put in that situation in the first place? Was the awareness training good enough? When the user took the action, could the system have warned them? Could the systems have stopped the resulting damage? Were there earlier phases of the attack that could have warned the organization that the user might be targeted? In short, there are a lot of questions.


The safety profession has been asking similar questions for years, and it has achieved incredible success in reducing loss. This process saves many organizations millions of dollars a year, and in large organizations possibly hundreds of millions of dollars a year. One major advantage the safety profession has over the security awareness profession is that its losses are much more obvious and easier to calculate. After all, when a person is injured or killed in a workplace accident, there are clear costs associated with insurance claims, workers' compensation claims and so on.

Good safety professionals approach safety from two angles: environmental issues and awareness. Awareness in safety is very much like awareness in information security. You provide information to try to improve behaviors, or you implement reward systems to encourage continued good behaviors. That is the subject of other articles. I will, however, say at this point that awareness is left to resolve only the final 10% of all safety-related losses.

That means that addressing environmental concerns can prevent 90% of safety-related losses. By physically rearranging the environment and removing conditions that can lead to injuries, accidents are prevented. Some of these measures involve removing or accounting for blatant safety hazards. For example, safety goggles and hard hats account for flying or falling debris. Painting lines on floors to define safe walking and transit areas keeps people from getting too close to moving vehicles or walking into objects. Painting tool outlines where tools are stored not only ensures that tools are not lost, but also ensures that they are placed so they are less likely to fall or cause injury when someone reaches for them.


Clearly, some efforts are more involved than others, such as those that evaluate and improve a complex factory. Frequently, the work involves sifting through potentially thousands of injury reports to determine the common sources of injuries and devise strategies to prevent them.

Assuming you take care of the environmental issues, the remaining injuries tend to result from people either failing to follow policies and procedures or being careless or inattentive. As you might expect, those are generally the same reasons behind many information security awareness failings.

For security awareness practitioners, addressing policy violations and carelessness or inattentiveness is the subject of other articles. For now, though, it must be said that awareness practitioners should consider addressing environmental concerns as well.

Admittedly, environmental concerns are traditionally the responsibility of the security practitioners who implement technology. That being said, a good awareness practitioner should work with their more technical peers to recommend and prioritize the countermeasures that will have the greatest impact on preventing users from taking actions that put the organization at risk.

For example, since it is common for users to misplace laptop computers, laptops should carry labels that specify how to return them. Their hard drives should be encrypted. Software that allows the laptop to be located remotely should be preinstalled. And there should be basic policies that limit the removal of laptops from facilities.
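To make the encryption point concrete, here is a minimal sketch of the kind of compliance check a technical team could run against laptops to flag unencrypted disks. The commands it calls (manage-bde on Windows, fdesetup on macOS, lsblk on Linux) are real tools, but the parsing is deliberately simplified and the overall script is an illustrative assumption, not a production fleet-management solution.

```python
"""Best-effort check of whether the local machine's disk is encrypted."""
import platform
import subprocess


def disk_encryption_enabled() -> bool:
    """Return True if full-disk encryption appears to be enabled."""
    system = platform.system()
    try:
        if system == "Windows":
            # BitLocker status for the system drive; "Protection On"
            # appears in the output when the volume is protected.
            out = subprocess.run(["manage-bde", "-status", "C:"],
                                 capture_output=True, text=True, check=True).stdout
            return "Protection On" in out
        if system == "Darwin":
            # FileVault status on macOS.
            out = subprocess.run(["fdesetup", "status"],
                                 capture_output=True, text=True, check=True).stdout
            return "FileVault is On" in out
        if system == "Linux":
            # Look for any dm-crypt (LUKS) mapped device.
            out = subprocess.run(["lsblk", "-o", "TYPE"],
                                 capture_output=True, text=True, check=True).stdout
            return "crypt" in out
    except (OSError, subprocess.CalledProcessError):
        pass
    return False


if __name__ == "__main__":
    status = "encrypted" if disk_encryption_enabled() else "NOT encrypted"
    print(f"This machine's disk appears to be {status}.")
```

Run centrally or at login, a check like this turns "the hard drive should be encrypted" from an awareness message into an environmental control that does not depend on the user remembering anything.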

Likewise, concerning password security, token authentication would significantly reduce the possibility of password compromise through social engineering, easily guessed passwords, written-down passwords and so on. The system should also prevent easily guessable passwords from existing in the first place. And there should be processes in place that ensure managers, guards and other people check whether passwords have been written down and left available for compromise.
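As an illustration of preventing guessable passwords at the system level, here is a minimal sketch of a server-side check that rejects weak choices before they are ever stored. The banned-password list and the 12-character minimum are assumptions for illustration; a real deployment would screen against a large breached-password corpus and pair the check with multi-factor authentication.

```python
"""Sketch of a server-side password acceptability check."""

# A tiny stand-in for a breached/common-password list.
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein", "welcome1"}


def password_is_acceptable(password: str, username: str) -> tuple[bool, str]:
    """Return (accepted, reason). Rejects short, common, or user-derived passwords."""
    lowered = password.lower()
    if len(password) < 12:
        return False, "too short (minimum 12 characters)"
    if lowered in COMMON_PASSWORDS:
        return False, "appears on the common-password list"
    if username.lower() in lowered:
        return False, "contains the username"
    return True, "ok"


if __name__ == "__main__":
    for candidate in ("Welcome1", "summer2024!", "correct horse battery staple"):
        accepted, reason = password_is_acceptable(candidate, username="jsmith")
        print(f"{candidate!r}: {'accepted' if accepted else 'rejected'} ({reason})")
```

The point is the same as painting lines on a factory floor: the environment itself refuses the unsafe choice, so awareness only has to cover what the controls cannot.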

There is a saying from the great American philosopher Peggy Bundy of the TV show Married With Children: "If you give a monkey a gun, and the monkey shoots someone, do you blame the monkey or the person who gave the monkey the gun?" While it is not my intent to equate users to monkeys, when security professionals see a user make a mistake, before they label the person a "stupid user," they should first look to see whether they proverbially gave the user the gun.

So as you go to create or improve your security awareness program, consider how to first remove the opportunities for a user to have an awareness-related failing. While it might not seem like it is your job, it will make your job 90% easier.

Ira Winkler can be contacted at www.securementem.com.
