Security and vulnerability assessment: 4 common mistakes

Uncovering problems and fixing gaps can go awry with these blunders. Here are examples of where vulnerability assessments go wrong.

If you're running a robust security program, you're regularly conducting security and vulnerability assessments of both your network and physical environments. But in the quest to uncover security gaps and vulnerabilities, teams often make slip-ups that blunt the impact of these efforts.

At this month's CSO40 Security Confab and Awards event in Atlanta, attendees heard from two veteran security experts about best practices for vulnerability assessment.

Roger Johnston is the leader of the Vulnerability Assessment Team at Argonne National Laboratory. He and his team are often charged with finding vulnerabilities in physical security systems. Jerry Walters is Director of Information Security at OhioHealth, a regional not-for-profit hospital network headquartered in Columbus, OH. Walters and his team are responsible for the organization's overall information security program, including risk management, vulnerability management, incident response, governance and compliance.

Johnston and Walters approach vulnerability assessment from different angles, but both point to these four common mistakes that security teams make in the assessment process.

Lack of vision

When a team sets out to create a plan for vulnerability testing, no idea, even the most far-fetched, should be off the table, said Johnston.

"I think a big mistake people make is shutting down ideas too early," he said.

That means during brainstorming and planning sessions, even the wildest, most far-fetched scenarios should be considered.

Johnston said he's observed that creativity seems stifled by the presence of a manager in the room and the perception that security is too serious to float wild ideas for testing.

That's a mistake.

"The best ideas come late," said Johnston. "You're doing yourself a disservice if you shut down ideas too early."

Johnston also encourages all security practitioners to "think like the bad guys" if they want to really get at the most serious problems.

Letting compliance get in the way

As a security manager in the health care industry, Walters does work that is intricately tied to HIPAA.

"HIPAA is very non prescriptive. With HIPPAA the intent is go and do good. It's left open to interpretation."

As a result, Walters said, there is a lot of speculation in the health care industry about HIPAA, as well as attempts to put more definition around how to apply it.

Johnston noted that compliance laws often do more damage than good. He believes security teams need to give a certain amount of pushback to be effective in vulnerability assessments. At least 30 percent of compliance requirements are bad news, he said.

"For example, there are requirements that guards have to go to their stations at set times during the day -- therefore making it completely predictable when they will be there."

This is the kind of requirement Johnston thinks a team should push back on -- because it only sets the organization up for more vulnerability, rather than less.

"As a security professional you have two jobs: compliance and security," said Johnston. "Sometimes they overlap. You have to do what you can to make the overlap. A compliance auditor might be suspicious. If they are, push back. On the other hand, some parts of compliance are worthwhile. Take what you can from the good parts of compliance and run with it. Go above and beyond in the parts you agree with."

Bad reporting

Walters said after many assessments, he's had outside consultancies simply "drop off a three-ring binder full of problems and leave."

This is a perfect example of bad, ineffective reporting.

"We want people to shake the trees," said Walters. "But if the reporting just focuses on the problems, they are not providing answers."

Johnston thinks mistakes in reporting come when teams are too critical of mistakes they find in assessments.

"You may well find a lot of mistakes being made. That's OK. Security is hard. But you don't have to fire anyone. Instead of finding people to blame, focus on fixing the mistakes. Also, keep in mind that all risk management is ultimately subjective -- even when you're using numbers. I'm not opposed to assigning numbers, but don't go overboard with assigning them."

Failing to bring what you've learned into the corporate culture

You know what vulnerabilities the assessment uncovered, but do the employees in your organization?

Of course, there may be many things you can't disclose to them. But what can you share that brings the issue of security to the forefront for everyone? How can you invest them in being part of the solution to the problems?

"Most regular employees see security as compliance thing," said Johnston. "They don't see it as something relevant to them. We need to motivate regular employees and answer the question of 'What's in it for me?'"

Johnston suggests a conversation that includes not only lessons learned from the vulnerability assessments but also examples of headline-making security incidents at other organizations.

"You're trying to build a culture, not a department," he said. "Security is everybody's job. It sounds cliché, but I don't think that resonates in many organizations."
