Policy scope creep and the proliferation of distributed network and application elements have fuelled confused security policies that have left businesses unable to limit the lateral movement of malware within their networks, a security expert has warned while highlighting the benefits of virtual microsegmentation in tightening security controls.
For most companies, said VMware vice president of security products Tom Corn during a recent speaking visit to Australia, a lingering focus on perimeter protection had created blind spots that left corporate data ripe for the picking by any outside intruder who managed to slip past those external defences.
Firewalls were valuable in blocking that infiltration but years of progressively more complicated firewall policies – tailored to accommodate changing application usage patterns – had left many companies with tens of thousands of firewall rules and no way to know which were redundant or even still relevant.
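The redundancy problem Corn alludes to can be made concrete. In a first-match rule list, a rule is shadowed – and therefore dead weight – when an earlier, broader rule already covers every flow it matches. A minimal sketch of that check, in Python (the rule list and `shadowed` helper are invented for illustration, not any vendor's firewall API):

```python
# Illustrative sketch (not a real firewall API): finding rules shadowed
# by earlier, broader rules in a first-match rule list.
from ipaddress import ip_network

# Each rule: (source CIDR, destination CIDR, action); first match wins.
rules = [
    ("10.0.0.0/8",     "0.0.0.0/0",  "allow"),
    ("10.1.2.0/24",    "0.0.0.0/0",  "allow"),  # shadowed by the /8 above
    ("192.168.0.0/16", "10.0.0.0/8", "deny"),
]

def shadowed(rules):
    """Return indices of rules whose traffic is fully covered by an earlier rule."""
    out = []
    for i, (src, dst, _) in enumerate(rules):
        s, d = ip_network(src), ip_network(dst)
        for prev_src, prev_dst, _ in rules[:i]:
            if s.subnet_of(ip_network(prev_src)) and d.subnet_of(ip_network(prev_dst)):
                out.append(i)
                break
    return out

print(shadowed(rules))  # the /24 rule at index 1 never matches anything
```

Real rule sets also compare ports, protocols, and actions, which is precisely why nobody dares delete anything from a 30,000-rule table by hand.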
If an administrator started from scratch today and asked “what controls should I put in place to stop traffic going into and out of the data centre?”, Corn said, “I bet that firewall has 30 rules and not 30,000. But when was the last time your organisation removed a rule in the firewall? It's like Jenga.” This practice – particularly as applications had migrated from being monolithic internal entities to becoming widely distributed, componentised concerns that span network boundaries – had perpetuated network insecurity and left cloud migrations exposed by historical network-configuration artefacts. “The majority of our investment is still going to prevent infiltration,” Corn explained, “but there are precious few resources going to how we address issues of exfiltration and compromise. There are 1000 ways to get inside an organisation, but it's extraordinarily difficult to place security controls inside an environment – and to architecturalise an environment.”
Decentralisation of contemporary applications had exacerbated the problem: “An application is a distributed service with web, database, and other component servers,” Corn said, “and you have thousands of these all commingled on a common generic infrastructure. This has driven relatively flat networks.” Methods for securing those networks “have nothing to do with networks or endpoints,” Corn said.
“They have to do with people and data. But we don't have handles inside our environment that allow us to segment our environment based on the functionality of our servers. [In observing previous breaches at his employers] it really struck me how little compartmentalisation there is in enterprise environments.”
Architecturalisation was crucial to contain the lateral movement typically used by malware to navigate corporate networks – hopping from one server to another by piggybacking on the access credentials of targeted network users. Given how ubiquitous this technique had become in highly distributed application environments, Corn said, enterprises needed to consider how they could regain control of their application environments by using virtualisation to build highly segmented and controlled application elements.
That 'microsegmentation' model would place application components in discrete, virtualised elements coordinated by a hypervisor capable of monitoring traffic and enforcing access rules from outside the guest operating system – beyond the reach of malware that has compromised it.
This approach “provides a separate trust domain to be able to influence security in ways we couldn't do from inside,” Corn said. “It is not at the guest layer, but underneath it. And no virtual machine can talk with any other virtual machine without passing through this domain.” Microsegmentation enables tight control over access between applications and services that are otherwise left open to the depredations of infective malware, Corn said.
By limiting inter-segment access by default, administrators can limit interactions between those segments to a specific set of acceptable activities – transforming today's relatively flat network and application architectures into tightly controlled architectures.
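The default-deny model Corn describes can be sketched in a few lines of Python. The segment names and policy table below are invented for illustration – this is not VMware's API – but they capture the whitelist logic: a flow between segments passes only if it is explicitly permitted.

```python
# Hypothetical sketch of a default-deny microsegmentation policy check.
# Segment names and the ALLOWED_FLOWS table are illustrative only.

ALLOWED_FLOWS = {
    # (source segment, destination segment): permitted services
    ("web", "app"): {"https"},
    ("app", "db"):  {"postgres"},
}

def is_allowed(src_segment: str, dst_segment: str, service: str) -> bool:
    """Default-deny: a flow passes only if explicitly whitelisted."""
    return service in ALLOWED_FLOWS.get((src_segment, dst_segment), set())

# The web tier may reach the app tier over HTTPS...
assert is_allowed("web", "app", "https")
# ...but lateral movement straight from web tier to database is denied,
# because that pair simply has no entry in the whitelist.
assert not is_allowed("web", "db", "postgres")
```

The design point is that the deny case needs no rule at all: anything not named in the table is blocked, which is the inverse of the accumulate-and-never-delete firewall tables described earlier.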
Paired with the introduction of service-based encryption to manage inter-segment traffic, microsegmented network architectures can also protect against compromises such as network sniffing, which is favoured by many types of malware as a way of divining information about the environments in which they have embedded themselves.
“The laws of gravity are very different in a virtual world,” Corn said. “By using microsegmentation around applications, I can create a network of least privilege. The number of services that can cross the boundary is finite, so I have a much smaller attack surface – and I can hang my security defence on those boundaries.” “What's exciting about this is that the conversation ceases to become 'how do we secure virtualisation?',” he added. “That's important, but it's not profound. What's profound is asking 'how do we use virtualisation to secure the things that really matter?'”