Christina Camilleri is a security analyst at Bishop Fox, a security consulting firm providing IT security services to the Fortune 500, global financial institutions, and high-tech start-ups. Although she works on the “right” side of the security business, she is not only interested but also highly skilled in penetration testing and red teaming – assuming the role of a hacker in security exercises.
Social engineering is one of Camilleri’s stock weapons. It is, she says, “a blend of science, technology and art that taps into basic human emotions”.
By studying body language and other cues, Camilleri says it’s possible to manipulate people. People make judgement calls, and those judgements can be swayed so that they instruct computers to carry out actions the computers were never meant to perform.
This creates a situation where even the best technology can be thwarted if a person can be convinced to do something unanticipated by the system designer.
“When you think of this from a risk standpoint we’re at a point when we’re doing a good job of intercepting the vulnerabilities. Similarly, humans are trained and audited. But when we put that all together we’re missing one important point. We’re putting together something that can make judgement calls and has emotions with something that can only follow instructions and has no emotions”.
The consequence, said Camilleri, is that we can coerce the people in front of the computer to do things for us.
“The problem is we’re focussing on the technical and social risks separately,” said Camilleri. “We should be assessing them together”.
To illustrate the point, Camilleri described a recent engagement with a client in the financial services sector. The test began with a phishing attack targeting many users. The premise of the email was that employees were being moved to a new health insurance provider and they needed to log into a portal to update their details. Camilleri’s team knew the company was changing insurer so the timing was appropriate for this attack.
As a result, Camilleri could access a single user’s account. From here, she could send emails from that user’s account. By this time, the phishing attack had been identified and users were warned. So, using the hacked account, Camilleri sent an email to staff, asking them to download a “patch” to protect against the phishing attack. This gave access to several other machines.
Without running any special scanning tools, Camilleri then identified WebSphere MQ on a fileshare. After accessing that, she found her way to several user acceptance testing environments that were connected to some Jenkins servers, which gave her a gateway into production systems.
From a single compromised system, Camilleri eventually had access to a massive swathe of corporate data. In several instances, she found there was no authentication in place and it was trivially easy to move from system to system.
The problem, she said, was the assumption that all users inside the network can be trusted. Security, she said, needs to be enforced at internal as well as external boundaries.
“We need to take a proactive approach. We shouldn’t wait for something bad to happen to care about security”.
In a second client engagement Camilleri described, she called the target’s helpdesk, posing as a computer user who had forgotten a password. After making life very hard for the support engineer, Camilleri eventually convinced the technician not only to reset her password manually and tell her the new password, but also to provide her with a bypass of the company’s VPN solution for remote access.
Needless to say, Camilleri and her team were able to use this to compromise systems and gain unauthorised access to data.
In this case, an overly helpful help desk operative was coerced not only into handing over a password but also into bypassing critical security controls.
The lesson here is to not give humans, who can make poor judgement calls, the ability to bypass controls.