AusCERT 2013: Interview with Dr Lizzie Coles-Kemp

Dr Lizzie Coles-Kemp is a senior lecturer in the Information Security Group, Royal Holloway University of London. She is keenly interested in how social behaviours influence our attitudes to security. For example, in communities where Internet accounts need to be shared between family members, the security professional's assumption that one account and password identifies one person is undermined. CSO spoke to Dr Coles-Kemp about the nexus between social behaviours and information security.

CSO

The IT industry is infested with jargon. On the other side, sociology and the social sciences have their own jargon. How do we define security in a way that crosses the two, and how do the social “wetware” people think of security? How do we bridge the gap between those two perceptions?

Dr Coles-Kemp

I get the point about jargon. Different tribes and disciplines have their own language, and it's not the language that goes into common parlance. One of the foundational differences is that we [the IT industry] think about data flows and data objects – that's how we're trained to think in information security.

When you work with people going about their everyday lives, the key thing is their relationships. The relationships they have with institutions, the relationships they have with their families, the relationships they have with friends.

If I want to talk about information, that doesn't work so well. If I talk about the relationships they have, how they build up relationship networks, build up their social networks – then that becomes more resonant with where they're going.

CSO

Is it that the IT industry misunderstands – that when people talk about people in “people terms”, the IT industry takes those words and interprets them in IT terms?

Dr Coles-Kemp

I don't think it is that – I think it's just the fact that they think in terms of tools and technologies, and those tools and technologies conceptualise people as blobs of data. They conceptualise people as information streams.

People in the IT industry are people too! They have those relationships. It's just that when we think about tools, we think about security, we think about it around data. It's just a different standpoint, when you think about the relationship perspective.

I'm one of the principal investigators on a project that's part of the UK Cyber Security Research Institute – I am leading a project called Cyber Security Cartographies, and we're using visual research methods to build up relationship maps for security managers, to see how they build relationships, influence people and influence controls.

Fundamental to security management is how you engage with people.

So I think it's more that the tools and the technologies are built around point-to-point, host communications, node-to-node communication, network-to-network communication.

If you swing it around to a more socially-centred side – look at people, how they relate and engage with other people, in the workplace and in society – it's different. Our information flows are encased within those social constructs, not the data constructs.

CSO

And our information flows break the “people” model – because a person takes a datum out and flings it “over there”. We're this much more holistic thing, a really messy lump to work with, and that is difficult to express as data.

Dr Coles-Kemp

It was described this morning in terms of “trade-craft” (by Dmitri Alperovitch). That's a really nice point – when we can build attack trees, build those kinds of maps – security starts to become fixed.

But people make choices: how do they feel when they make choices about security? How do they feel when they walk into the building in the morning, or when they engage with the tax office, or with social services?

How they feel at a particular point will mean they deploy certain aspects of their trade-craft.

CSO

What Dmitri was saying is that a person's behaviour, even as a black-hat, is individual. He will attack through this vulnerability because that's the easy point he knows how to approach, and having done so, he will take his own individual steps.

How difficult is it to get such rubbery concepts into our definition of security?

Dr Coles-Kemp

It also comes down to our individual language. That's one of the reasons that we work with artists [in Dr Coles-Kemp's work to engage with the community over cyber-security]. I'm working with a number of artists back in the UK. What we do is look at “communities of practice”. That's an established way of understanding how people engage their trade-craft when they have their communication …

CSO

I hate to interrupt – “community of practice”, you said. Let's define that.

Dr Coles-Kemp

It's about how, in your particular social setting, you share and exchange, and build relationships. The information you share, your information practices – that will be influenced by the relationships you have with people, the environment you're in, and all those routines and cultures.

CSO

Thinking of “information security” rather than just IT security: groups within companies are more likely to entrust each other with information, regardless of policy. I know I can trust this person, so I will tell her this, because I trust her.

Dr Coles-Kemp

Exactly. I used to work as an auditor, and we would say that different departments and different sub-cultures would have different sets of information practices. And you would start to see those, and start to visualise those patterns of practice.

It's not foolproof, of course, because people have their individual ways.

CSO

As you said in your speech [to AusCERT], in some communities, a user ID and password doesn't identify a person, because people do share those sorts of things and they become useless. Not to be pessimistic or hostile, but in a business, you can see ways in which those patterns of communities can indicate where information might leak in spite of our best security.

Dr Coles-Kemp

Not all divergences from policy are bad. This is part of the cyber-security project – understanding where you have spheres of influence, how you influence behaviours, and how we can identify where those sub-cultures are.

Because also, if I'm an attacker, I know my adversary quite well. If I'm a good attacker, I know when they're not motivated, and what workarounds they're likely to implement.

That gives me the gap, something I can lever open.

And in our work with the community, we see individuals deploying “trade-craft” in how they deal with institutions. When we talk to people out in the community, when we hear how they think about social services, benefits or welfare – they are making extremely astute, smart assessments of what the person is doing, and how they're going to process what they're saying.

CSO

They know what they're dealing with. They've got 10,000 hours of experience dealing with these people.

Dr Coles-Kemp

And if they don't play it right, they struggle. Understanding those communities of practice – how they evolve, how you influence them – it's a word-of-mouth network. How you get that working, how you change practices, is something that good security managers learn how to do.

So the question is how we can augment that, from an organisation's perspective.

CSO

The social matrix is so much more complex than the tech – are there ways in which that social understanding can tell us how to get the technology right?

Dr Coles-Kemp

I think so, yes. I don't think it's one or the other.

In my work in the Information Security Group, I'm the only socially-oriented researcher there. The rest are all mathematicians and scientists. And I don't think it's an either-or conversation – I think the more we understand about what people do and what their issues are, the more we can think about where the technology comes in.

For example, what do delegation rights look like in a complex family situation?

It's a long time since we last had the coercion discussion: what does coercion look like? We've thought about it in banking, and we thought about it in the Second World War, where telegraph operators would indicate with a code that they were being coerced.

But we've done very little around coercion, and I think there's definitely some mileage in looking at that.

Another thing that would be useful, from a social perspective – I might think that an act is coercive today, but I might not have thought it was coercive yesterday or the day before. Because the events of today will re-cast how I think about the past.

So what I think also could be interesting – technologies that enable us to model long events with different sentiments and different perspectives. They would be useful tools.

