
Interpreting Data

Author: Emmanuel Carabott, Security Research Manager, GFI Software

The recent article by Cristian Florian, 'Most vulnerable operating systems and applications in 2014', was a huge hit, generating hundreds of comments, often heated debates and plenty of criticism. His analysis was also picked up by leading tech publications, generating even more chatter. What stood out is that IT professionals are really passionate about their favourite platform. This is a huge positive for the IT world, as it helps a platform evolve beyond the efforts of its official maintainers, and the results benefit everyone.

Cristian was criticised for having interpreted the data to favour one platform over another – which is not the case. The data Cristian referenced came from the U.S. National Vulnerability Database, a government-run repository whose information comes directly from the vendors themselves. Even if a vendor decided to do some clever accounting and fragment vulnerabilities across different subsections to make its product look more appealing, the raw numbers on their own would still be meaningless.

There is no way to determine a platform's security from raw data alone; that data first needs to be put into context, applied to a scenario and then interpreted. Just because the data shows that the average number of deaths per plane crash is 63, whilst the average per car crash is about 0.6, it would be naive to conclude that travelling by car is safer.

According to the NVD, in 2014 the Linux kernel reported 119 vulnerabilities whilst the various Windows platforms averaged 35. But this information alone tells us nothing about which platform is more secure.
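For readers who want to reproduce this kind of count themselves, here is a minimal sketch that queries the NVD's public REST API (version 2.0) and tallies CVEs published in a given year for products matching a CPE prefix. The endpoint, parameters and CPE strings reflect my reading of the API as it exists today – the data Cristian used predates this API version – so treat it as a starting point rather than a definitive query.

```python
# Rough sketch: count published CVEs per product using the NVD REST API v2.0.
# Endpoint, parameters and CPE prefixes are assumptions to verify against the
# current NVD documentation before relying on the numbers.
import requests

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"


def count_cves(virtual_match_string: str, year: int) -> int:
    """Count CVEs published in a given year for products matching a CPE prefix.

    The API limits a single query to a 120-day publication window, so the year
    is split into four roughly 90-day chunks.
    """
    windows = [
        (f"{year}-01-01T00:00:00.000", f"{year}-04-01T00:00:00.000"),
        (f"{year}-04-01T00:00:00.000", f"{year}-07-01T00:00:00.000"),
        (f"{year}-07-01T00:00:00.000", f"{year}-10-01T00:00:00.000"),
        (f"{year}-10-01T00:00:00.000", f"{year + 1}-01-01T00:00:00.000"),
    ]
    total = 0
    for start, end in windows:
        resp = requests.get(NVD_API, params={
            "virtualMatchString": virtual_match_string,
            "pubStartDate": start,
            "pubEndDate": end,
            "resultsPerPage": 1,  # we only need the totalResults counter
        }, timeout=30)
        resp.raise_for_status()
        total += resp.json()["totalResults"]
    return total


if __name__ == "__main__":
    # Hypothetical CPE prefixes; the exact strings registered for a product may differ.
    print("Linux kernel:", count_cves("cpe:2.3:o:linux:linux_kernel", 2014))
    print("Windows 8.1:", count_cves("cpe:2.3:o:microsoft:windows_8.1", 2014))
```

Even with such a script in hand, the totals it prints are still just totals; everything that follows is about why they need context before they mean anything.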

No platform is inherently more secure than any other; security levels change from scenario to scenario. Think of the most insecure operating system, but don't stop there – assume that not a single vulnerability was ever patched. You would think this is a security nightmare, right? Well, not necessarily. Assume you're using this system disconnected from any network, with no one but yourself having physical access to the machine. Also assume you adhere to the strictest policy of never connecting or installing any outside software or hardware on this machine. The supposedly weak operating system (security-wise) is now part of one of the strongest security setups you can have, even if the data shows a large number of vulnerabilities on this machine. This is why context matters.

Another aspect to consider is that Windows is undeniably the most popular desktop operating system, but Linux is the most diverse. Windows runs on a select few architectures whilst Linux runs, well… everywhere – inside your watch, TV, fridge, mobile phone – even inside your Large Hadron Collider, if you're lucky enough to have one. All that different hardware adds complexity, and that drastically increases the chances of vulnerabilities creeping in.

The issue also goes beyond hardware diversity, since Linux and Microsoft employ opposing philosophies when it comes to kernel design. Linux uses a monolithic kernel, while Windows opted for what is essentially a microkernel design (strictly speaking, Windows uses a hybrid kernel, in that it includes some features over and above what is strictly defined as a microkernel; however, it is a lot closer to a microkernel than to any other design). What this means is that Linux has a lot more going on at the kernel level, and thus, naturally, there is more surface area where vulnerabilities can occur.

Linux also tends to go in for a lot more experimental development, where new concepts are tested and, at times, implemented in the operating system itself. Take file systems as an example. Windows supports three different file systems whilst Linux supports a much larger number. If someone is using the standard ext4 file system, does a vulnerability detected in the XFS file system implementation really make their system less secure? Not really.
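If you want a quick sanity check of whether a filesystem-specific vulnerability is even relevant to a given machine, something as simple as the sketch below helps. It is Linux-only and reads /proc/filesystems to see which filesystem drivers the running kernel currently registers; a driver that is absent from the list could still be loadable on demand, so treat the result as a first approximation of exposure rather than a verdict.

```python
# Small Linux-only sketch: check whether a filesystem driver is currently
# registered with the running kernel by reading /proc/filesystems.
from pathlib import Path


def kernel_knows_filesystem(name: str) -> bool:
    """Return True if the running kernel currently registers this filesystem."""
    try:
        # /proc/filesystems lists one filesystem per line, optionally prefixed
        # with "nodev"; splitting on whitespace gives us all the names.
        entries = Path("/proc/filesystems").read_text().split()
    except FileNotFoundError:  # not a Linux system
        return False
    return name in entries


if __name__ == "__main__":
    for fs in ("ext4", "xfs", "btrfs"):
        state = "registered" if kernel_knows_filesystem(fs) else "not registered"
        print(f"{fs}: {state}")
```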

Finally, whilst every installation of a particular Windows version carries the exact same kernel, this is not the case with Linux. This is what makes grouping possible for the Microsoft operating systems but impossible for Linux. If Windows NT kernel version 6.3 has an issue, then every single Windows installation using that kernel will be susceptible. On Linux it is not as straightforward: different distributions ship different kernel configurations, and even the options chosen at install time can result in different configurations. So whilst an issue affecting one particular kernel version on a Microsoft platform will affect everyone running it, this will not necessarily be the case on a Linux machine.
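To see this variability for yourself, the sketch below reads the build configuration of the running kernel. Most distributions ship it as /boot/config-<release>, while some expose it as /proc/config.gz; those locations are the usual conventions rather than guarantees. Running it on two distributions with the "same" kernel version typically shows noticeably different sets of CONFIG_ options.

```python
# Sketch: load the running kernel's build configuration on a Linux machine.
# File locations are the common distribution conventions, not guarantees.
import gzip
import platform
from pathlib import Path


def load_kernel_config() -> dict:
    """Return the CONFIG_* options of the running kernel as a {name: value} dict."""
    release = platform.release()  # e.g. "3.13.0-24-generic"
    candidates = [Path(f"/boot/config-{release}"), Path("/proc/config.gz")]
    for path in candidates:
        if not path.exists():
            continue
        raw = gzip.decompress(path.read_bytes()) if path.suffix == ".gz" else path.read_bytes()
        options = {}
        for line in raw.decode().splitlines():
            # Lines look like "CONFIG_MODULES=y"; comments mark options not set.
            if line.startswith("CONFIG_") and "=" in line:
                name, value = line.split("=", 1)
                options[name] = value
        return options
    raise FileNotFoundError("kernel config not found on this system")


if __name__ == "__main__":
    config = load_kernel_config()
    print(f"{len(config)} options set; CONFIG_MODULES={config.get('CONFIG_MODULES')}")
```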

Perception is also important. Microsoft's operating systems have, over the years, had a none-too-glorious reputation when it came to security, and the company has done a lot to improve it. With each new version, new security improvements are added (such as running as a regular user rather than as administrator). Still, many believe the products are just as insecure as they were years ago.

An individual's perception can get in the way of properly interpreting data. Many still believe Windows is much less secure than Linux, but software isn't static, and no one knows this better than Linux users. They're aware of the stigma their operating system of choice carries for being hard to install. That was certainly true long ago. I still remember the first time I installed Linux: it came on about a dozen floppy disks and it took me days to get it up and running. Nowadays I could get my two-year-old daughter to install Linux, as all she'd have to do is click the Next button a couple of times. Yet I still often hear how Linux is just too complicated for the average user to install. This is when perception becomes 'reality', even though the data may show otherwise. And perception is hard to change.

So which platform is more secure?

Probably the one you're most familiar with, as it's not just about the underlying platform but also about how it's configured and set up. A secure system is not achieved by selecting the solution with the fewest issues; it is achieved by choosing a system you're familiar enough with to know how to secure it properly. A poorly configured Linux server can be at just as much risk as a poorly configured Windows server or a poorly configured OS X server. I believe that security is not about doing what is considered enough but about covering all the bases. One weak link is all it takes to compromise even the most secure and robust operating system.

So is this data completely useless?

Far from it! Such data is meant to stimulate and start a conversation about important issues – trends, for example, and how a specific operating system is trending over time. A larger number of vulnerabilities doesn't automatically mean the situation is getting worse. It could simply point to a more rigorous approach to finding and disclosing security issues, as has been the case in the past few years.

Looking into the 119 Linux kernel vulnerabilities, we can see that they affect a huge array of different systems. This, in turn, tells us that it is very important to install only the components that are truly needed.

A high number of high-severity vulnerabilities shows the need for an aggressive patch management strategy, with patches that must be tested and deployed quickly. In an operating system where the bulk of vulnerabilities carry a Medium or Low severity rating, the strategy can be more relaxed, and one can afford to test for longer before deploying to a live environment.
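As a rough illustration of that idea, the sketch below maps a vulnerability's CVSS base score to a maximum test-and-deploy window. The thresholds follow the familiar CVSS severity bands, but the windows themselves (and the placeholder CVE identifiers) are hypothetical examples of a policy, not figures taken from the article or from any vendor guidance.

```python
# Illustrative sketch of a severity-driven patching policy. The thresholds use
# the common CVSS severity bands; the day counts are hypothetical examples.
from dataclasses import dataclass


@dataclass
class Patch:
    cve_id: str
    cvss_score: float  # CVSS base score, 0.0-10.0


def deployment_window_days(patch: Patch) -> int:
    """Map a patch's CVSS score to a maximum test-and-deploy window (in days)."""
    if patch.cvss_score >= 9.0:   # Critical: deploy almost immediately
        return 2
    if patch.cvss_score >= 7.0:   # High: short, aggressive testing cycle
        return 7
    if patch.cvss_score >= 4.0:   # Medium: normal testing cycle
        return 30
    return 90                     # Low: can wait for the next maintenance window


if __name__ == "__main__":
    # Placeholder identifiers and scores, purely for illustration.
    backlog = [Patch("CVE-XXXX-0001", 9.8), Patch("CVE-XXXX-0002", 5.4)]
    for p in sorted(backlog, key=lambda x: x.cvss_score, reverse=True):
        print(f"{p.cve_id}: deploy within {deployment_window_days(p)} days")
```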

In conclusion, articles such as the one written by Cristian Florian are food for thought. Observations can be made, but there is a reason why no outright conclusions are drawn – the number of variables to consider is simply too high.

On a parting note, I want to explain why I decided to focus on Windows and Linux systems: there is an age-old rivalry between the two. The same concepts can easily be applied to Apple's operating systems and any other platform out there. Every product has its own ecosystem, which in turn has its own set of unique variables to consider when interpreting data.

About the Author

Emmanuel Carabott is a Certified Information Systems Security Professional (CISSP) and has been working in the IT field for the past 18 years. He joined GFI in 1999, where he currently heads the security research team. Emmanuel is also a contributor to the GFI Blog, where he regularly posts articles on various topics of interest to sysadmins and other IT professionals, focusing primarily on information security.