Top cloud security controls you should be using

Human error is one of the top reasons for data breaches in the cloud, as administrators forget to turn on basic security controls. Whether it is Amazon Web Services, Microsoft Azure, or Google Cloud Platform, keep these rules in mind to secure your cloud workloads.

Another day, another data breach because of poorly configured cloud-based systems. The latest incident, in which records for as many as 6 million of Verizon’s United States customers were exposed, is yet another reminder that both the cloud provider and the organization share responsibility for cloud security.

There is a misconception that the cloud service provider is in charge of securing the cloud environment. That is only half the story. Cloud providers such as Amazon, Microsoft and Google take care of security for their physical data centers and the server hardware the virtual machines run on, but leave the individual customer in charge of protecting the virtual machines and applications. Cloud providers offer an array of security services and tools to secure customer workloads, but the administrator has to actually implement the necessary defenses. It doesn’t matter what kind of security defenses the cloud provider has in place if the customers don’t protect their own networks, users and applications.

A third-party service provider that handled Verizon’s back-office and call center operations stored customer call data in an Amazon Web Services (AWS) Simple Storage Service (S3) data store. The data included the names, addresses, phone numbers, and account PIN codes of every Verizon customer who called the call center over the past six months. The data collection was meant to help improve the customer service experience, but because the S3 bucket was incorrectly configured to allow external access, anyone patient enough to work out the web address would have been able to download the information. Scammers who got their hands on the data would be able to pose as any Verizon customer on a call and gain access to customer accounts.

This kind of mistake is distressingly common. Recent research by cloud security company RedLock’s Cloud Security Intelligence (CSI) team found that 40 percent of organizations have inadvertently exposed at least one public cloud service due to misconfiguration.

Misconfiguration is a serious problem

Verizon is just one of many organizations whose data has been exposed on public clouds by mistake. Just a few weeks ago, the personal data of more than three million wrestling fans was exposed online because World Wrestling Entertainment (WWE) had an unencrypted database on an AWS S3 instance with no access control or password protection enabled. In June, the Republican National Committee confirmed that personally identifiable information on 198 million registered United States voters--accounting for approximately 60 percent of voters--had been stored in plaintext on an open Amazon S3 storage server owned by data analytics firm Deep Root Analytics. Defense contractor Booz Allen Hamilton exposed 60,000 files belonging to the Pentagon, including sensitive files tied to a U.S. military project and half a dozen unencrypted security credentials, by storing the files on a public S3 instance.

“The problem is not that the cloud is insecure, but ultimately customers are responsible for securely configuring their networks, applications and data,” said Varun Badhwar, CEO and co-founder of cloud security startup RedLock. “Public cloud infrastructure such as AWS can be highly secure if configured correctly by organizations adopting such services.”

[Related: 10 interview questions for hiring cloud-literate security staff]

Cloud security company Threat Stack analyzed 200 companies using AWS and found that 73 percent had at least one critical security misconfiguration, such as allowing unauthorized parties to directly access data, use the misconfigured object as part of a bigger attack, or take control of the entire environment by logging into the AWS console. These exposures were the result of basic security negligence and non-existent IT policies, not the work of malicious adversaries.

Regardless of who is doing the provisioning--whether that is the IT administrator, developer, engineer, or the security team--too many people do not fully understand how to configure their cloud environments. Organizations can no longer treat the public cloud as any old place to store information; they must incorporate the following security measures to ensure their cloud environments, applications, and data are protected from unauthorized access.

1. Know what you are responsible for

Not all cloud services are the same, and the level of responsibility varies. Software-as-a-service (SaaS) providers make sure their applications are protected and that data is transmitted and stored securely, but that is typically not the case with cloud infrastructure. For example, the organization has complete responsibility over its Amazon Elastic Compute Cloud (EC2), Amazon EBS, and Amazon Virtual Private Cloud (VPC) instances, including configuring the operating system, managing applications, and protecting data.

In contrast, Amazon maintains the operating system and applications for Simple Storage Service (S3), and the organization is responsible for managing its data, access control, and identity policies. Amazon provides the tools for encrypting S3 data, but it is up to the organization to enable that protection as data enters and leaves the service. Check with the provider to understand who is in charge of each cloud security control.

2. Control who has access

RedLock’s CSI team found that 31 percent of databases in the public cloud are open to the Internet, and 93 percent of resources in public cloud environments did not restrict outbound traffic at all. Nine percent of cloud workloads that were neither load balancers nor bastion hosts were accepting traffic from any IP address on any port, which is a terrible idea. Only load balancers and bastion hosts should be exposed to the Internet.

The Verizon data breach happened because the S3 bucket was set to allow external access. This is unfortunately a common mistake: Threat Stack found that 37 percent of organizations in its research had S3 buckets that granted access to everyone. Many administrators also mistakenly enable global permissions on their servers by using 0.0.0.0/0 in the public subnets. The connection is left wide open, giving every machine the ability to connect.

In the case of AWS, S3 buckets should never have a public access policy.
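One way to catch this kind of misconfiguration is to audit bucket permissions programmatically. The following is a minimal sketch using Python and the boto3 SDK; it assumes credentials that can read bucket ACLs, and it checks only ACL grants, not bucket policies.

```python
import boto3

s3 = boto3.client("s3")

# ACL grantees that make a bucket readable or writable by the world.
PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    grants = s3.get_bucket_acl(Bucket=name)["Grants"]
    public = [g["Permission"] for g in grants
              if g.get("Grantee", {}).get("URI") in PUBLIC_GROUPS]
    if public:
        print(f"{name}: grants {public} to the world")
```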

Another common mistake is leaving SSH open, something that 73 percent of organizations did in Threat Stack’s analysis. Threat Stack also found that 13 percent allowed SSH connections directly from the Internet, which meant anyone who could figure out the server location could bypass the firewall and directly access the data.
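A similar audit can be run against security groups. The sketch below, again using boto3, flags any group with an inbound rule that exposes SSH (port 22) to 0.0.0.0/0.

```python
import boto3

ec2 = boto3.client("ec2")

# Flag security groups with an inbound rule that allows SSH from anywhere.
for sg in ec2.describe_security_groups()["SecurityGroups"]:
    for rule in sg.get("IpPermissions", []):
        from_port, to_port = rule.get("FromPort"), rule.get("ToPort")
        covers_ssh = from_port is None or from_port <= 22 <= to_port  # None means all ports
        world = any(r.get("CidrIp") == "0.0.0.0/0" for r in rule.get("IpRanges", []))
        if covers_ssh and world:
            print(f"{sg['GroupId']} ({sg['GroupName']}): SSH open to 0.0.0.0/0")
```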

Major cloud providers all offer identity and access control tools; use them. Know who has access to what data and when. When creating identity and access control policies, grant the minimum set of privileges needed, and grant additional permissions only temporarily, when required. Configure security groups to have the narrowest focus possible, and reference other security groups by ID rather than opening broad IP ranges where possible.
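To make least privilege concrete, here is a hypothetical boto3 sketch that creates an IAM policy allowing nothing more than read access to a single bucket; the bucket and policy names are placeholders.

```python
import json
import boto3

iam = boto3.client("iam")

# Least-privilege policy: read-only access to one (hypothetical) bucket.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:ListBucket", "s3:GetObject"],
        "Resource": [
            "arn:aws:s3:::example-reports-bucket",
            "arn:aws:s3:::example-reports-bucket/*",
        ],
    }],
}

iam.create_policy(
    PolicyName="ReportsReadOnly",  # placeholder name
    PolicyDocument=json.dumps(policy_document),
)
```

Attach a policy like this to a role or group rather than to individual users, and widen it only when a workload demonstrably needs more.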

Amazon VPC lets administrators create logically isolated virtual networks within the AWS cloud and launch servers into them. This is one way to protect the production environment from the development and staging environments and keep data separate.

3. Protect the data

Another common mistake is to leave data unencrypted in the cloud. RedLock’s CSI team found that 82 percent of databases in the public cloud are not encrypted. Voter information and sensitive Pentagon files were exposed because the data was not encrypted and the servers were accessible to unauthorized parties. Storing sensitive data in the cloud without putting appropriate controls in place to prevent access to the server and to protect the data is irresponsible and dangerous.

Where possible, maintain control of the encryption keys. While it is possible to give cloud service providers access to the keys, the bottom line is that responsibility for the data lies with the organization.

“It’s like trusting your home renovator with the keys to your home,” said Mark Hickman, COO at WinMagic. “You expect all will be well, but you can never be 100 percent certain if they’re locking the door or the character of their subcontractors. So why take that risk in giving them access to your keys in the first place?”

Even when cloud providers offer encryption tools and management services, too many companies fail to use them. Encryption is a fail-safe: even if a security configuration fails and the data falls into the hands of an unauthorized party, the data cannot be used.
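On AWS, for example, default encryption can be enabled per bucket with a customer-managed KMS key, so the organization rather than the provider’s default key controls access. The snippet below is a boto3 sketch with placeholder bucket and key names.

```python
import boto3

s3 = boto3.client("s3")

# Encrypt every new object in the bucket by default, using a
# customer-managed KMS key (bucket name and key alias are placeholders).
s3.put_bucket_encryption(
    Bucket="example-call-data-bucket",
    ServerSideEncryptionConfiguration={
        "Rules": [{
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID": "alias/example-data-key",
            }
        }]
    },
)
```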

4. Secure the credentials

As the OneLogin breach showed, it’s not uncommon for AWS access keys to be exposed. They can be leaked through public websites, source code repositories, unprotected Kubernetes dashboards, and other such forums. Treat AWS access keys as the crown jewels, and educate developers to avoid leaking them in public forums.

[Related: Security hurdles in cloud collaboration applications, and how to clear them]

Create unique keys for each external service, and restrict access following the principle of least privilege. Make sure the keys don’t have broad permissions, as in the wrong hands, they can be used to access sensitive resources and data. Create IAM roles to assign specific privileges, such as making API calls.

Make sure to regularly rotate the keys. RedLock found 63 percent of access keys were not rotated in over 90 days. This gives attackers time to intercept compromised keys and infiltrate cloud environments as privileged users.
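A simple audit along these lines lists every active access key that has not been rotated in the last 90 days; this sketch assumes boto3 and IAM read permissions.

```python
from datetime import datetime, timedelta, timezone
import boto3

iam = boto3.client("iam")
cutoff = datetime.now(timezone.utc) - timedelta(days=90)

# Report active access keys older than 90 days for every IAM user.
for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        for key in iam.list_access_keys(UserName=user["UserName"])["AccessKeyMetadata"]:
            if key["Status"] == "Active" and key["CreateDate"] < cutoff:
                print(f"{user['UserName']}: rotate {key['AccessKeyId']} "
                      f"(created {key['CreateDate']:%Y-%m-%d})")
```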

Don't use the root user account, not even for administrative tasks. Use the root user to create a new user with assigned privileges. Lock down the root account (perhaps by adding multi-factor authentication) and use it only for very specific account and service management tasks. For everything else, provision users with the appropriate permissions.

Check user accounts to find those that are not being used and disable them. If no one is using those accounts, there is no reason to give attackers potential paths to compromise.
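One lightweight way to spot candidates is to look at when each user’s console password was last used; idle accounts can then be disabled, for example by deleting the login profile and deactivating their access keys. A minimal boto3 sketch:

```python
from datetime import datetime, timedelta, timezone
import boto3

iam = boto3.client("iam")
cutoff = datetime.now(timezone.utc) - timedelta(days=90)

# List console users who have not signed in for 90 days.
for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        last_used = user.get("PasswordLastUsed")  # absent if the user never signed in
        if last_used and last_used < cutoff:
            print(f"{user['UserName']}: last sign-in {last_used:%Y-%m-%d}, consider disabling")
```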

5. Security hygiene still matters

Defense-in-depth is particularly important when securing cloud environments because it ensures that even if one control fails, there are other security features keeping the application, network, and data safe.

Multi-factor authentication (MFA) provides an extra layer of protection on top of the username and password, making it harder for attackers to break in. MFA should be enabled to restrict access to management consoles, dashboards, and privileged accounts. RedLock found that 58 percent of root accounts do not have multi-factor authentication enabled. Threat Stack found that 62 percent of organizations had at least one AWS user without multi-factor authentication enabled.
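Auditing for this is straightforward. The sketch below, using boto3, checks the root account’s MFA status from the account summary and lists every IAM user with no MFA device enabled.

```python
import boto3

iam = boto3.client("iam")

# Root account MFA status is reported in the account summary.
summary = iam.get_account_summary()["SummaryMap"]
if not summary.get("AccountMFAEnabled"):
    print("Root account: MFA NOT enabled")

# List IAM users without any MFA device.
for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        if not iam.list_mfa_devices(UserName=user["UserName"])["MFADevices"]:
            print(f"{user['UserName']}: no MFA device enabled")
```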

6. Improve visibility

Major cloud providers all offer some level of logging tools, so make sure to turn on security logging and monitoring to see unauthorized access attempts and other issues. Amazon provides CloudTrail for auditing AWS environments, but too many organizations wind up not turning on this service. When enabled, CloudTrail maintains a history of all AWS API calls, including the identity of the API caller, the time of the call, the caller’s source IP address, the request parameters, and the response elements returned by the AWS service. It can also be used for change tracking, resource management, security analysis, and compliance audits.
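Turning CloudTrail on takes only a few calls. The sketch below (boto3, placeholder names) creates a multi-region trail and starts logging; it assumes the destination S3 bucket already exists with a bucket policy that lets CloudTrail write to it.

```python
import boto3

cloudtrail = boto3.client("cloudtrail")

# Create a trail that records API activity across all regions, then start logging.
# Trail and bucket names are placeholders.
cloudtrail.create_trail(
    Name="org-audit-trail",
    S3BucketName="example-cloudtrail-logs",
    IsMultiRegionTrail=True,
    IncludeGlobalServiceEvents=True,
)
cloudtrail.start_logging(Name="org-audit-trail")
```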

Don't let mistakes result in a breach

Data breaches aren't always caused by outside attackers; sensitive data can be exposed by human error, too. Mistakes--forgetting to turn something on, or thinking something was done but not verifying it--can leave the door wide open for attackers. Organizations need to regularly assess the security of their cloud environments, as well as that of their vendors, suppliers, and partners. As the Verizon breach showed, a third-party vendor’s mistake becomes the organization’s headache.

The shared security model exists for a reason--no matter who is responsible for the security of the cloud workloads, the organization is ultimately responsible for what happens to their data.
