Practically every business runs some kind of cloud network, which makes cloud security vital. Here is our list of the topics you need to be on top of to stay safe.
High Availability – Including Threat Actors
One of the major benefits of the cloud is the high availability it offers to your hosted resources. They are accessible from anywhere, and that's great. But it also means your cloud infrastructure is, inevitably, internet-facing. That makes it easy for any threat actor to probe your servers and services with port scanning, dictionary attacks, and other reconnaissance activities.
Some of the security issues that need addressing for cloud infrastructure are the same as those for traditional, on-premise infrastructure. Others are different or bring additional challenges. The first step is to identify the risks associated with your cloud infrastructure. Then you need to implement countermeasures and other responses that reduce or mitigate those risks. Make sure you actually document them and rehearse them with all stakeholders present and engaged. Together, these form your overarching cloud security strategy.
Not having a cloud security strategy is like ignoring cyber security for terrestrial networks. Actually, it is probably worse because of the internet-facing nature of the cloud.
The particular risks that you face vary slightly depending on how you're using the cloud and what mixture of cloud offerings you've adopted: Infrastructure as a Service, Platform as a Service, Software as a Service, Containers as a Service, and so on. And there are different ways to categorize the risks. We've gathered them together into coherent but generic risk groups. There may be some that do not apply to your exact use cases, but make sure that really is the case before you discard them.
Misconfiguration and Human Error
Errors through oversight, overwork, or simply not knowing any better still abound in organizations of all sizes. Forgotten items and missed settings cause system compromises every week. The massive Equifax breach of 2017 leaked the personal data of over 160 million people. The attackers got in through an unpatched Apache Struts vulnerability, and an expired SSL certificate on a traffic-inspection device let the data exfiltration go undetected for months. If there had been a process governing patching and renewable items, and clear responsibility for that process, the vulnerability would probably have been closed and the exfiltration spotted far sooner.
Unsecured containers are found almost weekly by security researchers using tools such as Shodan, a search engine that looks for devices, ports, and services. Some of these breaches and exposures arise because people expect things to be secure by default, which is not the case. Once you’ve spun up your remote server you need to undertake the same hardening steps and security improvements as any other server. Patching is vital too. To maintain the integrity of the server’s defenses it needs to have security and maintenance patches applied to it in a timely fashion.
Applications, especially data stores and databases such as Elasticsearch, need to be hardened after installation too. Default accounts need to have their credentials changed, and APIs must be protected with the highest level of security that is offered.
Two-factor or multi-factor authentication should be used if it is available. Avoid SMS-based two-factor authentication; it is easily compromised. Unused APIs should be switched off if they are not needed, or blocked with unissued, private API keys to prevent their use. Web application firewalls will provide protection against threats such as SQL injection attacks and cross-site scripting.
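To illustrate the API-key point, here is a minimal sketch of server-side key validation. The key store and key names are hypothetical; the point is that issued keys are stored only as hashes and compared in constant time, so neither a leaked store nor a timing side channel reveals a usable key.

```python
import hashlib
import hmac

# Hypothetical store of issued API keys, kept as SHA-256 hashes so a
# leaked store does not expose the raw keys. Values are illustrative.
ISSUED_KEY_HASHES = {
    hashlib.sha256(b"example-key-123").hexdigest(),
}

def is_authorized(presented_key: str) -> bool:
    """Return True only if the presented key matches an issued key.

    hmac.compare_digest compares in constant time, which avoids
    leaking information through timing differences.
    """
    presented_hash = hashlib.sha256(presented_key.encode()).hexdigest()
    return any(
        hmac.compare_digest(presented_hash, issued)
        for issued in ISSUED_KEY_HASHES
    )
```

An unused endpoint guarded this way, with no key ever issued, is effectively blocked even if it is left reachable.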
Lack of Change Control
Related to configuration errors are the vulnerabilities introduced when you change or update a working system. Changes should be made in a controlled and predictable fashion. This means planning and agreeing on the changes, reviewing the code, applying the changes to a sandboxed system, testing them, and only then rolling them out to the live system. This is something perfectly suited to automation, as long as the development-to-deployment pipeline is suitably robust and actually tests what you think it does, as thoroughly as you need it to.
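The gate at the heart of such a pipeline can be sketched very simply: run every check, and refuse to deploy if any one fails. This is an illustration of the principle, not a production CI system; the check names and commands are assumptions.

```python
import subprocess
import sys

def run_checks(checks):
    """Run each named shell command; return True only if all succeed.

    A real pipeline would also capture and report per-check output and
    record the decision for audit purposes.
    """
    for name, command in checks:
        result = subprocess.run(command, capture_output=True)
        if result.returncode != 0:
            print(f"Check failed: {name}; blocking deployment.")
            return False
    return True

# Illustrative gate: deploy only if every check passes. The single
# "unit tests" entry here stands in for a real test suite.
checks = [
    ("unit tests", [sys.executable, "-c", "assert 1 + 1 == 2"]),
]
if run_checks(checks):
    print("All checks passed; safe to deploy.")
```

The value is in the discipline, not the code: no path to production exists that skips the checks.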
Other changes that you need to be aware of are in the threat landscape. You can’t control new vulnerabilities being discovered and added to the list of exploits the threat actors can use. What you can do is ensure that you scan your cloud infrastructure so that all currently known vulnerabilities are addressed.
Frequent and thorough penetration scans should be run against your cloud infrastructure. Finding and rectifying vulnerabilities is a core element of keeping your cloud investment secure. Penetration scans can search for forgotten open ports, weak or unprotected APIs, outdated protocol stacks, common misconfigurations, all the vulnerabilities in the Common Vulnerabilities and Exposures database, and more. They can be automated and set to alert when an actionable item has been discovered.
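As an illustration of the simplest building block of such a scan, the sketch below probes a host for open TCP ports. Real scanners do far more than this, and the scanned host and ports here are under our own control; only ever scan infrastructure you own or are authorized to test.

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of ports on host that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds.
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Demonstration against a listener we control: open an ephemeral
# local port and confirm the scan reports it as open.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))
listener.listen()
port = listener.getsockname()[1]
print(scan_ports("127.0.0.1", [port]))  # the ephemeral port is reported open
listener.close()
```

Run against your own infrastructure on a schedule, even a check this crude will catch a forgotten open port before a threat actor does.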
Account Hijacking
Account hijacking is the name for compromising a system by accessing an authorized person's email account, login credentials, or any other information required to authenticate against a computer system or service. The threat actor is then at liberty to change the password for the account and to conduct malicious and illegal activity. If they have compromised an administrator's account they can create a new account for themselves and then log into that, leaving the administrator's account seemingly untouched.
Phishing attacks and dictionary attacks are common means of obtaining credentials. In addition to dictionary words and permutations using the common number and letter substitutions, dictionary attacks use databases of passwords from other data breaches. If any of your account holders were caught up in previous breaches on other systems and re-used the compromised password on your systems, they've created a vulnerability on your system. Passwords should never be re-used across systems.
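One countermeasure is to reject passwords that appear in a known-breach corpus at the point where users set them. A hedged sketch: the tiny in-memory list below stands in for a real data set such as Have I Been Pwned, which is typically queried by SHA-1 hash prefix so the plain password never leaves your system.

```python
import hashlib

# Stand-in for a real breached-password corpus, stored as SHA-1 hashes.
# In practice this would be millions of entries, or an external
# k-anonymity API queried by hash prefix.
BREACHED_HASHES = {
    hashlib.sha1(pw.encode()).hexdigest().upper()
    for pw in ("password123", "qwerty", "letmein")
}

def is_breached(password: str) -> bool:
    """Return True if the password appears in the breach corpus."""
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    return digest in BREACHED_HASHES
```

Wiring a check like this into account creation and password changes stops the re-used, already-compromised passwords described above from ever reaching your systems.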
Two-factor and multi-factor authentication will help here, as will automated scanning of logs looking for failed access attempts. But make sure you check the policies and procedures of your hosting provider. You might assume they follow industry best practices, but in 2019 it was revealed that Google had been storing some G Suite passwords in plain text, and had been doing so for 14 years.
Lack of Visibility
Driving in fog is a thankless task. Administering a system without the low-level, granular information that security professionals use to monitor and verify the security of a network is a similar prospect. You can't do as good a job when you can't see what you need to see.
Most cloud servers support multiple connection methods, such as Remote Desktop Protocol, Secure Shell, and built-in web portals, to name a few. All of these can be attacked, and if attacks are happening, you need to know. Some hosting providers can give you better logging or more transparent access to logs, but you must request this; they don't provide it by default.
Having access to the logs is just the first step. You need to analyze them and look for suspicious behavior or unusual events. Aggregating the logs from several different systems and looking through them over a single timeline can be more revealing than wading through each log individually. The only way to realistically achieve that is to use automated tools that will look for inexplicable or suspicious events. The better tools will also match and find patterns of events that might be the result of attacks, and which certainly warrant further investigation.
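A minimal sketch of the aggregation idea: merge entries from several logs onto a single timeline and flag bursts of failed logins that no individual log would make obvious. The log format, sources, and threshold here are assumptions for illustration; real tools such as SIEM platforms do this at scale.

```python
from datetime import datetime, timedelta

# Entries from two hypothetical logs: (timestamp, source, message).
ssh_log = [
    ("2024-05-01 10:00:01", "sshd", "failed login for root"),
    ("2024-05-01 10:00:03", "sshd", "failed login for root"),
]
web_log = [
    ("2024-05-01 10:00:02", "portal", "failed login for admin"),
]

def merged_timeline(*logs):
    """Interleave several logs into one chronologically ordered list."""
    entries = [entry for log in logs for entry in log]
    return sorted(entries, key=lambda e: e[0])

def failed_login_bursts(timeline, window_seconds=60, threshold=3):
    """Return True if `threshold` failed logins, from any mix of
    sources, fall inside a single time window."""
    failures = [
        datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        for ts, _, msg in timeline
        if "failed login" in msg
    ]
    window = timedelta(seconds=window_seconds)
    return any(
        failures[i + threshold - 1] - failures[i] <= window
        for i in range(len(failures) - threshold + 1)
    )

timeline = merged_timeline(ssh_log, web_log)
print(failed_login_bursts(timeline))  # prints True: three failures in two seconds
```

Neither log shows three failures on its own; only the merged timeline does, which is exactly the point made above.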
Non-Compliance With Data Protection Regulations
Non-compliance is the data protection and data privacy equivalent of system misconfiguration. Not implementing legally-required policies and procedures to ensure the lawful collection, processing, and transmission of personal data is a different type of vulnerability, but it is a vulnerability nonetheless.
It is an easy trap to fall into, too. Data protection is obviously a good thing, and legislation that requires organizations to function in ways that safeguard and protect people’s data is also a good thing. But keeping track of the legislation itself is very difficult without specialist help or in-house resources with sufficient skills and experience.
Fresh legislation is being enacted all the time and existing legislation is amended. When the United Kingdom left the European Union (EU) on Jan. 31, 2020, UK companies found themselves in a curious position. They must adhere to the UK-specific version of the General Data Protection Regulation contained within Chapter Two of the UK's Data Protection Act 2018 for any data they hold on UK citizens. If any of the personal data they hold belongs to people residing elsewhere in Europe, then the EU GDPR comes into play.
And the GDPR applies to all organizations, regardless of where they are based. If you collect, process, or store personal data belonging to UK or European citizens, one of those GDPRs will apply to you; it isn't just UK and EU organizations that have to deal with this. The same model applies to the California Consumer Privacy Act (CCPA). It protects Californian residents regardless of where the data processing takes place, so it isn't something only Californian organizations need to get to grips with. It's not your location that counts. It's the location of the person whose data you're processing that counts.
California isn’t alone in addressing data privacy through legislation. Nevada and Maine also have legislation in place, and New York, Maryland, Massachusetts, Hawaii, and North Dakota are implementing their own data privacy laws.
This is in addition to vertically-focused federal legislation such as the Health Insurance Portability and Accountability Act (HIPAA), the Children's Online Privacy Protection Act (COPPA), and the Gramm-Leach-Bliley Act (GLBA), should any of those apply to your activities.
If you gather information through a portal or website in your cloud infrastructure, or process data on a hosted server, some of this mass of legislation will apply to you. Non-compliance can attract significant financial penalties in the case of data breaches, along with reputational damage and the possibility of class-action lawsuits.
Done Right, It’s a Full-Time Job
Security is a never-ending challenge and cloud computing brings its own set of unique concerns. A careful choice of hosting or service provider is a critical factor. Make sure you do thorough due diligence before formally engaging with them.
- Are they serious about security themselves? What is their track record?
- Do they offer guidance and support, or sell you their service and leave you to it?
- What security tools and measures do they provide as part of their service offering?
- What logs are made available to you?
When cloud computing is discussed someone usually offers this well-known soundbite: “Cloud just means someone else’s computer.” Like all soundbites, it’s a gross oversimplification. But there’s still some truth in it. And that’s a sobering thought.
This post was written by Dave McKay and was first posted to www.cloudsavvyit.com