Cloud computing is a boon, but its vendors need to be kept on a short leash, says Mushegh Hakhinian, security architect at IntraLinks.
The pace of business today requires that critical information be accessible anywhere and at any time, often to both internal and external parties. Hosted services are an important tool for enabling this, but while they facilitate communication, they also bring additional risk and challenges to the firms that use them. Technology is evolving rapidly to meet this challenge.
One example of this new technology is the use of cloud-based services. Despite the benefits they offer – lowered costs, easy implementation, on-demand capacity and a network effect that can enhance security – there are some significant impediments. After moving to the cloud, it can be difficult for organisations to demonstrate compliance. Many firms, eager to capitalise quickly on lower costs, move to the cloud without checking the compliance or security ramifications. IT teams fear losing control over data location and access rules, and the first audit often shuts down the project and indefinitely suppresses the appetite to try cloud-based services.
So, to let your company gain the benefits of cloud computing while keeping your sanity, it is imperative to ensure that any provider you consider can both maintain compliance and pass your security audits. At a minimum, a cloud services provider must be able to account for the location of your data at all times. Make sure data is protected from theft while in the provider’s custody, both at rest and in transit. Require audit trails that report who accessed data and when. Demand flexible data classification and authorisation schemes that give you control over who can access which data. Additionally, your provider should offer options for ‘stronger-than-password’ authentication and protection of data ‘in use’.
Basic security considerations
The foundation of every service is the infrastructure on which it runs. Saving money by selecting a cut-rate data-hosting facility is not a good strategy if the target customer base is security-savvy or regulated, such as financial services or life sciences. Hosting providers must hold recognised security certifications, such as SAS 70 Type II and ISO 27001.
Disaster recovery sites should be of equal quality and certification, so that switching between facilities does not put the service provider or the customer out of compliance. The vendor’s network design must provide a means to separate customer data, or the database and application tiers must be architected to enforce data boundaries. A good design will also make applications independent of data location, so physical files can be stored in any geographic location.
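As an illustration of that location independence, here is a minimal sketch – the tenant names, regions and registry below are invented for illustration, not any vendor’s actual API – of an application tier resolving a tenant’s storage location from metadata instead of hard-coding it:

    # Hypothetical sketch: the application looks up where a tenant's data lives
    # instead of assuming a fixed facility, so files can move between certified
    # sites (or to a disaster recovery site) without code changes.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class TenantPlacement:
        tenant_id: str
        region: str            # e.g. "eu-west" to satisfy data-residency rules
        storage_endpoint: str  # facility-specific endpoint, swappable on failover

    # Illustrative registry; in practice this would be authoritative metadata.
    PLACEMENTS = {
        "acme-bank": TenantPlacement("acme-bank", "eu-west", "https://eu1.storage.example"),
        "bio-labs":  TenantPlacement("bio-labs", "us-east", "https://us2.storage.example"),
    }

    def storage_for(tenant_id: str) -> TenantPlacement:
        """Resolve the physical location for a tenant, enforcing the data boundary."""
        return PLACEMENTS[tenant_id]

    print(storage_for("acme-bank").region)  # "eu-west"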
If a provider hosts customer data, then it takes on responsibility for ensuring its confidentiality and integrity. There is always the potential for leaks, whether by accident or by malice, so providers must ensure that lost data is unusable by unauthorised users. Cryptography is key, and it should be implemented so that no single insider can compromise data confidentiality. Strong encryption algorithms and multi-tier key management systems are a must. Protecting information in transit is equally important and typically means TLS, the successor to SSL 3.0; the overhead is well worth it.
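To make the multi-tier key management idea concrete, the following is a minimal envelope-encryption sketch using the Python cryptography package. The in-memory ‘master key’ stands in for a key that would in practice live in an HSM or key management service, and all names are illustrative:

    # Sketch of envelope encryption: a per-document data key protects the content,
    # and a master key (ideally held in an HSM/KMS) protects the data key, so no
    # single stored artefact is usable on its own.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    master_key = AESGCM.generate_key(bit_length=256)  # in practice: kept in an HSM/KMS

    def encrypt_document(plaintext: bytes) -> dict:
        data_key = AESGCM.generate_key(bit_length=256)
        doc_nonce = os.urandom(12)
        ciphertext = AESGCM(data_key).encrypt(doc_nonce, plaintext, None)

        key_nonce = os.urandom(12)
        wrapped_key = AESGCM(master_key).encrypt(key_nonce, data_key, None)
        # Only the ciphertext and the wrapped key are stored; the plain data key is discarded.
        return {"ciphertext": ciphertext, "doc_nonce": doc_nonce,
                "wrapped_key": wrapped_key, "key_nonce": key_nonce}

    def decrypt_document(record: dict) -> bytes:
        data_key = AESGCM(master_key).decrypt(record["key_nonce"], record["wrapped_key"], None)
        return AESGCM(data_key).decrypt(record["doc_nonce"], record["ciphertext"], None)

    record = encrypt_document(b"quarterly financials")
    assert decrypt_document(record) == b"quarterly financials"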
People who handle customer data can be targeted, and there is always the possibility of someone trying to steal the data they store. Comprehensive background checks can help to eliminate weak links, and maintaining detailed audit logs helps in identifying and prosecuting data thieves. It may even prevent a loss, as unusual activity can trigger an investigation and put the brakes on a theft attempt in progress.
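As a rough sketch of such an audit trail, the log structure and the download threshold below are hypothetical choices made purely for illustration:

    # Sketch: append-only audit log plus a crude check for unusual activity.
    from collections import Counter
    from datetime import datetime, timezone

    audit_log: list[dict] = []
    DOWNLOAD_THRESHOLD = 50  # illustrative threshold, not a recommendation

    def record_access(user: str, document_id: str, action: str) -> None:
        audit_log.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "document": document_id,
            "action": action,
        })

    def flag_unusual_activity() -> list[str]:
        """Return users whose download volume exceeds the threshold, for investigation."""
        downloads = Counter(e["user"] for e in audit_log if e["action"] == "download")
        return [user for user, count in downloads.items() if count > DOWNLOAD_THRESHOLD]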
It is important to keep the vulnerability surface of the system to a minimum. By encrypting stored content, the secret that must be guarded shrinks from the entire dataset down to the size of the encryption key.
Not all business information needs the same level of protection: marketing collateral needs to be accessible to everyone, while financials have to be kept behind strong locks. By properly setting user roles, you can streamline complex authorisation schemes and make user administration easier and less error-prone.
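For instance, a minimal role-based authorisation sketch – the role names and classifications are invented for illustration – might look like this:

    # Sketch of role-based access: each classification maps to the roles allowed to read it.
    ALLOWED_ROLES = {
        "marketing": {"employee", "finance", "partner"},  # broadly readable
        "financial": {"finance"},                         # behind strong locks
    }

    def can_read(user_roles: set[str], classification: str) -> bool:
        return bool(user_roles & ALLOWED_ROLES.get(classification, set()))

    assert can_read({"employee"}, "marketing") is True
    assert can_read({"employee"}, "financial") is False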
The measures described above are the basic, ‘table-stakes’ security necessary to launch a public cloud-based service. Unfortunately, some popular cloud providers lack even these, and the jury is still out on whether they will ever be able to bolt on enough security. Properly implemented features such as cryptography and granular authorisation schemes attached to highly segmented data are what achieve premium security.
Advanced security considerations
Most online applications require only an ID and password for login. The reason is not purely technical: users of modern online business applications are increasingly ‘consumer-like’, and strong authentication introduces major inconveniences, the equivalent of putting the vault door at the entrance to a bank. That is why it makes sense to organise data within an online application so that high-level security protects only sensitive data and actions.
There are two choices: predetermine the parts of the application that require higher security, eg banking transactions or access to intellectual property; or allow the data owner to specify which information needs extra protection. The second option, though more difficult to implement, is preferable, especially for multi-tenant applications: it brings security decisions closer to the people who know the data best.
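A minimal sketch of that second option, with owner-set sensitivity driving step-up authentication, follows; the field names and the second-factor check are assumptions for illustration:

    # Sketch: the data owner marks a document as sensitive; only then is a second
    # authentication factor demanded, keeping the 'vault door' away from everyday use.
    class AccessDenied(Exception):
        pass

    def open_document(doc: dict, session: dict) -> str:
        if doc.get("owner_marked_sensitive") and not session.get("second_factor_verified"):
            raise AccessDenied("step-up authentication required")
        return doc["content"]

    doc = {"content": "term sheet", "owner_marked_sensitive": True}
    session = {"user": "alice", "second_factor_verified": False}
    try:
        open_document(doc, session)
    except AccessDenied as exc:
        print(exc)  # prompt for a one-time passcode, then retry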
Once information is downloaded, it can be lost or stolen without you ever knowing about it. In a properly designed secure online system, it can therefore be safer to share an electronic copy through the service than to hand over the file itself – there are technologies that allow tight access control and even remote shredding.
To address this, highly secure cloud systems implement in-use data protection, where content remains encrypted even when it is sent to the user’s browser, so readers cannot open it without the key. When the owner removes access permission, the encryption key is destroyed, digitally shredding the remote document. Some systems implement a Diffie-Hellman key exchange, in which the key is calculated afresh each time before use.
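Building on the envelope-encryption sketch above, the following illustrates the digital-shredding idea: a per-document key lives in a key store, and revoking access deletes the key, leaving every remote copy of the ciphertext unreadable. The key store and function names are hypothetical:

    # Sketch: per-document keys live in a key store; deleting the key 'shreds'
    # every copy of the ciphertext, wherever it has been sent.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key_store: dict[str, bytes] = {}

    def protect(doc_id: str, plaintext: bytes) -> tuple[bytes, bytes]:
        key = AESGCM.generate_key(bit_length=256)
        key_store[doc_id] = key
        nonce = os.urandom(12)
        return nonce, AESGCM(key).encrypt(nonce, plaintext, None)

    def read(doc_id: str, nonce: bytes, ciphertext: bytes) -> bytes:
        key = key_store.get(doc_id)
        if key is None:
            raise PermissionError("access revoked: key has been destroyed")
        return AESGCM(key).decrypt(nonce, ciphertext, None)

    def revoke(doc_id: str) -> None:
        key_store.pop(doc_id, None)  # digital shredding: the ciphertext is now unusable

    nonce, ct = protect("deal-123", b"confidential memo")
    print(read("deal-123", nonce, ct))
    revoke("deal-123")
    # read("deal-123", nonce, ct) would now raise PermissionError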
The cloud enables a critical business requirement to be implemented: secure information exchange. For that to take place, the infrastructure must be designed with built-in security, from basic encryption to multi-factor authentication and in-use content protection. IT can regain control over security by ensuring it understands cloud security principles – and by requiring strict adherence to those principles from cloud vendors.