Tag Archives: compliance

Putting A Padlock On The Cloud

Ask anyone in IT what the biggest barrier to adopting cloud computing services is, and the most likely answer is security.

As Shirief Nosseir, EMEA security product marketing director at CA Technologies, explains, securing the cloud isn’t rocket science: cloud is just another environment in which security should be seen as an enabler rather than a barrier.

Many organisations assume that adopting any form of cloud computing changes a company’s risk profile. But unless a company deliberately increases its appetite for risk, its profile should not change, regardless of whether it adopts a public or private cloud.

At this stage of market maturity, however, cloud sourcing decisions are decentralised and proper policies and procedures are not yet adequately enforced. It is therefore quite common for lines of business to bypass the IT organisation altogether and acquire cloud services (particularly Software-as-a-Service) without thoroughly vetting them for security risks.

Before moving any part of the business to the cloud, organisations need to consider the different cloud deployment options available, including service models (Software-as-a-Service, Platform-as-a-Service and Infrastructure-as-a-Service), internal versus external hosting and public versus private deployments. They also need to take stock of the on-premise services they already have and identify suitable candidates for moving to the cloud.

Organisations considering external cloud services should also evaluate the different providers and the service level guarantees they offer, much as with traditional outsourcing. Then, as in any security area, a risk-based approach that weighs risk against cost and business value gives organisations a framework for making better-informed decisions and keeping their risk profile in line with their risk appetite.
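A risk-based decision framework of this kind can be sketched as a simple weighted scoring exercise. The criteria, weights and threshold below are purely illustrative assumptions, not taken from the article; every organisation would substitute its own, derived from its risk appetite.

```python
# Hypothetical sketch of a risk-based cloud sourcing decision aid.
# Criteria, weights and the threshold are illustrative only.

CRITERIA = {
    "data_sensitivity": 0.35,    # how sensitive is the data involved?
    "regulatory_exposure": 0.25, # compliance obligations attached to it
    "provider_assurance": 0.20,  # audits, certifications, SLAs offered
    "business_value": 0.20,      # cost savings and agility gained
}

def score_candidate(ratings: dict) -> float:
    """Weighted score in [0, 1]; higher means a stronger cloud candidate.
    `ratings` maps each criterion to a 0-1 rating where 1 is favourable
    (e.g. low sensitivity, low regulatory exposure, high assurance)."""
    return sum(CRITERIA[c] * ratings[c] for c in CRITERIA)

def decide(ratings: dict, risk_appetite: float = 0.6) -> str:
    """Compare the weighted score against the organisation's threshold."""
    return "move to cloud" if score_candidate(ratings) >= risk_appetite else "keep on-premise"

# Example: a low-sensitivity collaboration tool with a well-audited provider.
print(decide({"data_sensitivity": 0.9, "regulatory_exposure": 0.8,
              "provider_assurance": 0.7, "business_value": 0.9}))
```

In practice such a score would feed a governance review rather than decide automatically, but it makes explicit the trade-off between risk, cost and business value that the text describes.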

Today, there is no doubt that the cloud is here to stay. It is already widely adopted by many organisations and adoption will continue to rapidly grow. Given the reduced cost, increased flexibility and opportunities it brings, cloud computing is compelling for many organisations.

It is important for IT departments to be proactive and be quick to embrace the cloud model, as this is an opportunity for security to be seen as an enabler rather than a brake on the system.

The cloud offers an irresistible business case, and executives will often not be stopped from consuming the cloud services they need just because those services were not adequately vetted for security – a statement that should raise a few eyebrows among risk, security and compliance professionals.

This is a trend many of us already see happening in organisations. In a CA Technologies-sponsored cloud security survey, for instance, 49 per cent of respondents said their organisation uses cloud computing applications without thoroughly vetting them for security risks, while 68 per cent said their security leaders are not the people most responsible for securing the cloud computing resources in their organisations.

It is also worth mentioning that business supporters of cloud computing often highlight the business’s ability to buy IT services itself, bypassing the IT organisation altogether. IT organisations that resist the move to the cloud risk ultimately being made irrelevant.

As the cloud is just another computing model that needs to co-exist with other, traditional platforms, organisations should not create new, separate policies to secure it. They need to look at their entire environment, including the cloud, and develop a coherent set of policies that cuts across the entire infrastructure: start with the policies already in place and adjust them to accommodate the cloud model.

At the same time, it is clear that traditional security models are now going through an evolution in an attempt to keep up with the new order of things. Take the data sprawl issue as an example: one of the common cloud security challenges that organisations face is identifying what data is appropriate to process and move into the cloud.

Nowadays, as data has transformed into bits and bytes, copying sensitive data or sending it across the globe is just a mouse click away. As we all know, this brought about new levels of efficiency and fuelled the democratisation of information.

On the flip side, we ended up with data sprawl. In most cases now, we have little control over how information is being used and shared and by whom it is being consumed. With the enormous amounts of information we process and share on a daily basis, we are not able to keep track of where all copies of our sensitive information are located. Needless to say, data sprawl has introduced all sorts of security problems, since we simply cannot secure what we cannot locate and control.

With cloud computing, data sprawl becomes even more of an issue. By nature, a cloud is highly dynamic, often extends beyond the typical boundaries of our organisation and typically is shared with other tenants. Clearly, traditional perimeter security cannot offer enough control over data and its movement to and in the cloud.

Although typical data loss prevention (DLP) technologies do a good job at locating, classifying and controlling information, they are simply not enough for what is truly needed. An identity-centric approach to information protection and control becomes paramount in cloud environments.

Content awareness (provided by DLP solutions) allows us to understand what information is held in our files and documents, whereas an identity-centric approach adds more intelligence to data sprawl and brings in the context of who is trying to use the data and how they should be allowed to use it (e.g. email, copy, print, etc).

Consequently, DLP technologies need to become more identity centric and integrated with identity and access management (IAM) technologies. Conversely, IAM needs to become more content aware to provide the right level of control that fosters information sharing, while mitigating unnecessary risks.

In turn, a content-aware identity and access management approach is paramount to be able to effectively ensure that only appropriate data is moved into the cloud.
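The combination of content awareness and identity context described above can be illustrated with a toy policy lookup. The classifications, roles and action sets below are hypothetical; real DLP and IAM products express such policy far more richly.

```python
# Toy sketch of an identity-centric DLP decision: the content
# classification (from a DLP scan) and the requester's identity
# together determine what actions are allowed. All values hypothetical.

POLICY = {
    # (classification, role) -> permitted actions
    ("public", "any"):         {"email", "copy", "print"},
    ("internal", "employee"):  {"email", "copy"},
    ("restricted", "finance"): {"print"},
}

def allowed_actions(classification: str, role: str) -> set:
    """Look up the role-specific rule first, then fall back to a rule
    applying to any identity; default to deny-all if nothing matches."""
    return (POLICY.get((classification, role))
            or POLICY.get((classification, "any"))
            or set())

print(sorted(allowed_actions("internal", "employee")))  # ['copy', 'email']
print(sorted(allowed_actions("restricted", "intern")))  # []
```

The point of the sketch is the function signature: a content label alone is not enough to answer "may this happen?"; the identity and intended action must be part of the question.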

Source


Risk Management in Cloud Computing

In a troubled economy, cloud computing seems like a great cost-saving alternative, and it is. Whether in good times or bad, any pragmatic cost-saving measure is a ‘good’ measure.

Google, Microsoft, IBM and all other known and unknown cloud providers offer today’s CIO an array of major cost-saving alternatives to the traditional data center and IT department. The temptation to put things in the cloud and sit back can be extremely compelling. But like everything that appears too good to be true, cloud computing comes with a set of risks that CIOs and CTOs would do well to recognize before taking the plunge.

Before we get into the specifics of how best to manage risk when planning to move assets to the cloud, let’s look at a few numbers to help us understand what the Joneses are doing. Is cloud computing already mainstream?

ISACA’s 2010 survey on cloud computing adoption presents some interesting findings: 45 percent of IT professionals think the risks far outweigh the benefits, and only 10 percent of those surveyed said they’d consider moving mission-critical applications to the cloud. In a nutshell, ISACA’s statistics and other industry-published numbers around cloud adoption indicate that cloud computing is a mainstream choice, but definitely not the primary one.

While some organizations have successfully moved part or all of their information assets into some form of cloud computing infrastructure, the large majority still haven’t done much with this choice. So we ask, is it premature for organizations to have a cloud computing strategy? Au contraire! The CIO who has not yet begun to think of a cloud strategy may soon be left behind. In most organizations, there are definitely some areas that could be safely and profitably moved to the cloud. The extent to which an organization should move its information assets to the cloud, and take advantage of the tremendous benefits of doing so, is determined by applying a risk assessment framework to all candidate information assets. For this, it’s essential to understand the risks and then have a mitigation strategy for each.
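A risk assessment of candidate assets is often reduced to a likelihood-times-impact rating. The scales, thresholds and example assets below are illustrative assumptions, not figures from the article.

```python
# A minimal likelihood-x-impact risk rating of the kind used in many
# risk assessment frameworks. Scales and thresholds are illustrative.

def risk_rating(likelihood: int, impact: int) -> str:
    """likelihood and impact on a 1-5 scale; returns a coarse rating."""
    score = likelihood * impact
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# Candidate assets paired with assessed (likelihood, impact) of a
# security incident if moved to the cloud -- hypothetical figures.
assets = {
    "marketing website": (2, 1),
    "HR records":        (3, 4),
    "payment system":    (4, 5),
}
for name, (l, i) in assets.items():
    print(f"{name}: {risk_rating(l, i)}")
```

Low-rated assets are the natural first movers; high-rated ones need a documented mitigation strategy, or stay in-house, before any migration is considered.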

Who accesses your sensitive data:
The physical, logical and personnel controls that were put in place when the data was in-house in your data center are no longer valid when you move your organization’s information to the cloud. The cloud provider maintains its own hiring practices, rotation of individuals and access control procedures. It’s important to ask about and understand the data management and hiring practices of the cloud provider you choose. Large providers like IBM will walk their clients through the process, explaining how sensitive data moves around the cloud and who gets to see what.

Regulatory compliance: Just because your data now resides on a provider’s cloud, you are not off the hook: you remain accountable to your customers for any security and integrity issues that affect your data. Cloud providers typically mitigate this risk through regular external audits, penetration tests and compliance with standards such as PCI and SAS 70 Type II, to name a few. You are responsible for weighing the risks to your organization’s information and ensuring that the cloud provider has standards and procedures in place to mitigate them.

Geographical spread of your data:
You may be surprised to know that your data may not reside in the same city, state or even country as your organization. While the provider may be contractually obliged to ensure the privacy of your data, it may be even more strongly obliged to abide by the laws of the state or country in which your data resides. So your organization’s rights may get marginalized. Ask the question and weigh the risk.

Data loss and recovery: Data in the cloud is almost always encrypted to ensure its security. However, this comes at a price: corrupted encrypted data is far harder to recover than unencrypted data. It’s important to know how your provider plans to recover your data in a disaster scenario and, more importantly, how long it will take. The provider must be able to demonstrate benchmarked data recovery scenarios.

What happens when your provider gets acquired: A seamless merger/acquisition on the part of your cloud provider is not always business as usual for you, the client. The provider should have clearly acknowledged and addressed this as one of the possible scenarios in their contract with you. Is there an exit strategy for you as the client – and what are the technical issues you could face to get your data moved someplace else? In short, what is your exit strategy?

Availability of data: The cloud provider relies on a combination of network, equipment, application and storage components to provide the cloud service. If one of these components goes down, you won’t be able to access your information. Therefore, it is important to understand how long you can do without a certain kind of information before you decide to put it in the cloud. If you are an online retailer and your customer order entry system cannot be accessed because the cloud hosting your application just went down, that would definitely be unacceptable. It’s important to weigh your tolerance for unavailability of your information against the vendor’s guaranteed uptime.
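Weighing your tolerance against a vendor’s guaranteed uptime is simple arithmetic worth doing explicitly, because an impressive-sounding percentage can still mean hours of outage. The SLA tiers shown are common examples, not any particular vendor’s terms.

```python
# Translate a provider's "guaranteed uptime" percentage into the
# downtime you must be able to tolerate each month.

MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month

def allowed_downtime_minutes(uptime_pct: float) -> float:
    """Downtime per month permitted by an uptime SLA percentage."""
    return MINUTES_PER_MONTH * (1 - uptime_pct / 100)

for sla in (99.0, 99.9, 99.99):
    print(f"{sla}% uptime -> {allowed_downtime_minutes(sla):.1f} min/month of downtime")
```

A "99%" SLA permits over seven hours of downtime a month; for the online retailer in the example above, only the higher tiers come close to acceptable.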

Cloud computing is relatively new in its current form; given that, it is best applied to specific low- to medium-risk business areas. Don’t hesitate to ask questions, and if necessary, engage an independent consulting company to guide you through the process. Picking a cloud provider requires far more due diligence than routine IT procurement. At this stage there is no clear-cut template for success. The rewards can be tremendous if the risks are well managed.

Source



Security in The Cloud: Top Issues in Building Users’ Trust

IT decision makers from a range of public and private sector organisations ranked loss of control of data and where data is held as the top security concern.

Despite economic pressure for business to cut costs and fervent assurances from cloud computing technology suppliers, security remains a top barrier to cloud adoption, research by the UK’s National Computing Centre (NCC) has revealed.

Interest in cloud computing is high and many organisations say they are planning to move in that direction. But the reality is that only 20% of UK organisations are using infrastructure-as-a-service and only 36% are using software-as-a-service, according to the NCC research.

Building user trust in cloud computing

The advantages of the cloud computing model (reduced cost of ownership, no capital investment, scalability, self-service, location independence and rapid deployment) are widely extolled, so what will it take to get businesses to adopt cloud computing en masse?

The short answer is that it all boils down to trust.

Trust is not easily defined, but most people agree that when it comes to cloud computing, transparency is essential to creating trust.

Businesses must be able to see cloud service providers are complying with agreed data security standards and practices.

These must include controls around who has access to data, staff security vetting practices, and the technologies and processes to segregate, backup and delete data.

Suppliers of cloud technologies and services are quick to claim that cloud computing is well equipped to provide the necessary controls. Virtualisation, they argue, underlies cloud computing, and therein lies the potential to achieve hitherto impossible levels of security.

While virtualisation is viewed with suspicion and fear by many IT directors, suppliers like RSA, IBM and others say that the technology enables organisations to build security into the infrastructure and automate security processes, surpassing traditional data protection levels.

Cloud computing cost savings obscure security issues

Aside from all the positive spin around cloud computing technologies, a trusted, standard model of cloud computing that will enable faster rates and higher levels of adoption is still a long way off, with relatively little progress being made in that regard in the past year, says William Beer, director of OneSecurity at PricewaterhouseCoopers (PwC).

Despite some isolated progress on the technology front, many organisations already using cloud-based services are motivated mainly by the cost savings they can achieve, and consequently pay little, if any, attention to security, says Beer.

“We are still being surprised by the weaknesses and lack of maturity in security models used by many of the cloud-based services on offer,” he says.

It will take a significant data breach by a cloud services provider, he believes, before consumers of cloud services will realise the inadequacy of current models and demand better safeguards around their corporate data.

2010 was a year of experimentation for cloud computing

During 2010, unexpectedly, the cloud went from a place for development and quality assurance to a place for real production applications and data to live, says Gary Palgon, vice-president of product management at security consultancy firm nuBridges.

“This was primarily due to the cost savings to businesses and the tougher economy forced them there. The result is that the timeframe for acceptance of production applications in the cloud has accelerated,” he says.

However, Palgon recognises there is still some way to go before CISOs will readily accept putting sensitive company data in the cloud.

2010 was a year of experimentation and piloting for cloud computing, rather than one of full-scale implementations in the mid-market, says Bob Walder, research director at Gartner.

But, he says, IT providers that dismiss cloud computing in 2011 because market penetration is still low will miss a much bigger opportunity two years from now.

In the short term, he says, IT providers should create cloud solutions that are viewed as extensions of existing IT environments.

On the other side of the equation, says Beer, all organisations should be looking at the benefits cloud computing can bring to their business.

“They should be looking at cloud, they should be looking at it today, but they should be looking at it cautiously,” he says.

Cloud computing must specialise by sector-specific security requirements

While the initial positive uptake, which varies from sector to sector according to risk appetite, is mainly driven by cost, PwC believes that to move things on, cloud computing service providers will have to begin adapting to the specific security requirements of highly regulated sectors, such as financial services.

Service providers will also have to recognise that all UK organisations are obliged to comply with data protection legislation, policed by the Information Commissioner’s Office, which has steadily increasing powers of enforcement.

Initiatives will have to come from the service providers themselves, because progress on standards that depend on industry consensus is traditionally slow, says Beer.

RSA, the security division of EMC, has a vested interest in fostering cloud computing and, to this end, plans to take a leadership position by introducing a set of cloud-based services to be known collectively as RSA’s Cloud Trust Authority.

Lack of trust in cloud computing is slowing broader adoption of cloud services, RSA executive chairman Art Coviello told attendees of RSA Conference 2011 in San Francisco.

The aim of RSA’s initiative is to provide the tools organisations need for oversight of operations at cloud service providers, to assure customers that security service level agreements are being met, and to build the trust organisations need to adopt cloud computing for mission-critical applications and storage.

Best practices derived from initiatives such as these, says Beer, may give rise to cloud-specific standards, but again he points out that reaching agreements on standards, interoperability and third party certification programmes always takes time.

In search of cloud computing security standards

In the real world, some cloud computing service providers are turning to existing security standards such as ISO 27001, even though there is still much debate about its suitability to the cloud environment.

This approach is typically at the insistence of customers, says Beer, but is having the positive effect of making service providers see the commercial benefit of security standards, which may help build momentum in the industry.

A lack of common security standards, and concerns about the ease of retrieving data should a change of supplier be required, were among the top security concerns of IT decision makers in the UK, research by the NCC revealed.

Using existing standards is a start, but Beer believes that ultimately the cloud computing industry will have to establish its own standards, because the business model is fundamentally different, as is the way users will engage with services.

But some progress is being made in this direction, says Gerry O’Neill, vice-president of the Cloud Security Alliance (CSA), UK & Ireland Chapter.

A great deal of effort has been dedicated over the past year to bring greater clarity and definition to questions of security and assurance in cloud services, he says.

In the public sector there are several examples of guidance and processes being developed for secure and appropriate use of Cloud services, says O’Neill. These include the UK Government G-Cloud project, ENISA Cloud Security Report, and the US Government’s FedRAMP Guidelines.

There have also been a number of industry-wide initiatives aimed at giving CISOs, CIOs and business managers the assurance they need to use cloud services with a degree of confidence that matches their organisation’s appetite for risk and compliance.

These initiatives include the Cloud Security Alliance, A6 (known as Cloud Audit), and the Common Assurance Maturity Model (CAMM).

For its part, says O’Neill, the CSA – formed 18 months ago to promote the use of best practices for providing security assurance within cloud computing – has been bringing together stakeholders around the world with the aim of progressing the definition of cloud security frameworks and guidance. The CSA has also developed the first recognised personal certification in the cloud security space, namely the Certificate in Cloud Security Knowledge (CCSK).

“By the end of 2011, PwC would like to see more consensus around standards, as well as an escalation of the security considerations of cloud implementations so they are considered as important by organisations as scalability, cost and technology,” says Beer.

Until organisations consider how security is built into the cloud computing models they are considering, they will always face significant data protection challenges, he says.

Organisations should expect service providers to be able to answer basic questions around their security model and provide indicators of what they are doing to keep information safe in the same way they can answer questions about technology, scalability and cost.

The role of consumers in determining the future of cloud computing

Consumers of cloud services also have a role to play in improving security in the cloud by applying all they have learned from outsourcing models and mistakes of the past and ensuring security requirements are built into contracts in the form of service level agreements.

Also, as with traditional outsourcing, organisations moving to the cloud should never lose sight of the fact they remain responsible for their data and cannot shift blame to their cloud service provider if things go wrong.

The NCC research found almost a quarter of organisations polled had experienced security incidents involving the service provider’s staff. Corrupt data affected 20% of respondents, 17% suffered data loss and 7% had data stolen.

Steve Fox, managing director of the NCC, says that since modernising legislation takes time and standards remain voluntary, cloud suppliers hoping to tap the latent demand for cloud computing services must not only address security concerns but also improve existing service levels.

The way forward for cloud computing

The natural progression, says Palgon, is from keeping applications and data on-premise; to running applications in the cloud while still keeping the sensitive data locally; and finally to running applications and storing sensitive data in the cloud.

Some organisations are currently in the second phase, with some security suppliers enabling this hybrid approach by putting tokens in the cloud so the data vault can still be on-premise.
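That hybrid token pattern can be sketched in a few lines: the cloud application stores and exchanges only opaque tokens, while the vault that maps them back to real values never leaves the premises. This is a bare illustration under stated assumptions; `OnPremiseVault` is a hypothetical name, and real tokenization products add key management, auditing and format-preserving tokens.

```python
import secrets

class OnPremiseVault:
    """Hypothetical token vault kept inside the corporate perimeter."""
    def __init__(self):
        self._store = {}

    def tokenize(self, sensitive: str) -> str:
        token = "tok_" + secrets.token_hex(8)  # random; reveals nothing
        self._store[token] = sensitive
        return token                 # safe to hand to the cloud app

    def detokenize(self, token: str) -> str:
        return self._store[token]    # only callable on-premise

vault = OnPremiseVault()
token = vault.tokenize("4111-1111-1111-1111")
print(token.startswith("tok_"))     # True
print(vault.detokenize(token))      # 4111-1111-1111-1111
```

A breach of the cloud application then exposes only meaningless tokens, which is precisely why the vault’s location, on-premise, is the security-relevant design choice.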

Commentators generally agree most organisations will eventually arrive at the third stage where all applications and data are in the cloud. But Palgon says this will happen only when the third-party data security companies have the credibility to store the data safely.

“We will first have to arrive at the situation that we can store data in the cloud with the same confidence that we store money and other valuables in banks today,” he says.

Moving into this third phase will mean the full vision for cloud has been achieved, with cloud service providers able to store information more securely than individual organisations can themselves, just as banks can store money and other valuables more securely than their customers can, and at a reasonable cost.

In other words, CISOs will accept putting sensitive data in the cloud only when service providers can guarantee better security than their own organisation can, or the same level of security at a lower cost.

The beauty of cloud computing, says Beer, is that service providers will be able to attract, retain and train to the right level much larger security teams than most business organisations could maintain internally.

In the absence of fully trusted cloud-based service providers that enable complete visibility of operations in compliance with established standards, the status quo of hybrid operations that pull together on-premise, private and public cloud systems is likely to continue.

Businesses will continue to use cloud computing services according to a risk-based model, putting as much as they can into the cloud to cut costs, but keeping high-risk data on premises to maintain the highest level of control and visibility over this data.

In the year ahead, the CSA’s Gerry O’Neill predicts a marked and steady increase in the uptake of assured cloud services as stakeholders engage to hammer out certifications.

A high degree of co-operation and partnering will help prevent the unwanted proliferation of unrelated initiatives and compliance frameworks, which has to be good news for the over-audited and compliance-weary CIO, he says.

Source


Unlocking the Promise of Cloud Computing

Cloud computing can help automotive manufacturers gain a competitive edge in every aspect of their business, from product design and manufacturing to global expansion. But success will largely depend on their ability to fully leverage its capability in response to several emerging trends and challenges.

When assessing what cloud can do for their businesses, automotive leaders need to take into account the distinct and rapidly evolving challenges that their industry faces today. These include the fundamental and ongoing changes in the way that automotive companies communicate and transact with their customers; the need to capture, manage, protect and analyze their ever-expanding collection of customer data; the requirement to decrease their information technology (IT) operating costs while upgrading their capabilities; and the need to expand into new and emerging markets at low cost.

Moreover, companies are facing an expanding multitude of regulations around issues including environmental protection and in-car safety, while rising commodity and raw material prices are also impacting profitability.

Many companies are realizing that cloud computing can represent the next progression from traditional enterprise resource planning (ERP)-style systems. These ERP systems provide most of the management processes and applications used by original equipment manufacturers (OEMs), but they may not be as flexible as the networked, Web-based platforms increasingly used in other industries for activities such as supply chain management and collaboration. As a result, automotive companies are seeking low-cost opportunities to upgrade their IT capabilities while reducing the operating costs of the systems that remain. Cloud is increasingly being considered for this step.

Cloud feasibility

The cost savings and operational flexibility that cloud computing offers can help automakers respond to these and other industry challenges. However, it will be important that companies do not take the potential benefits of the cloud at face value, but rather perform a thorough assessment of how cloud computing can best aid them.

First, companies should determine how they will integrate cloud capabilities with their existing legacy systems to produce seamless operations. Many are considering this strategy to reduce IT infrastructure costs and increase responsiveness in the marketplace. As customers engage automakers and their collaborative partners through multiple channels, such as In-Vehicle Infotainment (IVI) services, mobile devices and social networking sites, the ability to be more responsive will become critically important.

This will extend to satisfying growing customer expectations for better, differentiated services based on the data customers provide to automakers aiming to improve the customer experience. The real-time, predictive analytics behind those services will require a great deal of data and computing power, which may be well served by cloud computing.

Security and data privacy are concerns that can and must be satisfied to ensure a smooth transition to the cloud. For companies using cloud computing, it will be essential to work with the provider to ensure that it can achieve parity or better levels of security, privacy, and legal compliance than the company currently possesses. The provider also should be required to give a risk assessment and describe how it intends to mitigate any issues found.

Finally, companies will need to look closely into the costs of cloud computing. This should include reviewing rigorous return-on-investment case studies based on actual usage. Savings estimates are not enough. Potential purchasers must evaluate different kinds of cloud services pricing models and develop an effective approach for measuring the costs and return from clouds.
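A usage-based comparison of the kind the text calls for might start as simply as the sketch below. Every figure is a hypothetical placeholder; the point is that amortised on-premise costs and metered cloud charges must be put on the same monthly footing before any return-on-investment claim means anything.

```python
def on_premise_monthly(capex: float, life_months: int, opex: float) -> float:
    """Amortised hardware cost plus monthly running costs
    (power, staff, floor space)."""
    return capex / life_months + opex

def cloud_monthly(instance_hours: float, hourly_rate: float,
                  egress_gb: float, egress_rate: float) -> float:
    """Metered, usage-based charges: compute time plus data transfer."""
    return instance_hours * hourly_rate + egress_gb * egress_rate

# Hypothetical workload: 4 servers on-premise vs 4 cloud instances 24x7.
onprem = on_premise_monthly(capex=60_000, life_months=36, opex=1_200)
cloud = cloud_monthly(instance_hours=4 * 730, hourly_rate=0.50,
                      egress_gb=500, egress_rate=0.09)
print(f"on-premise ~{onprem:,.0f}/month vs cloud ~{cloud:,.0f}/month")
```

Even this toy comparison shows why headline savings estimates mislead: the answer flips entirely depending on utilisation, hardware life and data-transfer volume, which is why actual usage data matters.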

While it is important to take precautions, it is also important to understand that the relatively low capital investment, quick deployment and fast return on cloud services make their widespread industry adoption more a case of when, not if. To avoid missing a distinct competitive advantage, automakers should seriously evaluate cloud computing.

Source


Virtualization 2.0: A Foundation for Successful Cloud Computing

Virtualization – not quite the nirvana it was promised to be. We expected exponentially better efficiency, higher availability and huge savings for IT budgets. However, now that the honeymoon is over, most organizations feel slighted. Not only have the promised benefits never been realized, but IT organizations also have been saddled with ever-increasing user demand and out-of-control costs – not to mention virtual sprawl, vendor lock-in and high provisioning effort.

With all of these issues, enterprises are looking to solve the problems of “Virtualization 1.0.” Last year, a Gartner study showed that CIOs were looking to cloud computing in more strategic ways, in the hope that the cloud will improve IT operations.

So cloud computing will fix all this, won’t it?

Actually, cloud computing will just compound the problems of virtualization unless we adopt a new management model because the problems of Virtualization 1.0 largely stem from a single undeniable fact: The average human brain cannot keep up with the complexity of a virtualized environment.

In every virtualized IT organization, there is a smart guy, or group of guys, spending a significant portion of their time provisioning virtual machines (VMs). While provisioning a VM is conceptually simple, there is a vital decision to be made: On which physical machine should the VM run? The importance of this simple question cannot be overstated, and it can be a really complex task to determine the right answer.

Let’s start with the easy stuff: Which physical machines have the capacity to run the workload? Which are running the right hypervisor? Now, here are the harder questions: On which physical machine would the workload most efficiently fit (perhaps you have 1,000 of them)? Which machines have been reserved for a particular task (perhaps because of their high cost or particular configuration)? Are there any special security or governance requirements that limit where this VM can be geographically placed? And now for the killer: Is there anything already running on the physical machine that would cause a compliance issue if we place the new VM there?
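The questions above amount to a filter-and-score pass over the candidate hosts. As a rough sketch only – the host attributes, constraint names and tie-breaking rule here are all hypothetical, not taken from any particular product – the logic might look like this:

```python
# Hypothetical sketch of VM placement: filter hosts on capacity, hypervisor,
# reservations, geography and co-tenancy, then prefer the tightest CPU fit.

def eligible_hosts(hosts, vm):
    """Return hosts that pass every placement constraint for this VM."""
    candidates = []
    for host in hosts:
        if host["free_cpu"] < vm["cpu"] or host["free_ram_gb"] < vm["ram_gb"]:
            continue  # not enough capacity
        if host["hypervisor"] != vm["hypervisor"]:
            continue  # wrong hypervisor
        if host["reserved_for"] not in (None, vm["workload"]):
            continue  # reserved for another task
        if vm.get("allowed_regions") and host["region"] not in vm["allowed_regions"]:
            continue  # geographic/governance restriction
        if any(t in vm["conflicts"] for t in host["tenant_workloads"]):
            continue  # co-location would create a compliance issue
        candidates.append(host)
    # prefer the tightest fit to keep utilization high
    return sorted(candidates, key=lambda h: h["free_cpu"] - vm["cpu"])
```

Even this toy version has to re-run every time a VM is provisioned or restarted, which is exactly the burden the article describes.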

You are getting the idea, but we are not done yet. Remember, this needs to be done every time you provision a new VM. But since VMs come and go, you really need to do it every time you restart a VM.

Even if your guys are all Einstein, this is going to be practically impossible. And even if you could do it exactly right, every time, there’s another problem: High-end Virtualization 1.0 solutions include features like high availability and resource scheduling that move VMs automatically – and break everything you just worked out.

Far from fixing it, cloud computing just makes this problem exponentially worse. More machines, more locations and more people provisioning machines equals more complexity. Far from being the enabler of the cloud, virtualization becomes the inhibitor.

How do we solve the problems of Virtualization 1.0?

Virtualization strategy needs to evolve past relying on humans to make each deployment and management decision ad hoc. Enterprises need automated, business-policy-driven provisioning and management. Virtualization 2.0 is that evolution and is built upon three key foundations: separation, delegation and allocation.

Separate the physical and the virtual, separate the application team from the IT infrastructure organization. IT contributes compute, network and storage resources to a resource cloud, and virtual enterprises (a logical unit of users) consume resources from it. Virtual enterprises never access the physical layer and they neither know nor care from where their resources come. IT maintains control of the physical infrastructure and can give multi-tenancy control to various aspects of the virtual infrastructure to authorized users.
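The separation idea – IT contributes capacity to a pool, virtual enterprises consume from it without ever touching the physical layer – can be sketched as a simple data model. The class and resource names are hypothetical, for illustration only:

```python
# Hypothetical sketch of the separation model: IT contributes resources to
# a shared pool; virtual enterprises draw from it without seeing hosts.

class ResourceCloud:
    def __init__(self):
        self.capacity = {"cpu": 0, "ram_gb": 0}
        self.used = {"cpu": 0, "ram_gb": 0}

    def contribute(self, cpu, ram_gb):
        """IT side: add physical capacity to the pool."""
        self.capacity["cpu"] += cpu
        self.capacity["ram_gb"] += ram_gb

    def allocate(self, cpu, ram_gb):
        """Virtual-enterprise side: consume resources, never hosts."""
        if (self.used["cpu"] + cpu > self.capacity["cpu"] or
                self.used["ram_gb"] + ram_gb > self.capacity["ram_gb"]):
            raise RuntimeError("quota exceeded")
        self.used["cpu"] += cpu
        self.used["ram_gb"] += ram_gb
```

The point of the design is that `allocate` exposes no host identity at all: the consumer sees only quantities, so IT retains full control of the physical layer.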

Delegate self-service provisioning to virtual enterprises in complete safety because of that separation. Virtual enterprise users access image libraries to spin up pre-configured corporate images that maintain company standards. IT no longer needs to spend days or weeks provisioning according to user demand.

Allocate resources to self-service virtual enterprises according to business policies. When a new VM is created (or restarted), the policies determine how that VM is deployed. For example, the CIO sets a policy for the compliance rules his enterprise must follow; that enterprise’s VMs would be automatically deployed based on that policy. Or let’s say the CIO wants only the most expensive hardware used for certain applications – IT sets a policy to make sure the VMs are automatically deployed accordingly. The same could apply for a green policy or even performance. Policies ensure that VMs are deployed automatically according to security, compliance, efficiency, cost and performance rules.
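A minimal sketch of that policy idea – a VM's policy tag selects the host pool and settings it must be deployed with, and anything without a policy is refused. The policy names and pools here are invented for illustration:

```python
# Hypothetical sketch of business-policy-driven placement: the policy,
# not a human, decides where and how each VM lands.

POLICIES = {
    "pci-compliance": {"pool": "isolated", "encrypt_at_rest": True},
    "premium-performance": {"pool": "high-end", "encrypt_at_rest": False},
    "green": {"pool": "low-power", "encrypt_at_rest": False},
}

def deploy(vm_name, policy_name, pools):
    """Place a VM according to its policy; refuse anything without one."""
    policy = POLICIES.get(policy_name)
    if policy is None:
        raise ValueError(f"no policy named {policy_name!r}; deployment refused")
    host = pools[policy["pool"]].pop()  # take a free host from the mandated pool
    return {"vm": vm_name, "host": host, "encrypted": policy["encrypt_at_rest"]}
```

The refusal path is the important part: nothing deploys outside policy, which is what makes the later claim about mitigating security and compliance concerns credible.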

How do we really benefit?

IT responsiveness skyrockets because the time that was previously devoted to provisioning can be used elsewhere. Value-add activities like capacity planning are now possible. Increased agility due to on-demand deployment enables development teams to test what-if scenarios. Utilization greatly improves and server efficiency can be optimized. Security and compliance concerns are mitigated because the system cannot deploy anything unless it adheres to policy. Virtual sprawl is minimized because virtual enterprises manage their VMs under resource limits – encouraging them to take down defunct machines to free up unused resources when they approach their limits. Users are empowered to control their own VMs, IT has better control over resources, and the CIO can control costs and budgets with business policies.

How do we actually implement this?

This kind of business-policy-driven automation is only possible with the right management tool, one that integrates with your existing management tools and is fully customizable to your needs. It should enable you to avoid vendor lock-in, which prevents your business from being as competitive as possible. Gartner Group reports that by 2012, 49 percent of enterprises expect to have a heterogeneous virtual environment. Enterprises will want to use the free hypervisors for non-critical applications but still be able to use the expensive hypervisors when necessary.

To fully realize the benefits of cloud computing, IT departments must be empowered with enterprise-class cloud management software built on open standards and the three fundamentals, so they can manage their entire, globally deployed infrastructure. Without the Virtualization 2.0 trifecta of separation, delegation and allocation, any cloud solution will suffer from the same problems as Virtualization 1.0. With the new model, however, the load placed on IT staff can be reduced and savings can be realized through policy-based dynamic provisioning and minimal management effort.

Without the capabilities and policies of Virtualization 2.0 in place, CIOs may find their heads stuck where their data is not – in the clouds.

Source


In the Cloud, Governance Trumps Ownership

In more than a decade of talking about cloud computing, I have found the principle of ownership has been a recurring theme. People feel comfortable owning their computing. They know where they stand. Since cloud computing means giving up ownership, it makes people uncomfortable, uncertain of their ground.

But while there’s comfort in ownership, it’s not of itself a guarantee of security or certainty. People often talk of the risks of trusting computing that lies “outside the firewall,” as if cloud computing providers don’t use firewalls. Of course they do, and in many cases, their firewalls are more robust and better policed than the average enterprise firewall. What the phrase really means is, “outside my firewall.” There’s an implicit assumption that it must be better, simply because it’s mine.

Even if I concede that it might not be the most secure device imaginable, at least I know I can trust it. It’s sitting on my own premises, configured and managed by my own staff, and up-to-date with my organization’s current security and access policies.

Or is it?

We use the term ‘on-premise’ to describe computing that’s within the domain of an organization. But it doesn’t always mean what it appears to mean. Many acres of so-called on-premise computing assets are actually deployed elsewhere, at co-location centers and facilities management sites. The organization trusts the operators of those third-party premises to control access and security.

In larger organizations, it’s not even safe to assume that staff working on your own site are direct employees. With many IT consultants and other administration staff either outsourced or brought in as contractors, the assumption that on-premise assets are configured and maintained by the organization’s own direct employees ignores the facts on the ground.

At least the organization still sets its own processes and policies. With proper procedures in place for ensuring everyone knows the rules and puts them into practice, you can be confident that the IT infrastructure is operating as it should and that any risks and threats are correctly managed.

And how do you do that?

The real reason we like ownership is that, whenever we need to, we know we can just walk in and make a hands-on assessment of the situation on the ground. If we’re honest with ourselves, that sense of direct, actionable accountability is probably covering a multitude of sins. We know there are times when our own people or our contractors, whether through lack of training, process flaws or sheer carelessness, get things wrong. We probably tolerate errors within our own organization that we would never accept from a third-party provider because we know we have the power to put things right to our own satisfaction if we ever need to.

Yet in a modern IT infrastructure, there are other ways of controlling proper policy and process. The technology allows us to instrument, verify and audit whether procedures are being followed correctly. Accountability, governance, compliance and problem resolution are no longer dependent on physical access. It can all be done electronically in real-time.
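As a toy illustration of that electronic, real-time verification – the user names, resources and policy table below are all hypothetical – a governance check can be as simple as replaying access events against a declared policy, with no site visit involved:

```python
# Hypothetical sketch of electronic governance: compare what actually
# happened (the event stream) against what policy allows.

ALLOWED = {
    "alice": {"hr-db"},
    "bob": {"hr-db", "web-logs"},
}

def audit(events):
    """Return every access event that violates the declared policy."""
    return [e for e in events
            if e["resource"] not in ALLOWED.get(e["user"], set())]
```

Run continuously against a provider's event feed, a check like this replaces the hands-on inspection that ownership used to stand in for.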

Using a third-party cloud computing provider can therefore be just as trustworthy and certain as relying on in-house resources, provided the instrumentation and governance of policy and process is as good. In practice, this is one area where public cloud providers did not begin well. Some providers espoused an arrogant mirror-image of the “it’s my firewall” mindset: “We don’t publish an SLA, you can trust us because we’re a big, friendly online brand.”

Fortunately, those attitudes are now being challenged. For customers willing to pay the extra cost, the current generation of cloud providers offers better transparency into processes, a more granular choice of policy settings and enterprise-grade instrumentation and reporting. Because investments are pooled across the entire customer base, a cloud provider can operate the technology at a larger scale and sophistication than most of their customers would wish or need to do individually.

There’s still some work to do to establish the process and policy stipulations it’s reasonable to demand from third-party providers. Enterprises must focus on specifying the results they want, rather than attempting to constrain the provider’s underlying technology and operational choices in unnecessary detail. But in principle, a proper governance infrastructure is capable of delivering more control from a third-party provider than most enterprises realistically have over what happens today within their own on-premise IT.

Ownership is not the critical factor here. What matters is having the right mechanism in place for proper accountability and governance.

Source


5 Overlooked Threats to Cloud Computing

A lack of understanding about security risks is one of the key factors holding back cloud computing.

Report after report after report harps on security as the main speed bump slowing the pace of cloud adoption. But what tends to be overlooked, even by cloud advocates, is that overall security threats are changing as organizations move from physical environments to virtual ones and on to cloud-based ones.

Viruses, malware and phishing are still concerns, but issues like virtual-machine-launched attacks, multi-tenancy risks and hypervisor vulnerabilities will challenge even the most up-to-date security administrator. Here are 5 overlooked threats that could put your cloud computing efforts at risk.

1. DIY Security.
The days of security through obscurity are over. In the past, if you were an anonymous SMB, the threats you worried about were the typical consumer ones: viruses, phishing and, say, Nigerian 419 scams. Hackers didn’t have enough to gain to focus their energy on penetrating your network, and you didn’t have to worry about things like DDoS attacks – those were a service provider problem.

Remember the old New Yorker cartoon: “on the Internet no one knows you’re a dog”? Well, in the cloud, no one knows you’re an SMB.

“Being a small site no longer protects you,” said Marisa S. Viveros, VP of IBM Security Services. “Threats come from everywhere. Being in the U.S. doesn’t mean you’ll only be exposed to U.S.-based attacks. You – and everyone – are threatened from attackers from everywhere, China, Russia, Somalia.”

To a degree, that’s been the case for a while, but even targeted attacks are global now, and if you share an infrastructure with a higher-profile organization, you may also be seen as the beachhead that attackers can use to go after your bigger neighbors.

In other words, the next time China or Russia hacks a major cloud provider, you may end up as collateral damage. What this all adds up to is that in the cloud, DIY security no longer cuts it. Also, having an overworked general IT person coordinating your security efforts is a terrible idea.

As more and more companies move to cloud-based infrastructure, only the biggest companies with the deepest pockets will be able to handle security on their own. Everyone else will need to start thinking of security as a service, and, perhaps, eventually even a utility.

2. Private clouds that aren’t.

One way that security-wary companies get their feet wet in the cloud is by adopting private clouds. It’s not uncommon for enterprises to deploy private clouds to try to have it both ways. They get the cost and efficiency benefits of the cloud but avoid the perceived security risks of public cloud projects.

Plenty of private clouds, though, aren’t all that private. “Many ‘private’ cloud infrastructures are actually hosted by third parties, which still leaves them open to concerns of privileged insider access from the provider and a lack of transparency to security practices and risks,” said Geoff Webb, Director of Product Marketing for CREDANT Technologies, a data protection vendor.

Much of what you read about cloud security still treats it in outdated ways. At the recent RSA conference, I can’t tell you how many times people told me that the key to cloud security was to nail down solid SLAs that cover security in detail. If you delineate responsibilities and hold service providers accountable, you’re good to go.

There is some truth to that, but simply trusting a vendor to live up to SLAs is a sucker’s game. You – not the service provider – will be the one who gets blamed by your board or your customers when sensitive IP is stolen or customer records are exposed.

A service provider touting its security standards may not have paid very close attention to security. This is high-tech, after all, where security is almost always an afterthought.

3. Multi-tenancy risks in private and hybrid clouds.
Many companies, when building out their private or hybrid clouds, are hitting walls. The easy stuff – test and development environments, file and print services – has already been virtualized.

“A lot of companies have about 30 percent of their infrastructure virtualized. They’d like to get to 60-70 percent, but the low-hanging fruit has all been picked. They’re trying to hit mission-critical and compliance workloads, but that’s where security becomes a serious roadblock,” said Eric Chiu, President of virtualization and cloud security company HyTrust.

Multi-tenancy isn’t strictly a public cloud issue. Different business units – often with different security practices – may occupy the same infrastructure in private and hybrid clouds.

“The risk to systems owned by one business unit with good security practices may be undermined by the poor security practices of a sister business unit. Such things are extremely difficult to measure and account for, especially in large, multinational organizations,” Webb said.

Another issue is application tiers. In poorly designed private clouds, non-mission critical-apps often share the same resources as mission-critical ones. “How do most companies separate those?” asked Chiu.

“They air-gap it, so the biggest threat for most virtualization and private cloud environments is misconfiguration,” he said. “Eighty percent of downtime is caused by inappropriate administrative changes.”

4. Poorly secured hypervisors and overstressed IPS.
Every new technology brings with it new vulnerabilities, and a gaping cloud/virtualization vulnerability is the hypervisor.

“Many people are doing nothing at all to secure virtualized infrastructures. The hypervisor is essentially a network. You have whole network running inside these machines, yet most people have no idea what sort of traffic is in there,” Anthony said.

Buffer overflow attacks have been successful against hypervisors, and hypervisors are popping up in all sorts of devices that people wouldn’t think of as having them, including Xbox 360s.

Even when organizations believe that they have a handle on the traffic within their cloud environments, they may be fooling themselves, especially if they are relying on legacy security tools. Everyone knows that they need an IPS solution to protect their cloud deployments, but they have no idea what the actual scale of the problem is.

Moreover, many of these appliances have packet inspection settings that fail open by default. In other words, if the device is overwhelmed with, say, video traffic, the majority of traffic passes through as safe and only small samples are inspected for threats.

The IPS will typically trigger a low-level alarm or record this spike in a log, but how many IT units have time to look at logs unless they know they have a problem? Organizations are also slow to realize that they need a different array of protections in virtualized cloud environments than they had in traditional on-premise settings. Or they do realize this and are choosing to ignore it due to budget and time constraints.

The IBM security executives I talked to at RSA ticked off a number of security solutions they would recommend to better protect cloud environments, including IPS solutions with 20 Gbps capabilities, DLP and application security. Much of their advice boiled down to this (see item #1 again): security is becoming too big a problem for most organizations to tackle on their own.

5. Insider threats.

Are insider threats keeping you up at night now? Unfortunately, virtualization and the cloud ramp up the risk of insider threats – at least for the time being.

“A smaller number of administrators are now likely to have access to a greater amount of hosted data and systems than ever before, as the cloud systems are managed by a cloud infrastructure management team. This can leave sensitive data open to access by individuals who previously did not have access to it, eroding separation of duties and practices and raising the risk of insider attacks,” Webb said. The ability to walk off with key assets is also simply much easier to do, rights or not, in a virtualized environment than a physical one.

“When the banking restrictions came out, people were worried about someone walking into the physical data center, grabbing a rack of tapes and walking off with it,” Chiu said. Those fears spurred much wider adoption of encryption for data at rest.

How do you steal those same assets in a virtual environment, where data encryption is often still an oversight?

“If you have administrative credentials, you pick the virtual machine you want, right click and copy it,” Chiu said. It’s not that hard to spot someone walking out of the building with a box of tapes. A virtual machine on a USB drive isn’t going to raise a single eyebrow.

Source


How to Get The Benefit of The Cloud

Cloud computing is a boon, but its vectors need to be kept on a short leash, says Mushegh Hakhinian, security architect at IntraLinks.

The pace of business today requires that critical information be accessible anywhere, anytime, often among both internal and external parties. Hosted services are an important tool to enable this, but while the tools facilitate communication, they bring additional risk and challenges to the firms that use them. Technology is rapidly evolving to meet this challenge.

One example of this new technology is the use of cloud-based services. Despite the benefits they offer – an enhanced security network effect, lowered costs, easy implementation and on-demand capacity – there are some significant impediments. After moving to the cloud, it can be difficult for organisations to demonstrate compliance. Many firms, to quickly capitalise on lower costs, move into the cloud without checking the compliance or security ramifications. IT teams feel they may lose control over location or access rules for data and the first audit often shuts down the project and indefinitely suppresses the appetite to try a cloud-based service.

So, to let your company gain the benefits of cloud computing while you keep your sanity, it is imperative to ensure that any provider you consider can both keep you compliant and pass your security audits. At a minimum, a cloud services provider must be able to account for the location of your data at all times. Make sure it is protected from theft while in their custody, both at rest and in transit. Demand audit trails that report who accessed data and when. Demand flexible data classification and authorisation schemes that give you control over the setup and over who can access which data. Additionally, your provider should offer options for ‘stronger-than-password’ authentication and protection of data ‘in-use’.
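One of those demands – an audit trail of who accessed data and when – is worth little if the provider can quietly edit it. A hash-chained log makes tampering detectable; the sketch below is a toy illustration of the idea, not a feature of any particular provider:

```python
import hashlib
import json

def append_entry(log, user, action, resource):
    """Append a hash-chained audit entry; editing any earlier entry
    breaks every hash that follows it."""
    prev = log[-1]["hash"] if log else "0" * 64
    entry = {"user": user, "action": action, "resource": resource, "prev": prev}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)
    return log

def verify(log):
    """Recompute every hash; return True only if the chain is intact."""
    prev = "0" * 64
    for e in log:
        body = {k: e[k] for k in ("user", "action", "resource", "prev")}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True
```

A customer who periodically records the latest chain hash can then verify the provider's full trail independently.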

Basic security considerations

The foundation of every service is the infrastructure on which it runs. Saving money by selecting a cut-rate data-hosting facility is not a good strategy if the target customer base is security-savvy or regulated, such as financial services or life sciences. Hosting providers must have security certifications such as SAS 70 Type II and ISO 27001.

Disaster recovery sites should be of equal quality and certification, so that switching between facilities will not put the service provider or the customer out of compliance. The vendor’s network design must provide a means to separate customer data or provide the architecture for database and application tiers to enforce data boundaries. A good design will also make applications independent from data location, so physical files can be stored in any geographic location.

If a provider hosts customer data, then they take the responsibility to ensure its confidentiality and integrity. There is always the potential for leaks to happen, whether by accident or by malice. Providers must ensure lost data is unusable by unauthorised users. Cryptography is key and should be implemented in a way that one insider will not be able to compromise data confidentiality. Strong encryption algorithms and multi-tier key management systems are a must. Protecting information in transit is also important and typically requires SSL 3.0 or higher. Ensuring data cannot be compromised in transit is worth it.
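The multi-tier key management the author calls for is often realised as envelope encryption: each object gets its own data key, and only that key (not the data) is protected by a master key held in a separate tier. The sketch below illustrates the key hierarchy only – the XOR "cipher" is a deliberate toy stand-in and must never be used in place of a real algorithm:

```python
import os

def xor(data, key):
    """Toy stand-in for a real cipher - do NOT use XOR in production."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def envelope_encrypt(plaintext, master_key):
    """Two-tier key management: a fresh data key per object, itself
    'wrapped' under a master key held by a separate tier."""
    data_key = os.urandom(16)
    return {"ciphertext": xor(plaintext, data_key),
            "wrapped_key": xor(data_key, master_key)}

def envelope_decrypt(blob, master_key):
    data_key = xor(blob["wrapped_key"], master_key)
    return xor(blob["ciphertext"], data_key)
```

This is also why "one insider will not be able to compromise data confidentiality": an operator with access to stored blobs but not the master-key tier holds only wrapped keys and ciphertext.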

People who handle customer data can be targeted and there is the possibility of someone trying to steal data that has been stored. Comprehensive background checks can help to eliminate weak links. Maintaining detailed audit logs helps in identifying and prosecuting data thieves. It may even prevent the loss, as unusual activity can trigger investigation and put the brakes on stealth attempts.

It is important to keep the vulnerability surface of the system to a minimum. By using cryptography, it is possible to reduce the data that needs protection down to the size of the key.

Not all business information needs the same level of protection. Marketing collateral needs to be accessed by all, while financials have to be kept behind strong locks. By properly setting user roles, you can streamline complex authorisation schemes and make user administration easier and less error-prone.
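The role-based idea above – marketing collateral open to all, financials behind strong locks – reduces to a small mapping from roles to data classifications. The role and classification names here are hypothetical examples:

```python
# Hypothetical role scheme: coarse roles map to the data classifications
# they may read, keeping user administration simple and auditable.

ROLES = {
    "employee": {"marketing"},
    "finance": {"marketing", "financials"},
    "auditor": {"marketing", "financials"},
}

def can_access(role, classification):
    """Default-deny: unknown roles and classifications get nothing."""
    return classification in ROLES.get(role, set())
```

Administering a handful of roles, rather than per-user grants, is what makes the scheme "easier and less error-prone" as the author puts it.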

The measures described above are the basic or ‘table-stakes’ security necessary to launch a public cloud-based service, but, unfortunately, some popular cloud providers lack even those and the jury is still out on whether they will ever be able to bolt on enough security. Properly implemented security features such as cryptography and granular authorisation schemes attached to highly segmented data help achieve premium security.

Advanced security considerations

Most online applications require only an ID and password for login. The reason for this is not purely technical: users of modern online business applications are increasingly ‘consumer-like’. Strong authentication introduces major inconveniences for users, the equivalent of putting the vault door at the entrance to a bank. That is why it makes sense to organise data within an online application, so high-level security is used to protect only sensitive data and actions.

There are two choices: predetermine parts of the application that require higher security, eg banking transactions or access to intellectual property; or, allow the data owner to specify which information needs extra protection. This second option, though more difficult, is preferable, especially for multi-tenant applications. It brings us closer to delegating security decisions to people who know the most about the data.

Once it is downloaded, information can be lost or stolen and you would not even know about it. In properly designed secure online systems, it can be more secure to share an electronic copy – there are technologies that allow tight access control and even remote shredding.

To address this, highly secure cloud systems implement in-use data protection, where content is always encrypted, even when it is sent to the user’s browser. Readers cannot open it without the key. When the owner removes access permission, the encryption key is destroyed, digitally shredding the remote document. Some have implemented a Diffie-Hellman key exchange, where the key is calculated each time before use.
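The "digital shredding" described above rests on one property: readers only ever hold ciphertext, so destroying the per-document key makes every outstanding copy unreadable. A toy model of that lifecycle (again, XOR stands in for a real cipher and the class is purely illustrative):

```python
import os

class SharedDocument:
    """Toy model of in-use protection: content is stored and shared only
    as ciphertext; reads require the key held by the owner."""

    def __init__(self, content):
        self._key = os.urandom(len(content))
        self.ciphertext = bytes(a ^ b for a, b in zip(content, self._key))

    def read(self):
        if self._key is None:
            raise PermissionError("access revoked: key destroyed")
        return bytes(a ^ b for a, b in zip(self.ciphertext, self._key))

    def revoke(self):
        self._key = None  # 'digital shredding': ciphertext is now useless
```

After `revoke()`, the ciphertext a reader may have cached is worthless, which is the remote-shredding effect the article describes.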

The cloud enables a critical business requirement to be implemented – secure information exchange. For that to take place, the infrastructure must be designed with built-in security, from basic encryption to multi-factor authentication in use-protected content. IT can regain control over security by ensuring it understands cloud security principles – and requiring strict adherence to those principles by cloud vendors.

Source


SaaS Won’t Succeed With Some Apps

Given all the hype about the software-as-a-service model, you’d think that it could be applied to every category of software. Not so, says a new report from Forrester Research Inc.

In fact, SaaS will be “a disruptive force” in software categories that account for about a quarter of global software spending but will have “little or no effect” on many of the 123 market segments studied, Forrester analysts Liz Herbert and Andrew Bartels wrote.

Forrester said that SaaS faces major obstacles in four broad software sectors:

• Lower-level elements of the stack, such as operating systems and databases.

• Software for internal IT management and data management.

• Entrenched process applications.

• Vertical applications, such as securities transaction processing systems.

Such systems account for 40% of all software spending, and Forrester’s report said they are likely to stay mostly in-house for “pretty obvious” reasons: security concerns, existing infrastructure investments, and the need to tightly integrate with other applications.

But SaaS is making inroads in mature application areas such as supply chain management, particularly among users who haven’t already purchased the same functionality in an on-premises product, according to the report.

Meanwhile, SaaS is starting to shake things up in areas like customer relationship management and human resources, where hosted offerings are replacing on-premises systems. SaaS is also moving into application development and the niche of governance, risk and compliance software, the analysts said.

The Forrester report said that SaaS is now the dominant model for software sales and delivery in areas such as e-purchasing, expense reporting tools, blogging and wikis.

Still, categories where SaaS has taken hold of at least 50% of revenue amount to only 3% of the total software market, Forrester said.

Source


Cloud, Virtualization Will Dramatically Change Security

When it comes to enterprise computing environments, the skies are getting increasingly cloudy–and dealing with that will mean covering up with flexible, dynamic security.

This was the message of Art Coviello, executive chairman of EMC’s RSA security division, during his keynote today at the RSA Conference in San Francisco. Last year, he told the audience, his speech was about the promise of the cloud–the assertion that it’s possible to achieve security and do it better. This year, his keynote was “about the proof.”

“At this point, the IT industry believes in the potential of virtualization and cloud computing,” he said. “IT organizations are transforming their infrastructures... But in any of these transformations, the goal is always the same for security–getting the right information to the right people over a trusted infrastructure in a system that can be governed and managed.”

To meet the demands of the cloud, virtualization security must accomplish three fundamental goals: be both logical and information-centric, become built into applications and infrastructure, and be risk-based and adaptive.

“In virtualized environments, static physical perimeters give way to dynamic logical boundaries defined by information and transactions themselves,” Coviello explained. “Logical boundaries form the new perimeters for trust, and virtual machines adapt security to their particular payload, carrying their policies and privileges with them as they travel across the cloud.”

Since information, virtual machines, and virtual networks can relocate in the blink of an eye, security measures in the cloud must be just as dynamic, he added.

“Achieving this means building security into virtualized components and, by extension, distributing security throughout the cloud,” he said. “Also, automation will be absolutely essential in enabling security and compliance to work at the speed and scale of the cloud. Policies, regulations, and best practices will be codified into security management systems and enforced automatically, reducing the need for intervention by IT staff–a problem that’s getting away from us today.”
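The idea of policies travelling with the workload and being enforced automatically can be sketched as policy-as-code. The names below (`VirtualMachine`, `CloudHost`, `enforce`) are invented for this example, not an RSA or VMware API; the point is that each VM carries its own security requirements, so any placement decision can be checked mechanically, without IT staff intervening.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualMachine:
    name: str
    # The policy travels with the VM across the cloud.
    policy: dict = field(default_factory=dict)

@dataclass
class CloudHost:
    name: str
    capabilities: set = field(default_factory=set)

def enforce(vm: VirtualMachine, host: CloudHost) -> bool:
    """Automated check: a host may run the VM only if it
    satisfies every capability the VM's policy requires."""
    required = set(vm.policy.get("requires", []))
    return required <= host.capabilities

# A workload handling payment data carries strict requirements with it.
pci_vm = VirtualMachine("billing",
                        policy={"requires": ["encrypted_storage", "audit_logging"]})
hardened = CloudHost("host-a", {"encrypted_storage", "audit_logging", "hsm"})
basic = CloudHost("host-b", {"encrypted_storage"})

assert enforce(pci_vm, hardened)   # placement allowed
assert not enforce(pci_vm, basic)  # blocked automatically
```

Encoding the policy as data rather than as a manual checklist is what lets enforcement happen “at the speed and scale of the cloud”: the same check runs identically wherever the VM lands.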

On Monday, RSA announced the Cloud Trust Authority, a set of cloud-based services designed to facilitate secure and compliant relationships among organizations and cloud service providers. Within its inaugural set of capabilities is an Identity Service powered by VMware’s forthcoming Project Horizon. EMC also announced the new EMC Cloud Advisory Service with Cloud Optimizer.

Enterprises are facing tremendous change across information, identities, and infrastructure that is, in turn, creating challenges in control and visibility, Coviello said. Virtualization and the cloud have the power to change the evolution of security dramatically in the years to come, he added.

“Virtualization is the cloud’s silver lining because [it] fuels the cloud’s ability to surpass the level of control and visibility that physical IT delivers,” he said.

Source
