Cloud computing services are becoming the norm in business, and the total size of the industry has more than tripled in five years: from an estimated value of about $46 billion in 2008 to a projected value of more than $150 billion by 2014. The following cloud computing infographic illustrates the growth of cloud services:
The use of cloud computing is growing, and according to Gartner it will account for the bulk of new IT spend by 2016. That will be a defining year for cloud, as private cloud begins to give way to hybrid cloud, and nearly half of large enterprises will have hybrid cloud deployments by the end of 2017.
“Overall, there are very real trends toward cloud platforms, and also toward massively scalable processing. Virtualization, service orientation and the Internet have converged to sponsor a phenomenon that enables individuals and businesses to choose how they’ll acquire or deliver IT services, with reduced emphasis on the constraints of traditional software and hardware licensing models,” said Chris Howard, research vice president at Gartner. “Services delivered through the cloud will foster an economy based on delivery and consumption of everything from storage to computation to video to finance deduction management.”
“In India, cloud services revenue is projected to have a five-year projected compound annual growth rate (CAGR) of 33.2 percent from 2012 through 2017 across all segments of the cloud computing market. Segments such as software as a service (SaaS) and infrastructure as a service (IaaS) have even higher projected CAGR growth rates of 34.4 percent and 39.8 percent,” said Ed Anderson, research director at Gartner. “Cloud computing continues to grow at rates much higher than IT spending generally. Growth in cloud services is being driven by new IT computing scenarios being deployed using cloud models.”
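The compounding behind those CAGR figures can be checked directly. Below is a minimal sketch using the 33.2 percent figure from the quote; the base value is normalised to 1.0 and is a placeholder, not a Gartner revenue number:

```python
def project(base: float, cagr: float, years: int) -> float:
    """Compound a base value forward at a constant annual growth rate."""
    return base * (1 + cagr) ** years

# A 33.2% CAGR sustained over the five years from 2012 through 2017
# more than quadruples the starting value (a factor of roughly 4.2).
growth_factor = project(1.0, 0.332, 5)
```

This is why growth rates in the mid-30s, modest-sounding year over year, translate into a severalfold expansion over a five-year window.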
Rule 1: The network is everything
Companies that move applications and data to the cloud, or build new systems on cloud-based platforms, often don’t consider the network infrastructure. When you rely on systems connected via the network, the network is everything: slow networks mean slow systems and poor performance.
Rule 2: Applications not optimized for cloud-based platforms rarely perform well
Many enterprise IT pros believe they can lift an application from a traditional on-premises platform, place it on a public cloud without a significant amount of redesign, and everything will end up fine. But applications that are not optimized for cloud-based platforms cannot perform optimally on them; the result is higher operational costs and substandard performance.
Rule 3: Consider the data
The manner in which the data is linked to the application is very important to cloud computing performance. Data that’s tightly coupled to the application may not be able to take advantage of many performance-enhancing features of public clouds, such as placing database processing in a series of elastic instances or using database as a service in the host public cloud. You should place the data in its own domain to provide alternatives for faster performance, as well as the opportunity to reduce costs.
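One common way to keep data in its own domain is to resolve the database endpoint from configuration rather than hard-coding a co-located database, so the data tier can later move to elastic instances or a database-as-a-service offering without code changes. A minimal sketch, where the `DATABASE_URL` variable name and the default URL are illustrative assumptions, not any specific provider's convention:

```python
import os

def get_database_url() -> str:
    # Resolve the data tier's location from the environment so the
    # application is not tied to a locally attached database; the
    # local default keeps development working with no extra setup.
    return os.environ.get("DATABASE_URL", "postgresql://localhost/appdb")
```

With this indirection, pointing the same application at a managed database service becomes a deployment-time configuration change rather than a redesign.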
Author: David Linthicum
DeCesare said businesses and high-tech companies will have to adopt a security-by-design strategy if they wish to remain ahead of the threats they face. He was speaking during a keynote at the McAfee Focus conference.
“We have to figure out how to integrate security into [networks] from the get-go. We have to redefine the role of network security. Companies are going to have to change. All companies will be rebuilding their networks,” he said.
DeCesare cited new trends resulting from developments in mobile and cloud technologies, such as bring your own device (BYOD), as proof of the weaknesses of current networks.
“We are asking so much of our networks these days, not just with security, but in general. But, when we designed these networks five or 10 years ago we did not contemplate what we’d be asking of them today: to ingest the concept of a public or a private cloud, adjusting to the parameters of BYOD,” he said.
“We’re also asking them to be able to offer higher levels of security on any bit of information or device connecting to them and we’re balancing that with the concept of software-defined networking. What is happening in the network space is the same thing that was happening with the data centre space over the last 10 years – virtualisation is coming.”
From my experience working with academies, shared cloud services are the new de facto standard for federations or groups of schools working together, and this is being mirrored more widely across the UK education sector.
For schools wondering what level of cloud computing would work best for them, a good approach is to start with the desired outcomes and work backwards, rather than leading with a particular solution.
Our physical teaching spaces, preferred methodology and school vision for learning all go towards creating learning environments, each variant having its own specific IT needs.
In my experience, people rarely consider the impact of these things and how the IT is affected. For instance, schools with large open-plan learning spaces containing many students and teachers require a different ICT solution to those with classrooms built around the more traditional instructional model with the teacher leading at the front of a room.
What is important to consider is that IT should be an enabler, removing situations where an agile teacher who wants to use tablets is stuck in a classroom with 30 fixed PCs facing the front.
There are three key factors schools should consider when looking at cloud computing – cost, agility and the third millennium learning network.
With capital funding from the government decreasing, if schools are
Cloud computing offers businesses the opportunity to outsource their IT services, saving both time and money. This is highly attractive, as businesses do not need to understand the devices they are using, but do companies really understand the implications of outsourcing their business services to a cloud computing provider?
Cloud computing is the next stage in the Internet’s evolution, providing the means through which everything can be delivered as a service, wherever and whenever you need it. Businesses such as Google and Amazon already have most of their IT resources in the cloud, allowing them to eliminate many complex constraints, including space, time, power, and cost. Yet, for all its advantages, cloud computing still makes some businesses a little uncomfortable, as it requires them to think about data in a different way, specifically regarding safety and the trustworthiness of third parties handling the information.
Selecting your cloud computing provider
Trusting an outside company with something as important as data can be a difficult step for many businesses. But if you choose your hosting company carefully and undertake a formal due diligence exercise before committing to a supplier, you will enter into the contract with the right knowledge and expectations.
The most important thing to remember when you are working with your cloud computing provider is that you own your data and it still remains your responsibility to ensure it remains secure. The provider should manage the infrastructure and application availability, but they should
In my last post, I talked about the IT side of the “shadow IT” problem that cloud computing has created for companies all over the world. Cloud computing has enabled end-users to remove IT departments from the decision process when hiring solutions, especially software solutions, and also to keep these same IT departments out of the loop by using cloud-based applications that can be accessed through the browser, without any kind of installation or IT oversight.
This can cause numerous technology-related problems, from users running different applications that accomplish the same purpose (multiple CRM suites, for instance) all the way to a company ending up with completely incompatible software, so that data can’t be exchanged between applications, making systems integration impossible to achieve. Furthermore, most end-users don’t take the same precautions as IT when hiring solutions, so they often forget to perform the necessary due diligence on companies; they might hire providers that lack essential elements, such as proper SLA guarantees.
Technology, however, is only the first half of this issue. The emergence of shadow IT also has serious implications for company finances which, if ignored, can generate big long-term issues.
Cloud and the pay-as-you-go model
The pay-as-you-go pricing model is perhaps one of the greatest benefits that cloud computing has brought to the IT market. While it already existed before the advent of the cloud, it was cloud providers who truly made this business model popular amongst technology sellers and buyers alike.
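A minimal sketch of the difference this pricing model makes, with an invented rate (the $0.10 per instance-hour figure is purely illustrative, not any provider's price):

```python
HOURLY_RATE = 0.10  # dollars per instance-hour (assumed, for illustration)

def monthly_cost(instances: int, hours_per_day: float, days: int = 30) -> float:
    """Pay-as-you-go: cost scales with the hours actually consumed."""
    return instances * hours_per_day * days * HOURLY_RATE

# Four instances run only through a 10-hour business day cost a
# fraction of the same four instances billed around the clock.
business_hours = monthly_cost(4, 10)
around_the_clock = monthly_cost(4, 24)
```

Because spend tracks usage hour by hour, capacity switched off outside business hours simply stops costing money, which is precisely what a fixed up-front license cannot do.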
Virtualization has become the key technology underpinning ‘cloud-era’ IT infrastructure because the ability to deploy virtual instances of servers, desktop PCs, storage devices and network resources allows existing hardware to be utilised efficiently, and makes for much better manageability, flexibility and scalability.
The graphs below highlight adoption trends across various implementations and applications:
The hidden costs were explored in a new survey of 468 CIOs, conducted by Research In Action and underwritten by Compuware Corporation. There’s no question a lot of money is going into cloud projects — two-thirds of the respondents say that cloud is their top investment area for 2013. Many have crunched the numbers for the up-front costs of cloud computing, which typically include subscription fees, combined with some level of staff time required for set-up, user training and integration.
The study found that the majority of CIOs (79 percent) actually think a lot about potential hidden costs, and what they may mean to the business.
From a management perspective, the top cloud computing concerns are:
- Poor end-user experience due to performance bottlenecks (64 percent). This extends to the customer experience as well: the survey finds e-commerce is the leading cloud application area, with 78 percent of respondents already using cloud resources to support it.
- The impact of poor performance on brand perception and customer loyalty (51 percent).
- Loss of revenue due to poor availability or performance of cloud services, or time spent troubleshooting them (44 percent).
- Increased costs of resolving problems in a more complex environment (35 percent).
- Increased effort required to manage vendors and service level agreements (23 percent).
Of course, these costs are difficult to quantify and measure. An outright outage or system failure is easy to quantify, and can be relatively simple to address. But a slowdown or partial glitch somewhere in the
The Internet has come a long way since 400 Arpanet users received the first spam message in May 1978. Journey back to 1969, when the U.S. military-funded research network Arpanet connected four computers, to take a look at the Internet, then and now.
Renamed “the Internet” in 1984, the service reached a milestone moment when it linked 1,000 hosts at university and corporate labs. Almost 15 years later, the Web began making commercial strides, reaching 50 million users in 1998, and later topping 1 billion in 2009.
Today, more than 2.7 billion people — 47 percent of the world’s population — use the Internet in 750 million households across the globe.