Is desktop virtualization ready for prime time? The data illustrated in this infographic, based on data from the Citrix Global Security Index and Forrester Research, indicates that, indeed, desktop virtualization’s time may well have arrived.
The demand for desktop virtualization is being driven, at least in part, by the explosive growth in mobile work styles. That’s a key takeaway from a new global market study commissioned by virtual solutions provider Citrix, whose virtualization, networking and cloud solutions are delivered to more than 100 million corporate desktops daily.
The survey found that 55 percent of responding companies will deploy desktop virtualization for the first time by 2013. Of those respondents, 86 percent cited security as the biggest reason, saying desktop virtualization is a strategic choice for improving security in an age of multiple devices.
Security Joins Savings
The white paper describing the study noted that “familiar advantages of desktop virtualization include the ability to enable a more flexible workplace,” provide support for mobile workers, and effectively manage the variety of devices typically found in an organization. It noted that security now joins savings as a reason in favor of desktop virtualization.
The kinds of security that are driving desktop virtualization include the need for secure access for mobile and user-owned devices, increased security requirements for apps and data, the desire to be able to accommodate an increasingly mobile workforce, and simplified risk management.
Citrix Chief Security Strategist Kurt Roemer said in a statement accompanying the survey that desktop virtualization offers centralized control and management of software and devices, as well as “granular, policy-based access control and support for compliance requirements.” With its infrastructure level of information governance, he said, it enhances risk management.
Provisioning, Isolation, Wiping
Other benefits found useful by IT managers in the study include the ability for immediate provisioning of security updates, apps and access, which was identified as a key benefit by 60 percent of respondents. Some 54 percent believed that the instant isolation of a compromised application was a key benefit of desktop virtualization, while 32 percent identified the ability to remotely wipe data from devices.
The survey also found that virtually all respondents — 95 percent of those surveyed — believed that virtualization was effective in protecting information, while still allowing employees the ability to get the information they needed to do their jobs.
For device-related issues, nearly three-quarters of the surveyed IT decision makers see desktop virtualization as a way to immediately update an entire fleet of computers and devices, and 66 percent felt that the ability to deploy applications securely was a critical component in their decision to implement the technology.
The survey was conducted by independent market research firm Vanson Bourne, under a commission from Citrix. The survey covered 1,100 senior IT managers in October in 11 countries, including the U.S., U.K., The Netherlands, Germany and Canada. Three-quarters of those surveyed worked at enterprises of more than 1,000 employees, while the rest were in companies of 500 to 999 employees.
Author: Barry Levine
Video showing AutoCAD 2011 running in a VDI session delivered by XenDesktop over an Internet connection.
The back end uses an NVIDIA Quadro FX 1800, and the drawing demonstrates what the GPU can handle.
- When “desktop virtualization” is used to describe making it possible for people to access a physical or virtual system remotely, access virtualization technology is used to capture the user interface portion of an application. It is then converted to a neutral format and projected across the network to a device that can display the user interface and allow the user to enter and access information. This means that just about any type of network-enabled device could be used to access the application. Suppliers such as Citrix, Microsoft, and VMware offer client software for tablets, smartphones, laptops, and PCs, making it possible for users of those devices to access the applications running elsewhere on the network.
- When “desktop virtualization” is used to describe encapsulating an application using client-side application virtualization technology and then projecting it in whole or piecemeal to a remote system for execution, the application could either remain on that client device or be deleted once the user completes the task, depending on the settings used by the IT administrator. This means, of course, that the client system has to run the operating system needed by the application. So, Windows applications, for example, would need to run on Windows executing on a PC or laptop.
- When “desktop virtualization” is used to describe encapsulating the entire stack of software that runs on a client system, the phrase starts to take on a great deal of complexity. That encapsulated virtual client system becomes highly mobile. Here are the possibilities:
- Client-side execution. One or more virtual client systems could execute on a single physical client system. This allows personal applications to run side by side with locked-down corporate applications.
- Local execution. Virtual client systems could run on a local blade server. The user interface is projected to physical PCs, laptops, or thin client systems using access virtualization technology.
- Remote execution. Virtual client systems could run on a server that resides in the organization’s data center. The user interface is projected to physical PCs, laptops, or thin client systems using access virtualization technology.

Since the industry is using the same phrase to describe all of these different approaches, the concept of desktop virtualization can be quite confusing to those unfamiliar with all of the different types of technology that could be pressed into service.
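To make the first of these approaches concrete, here is a toy, in-process model of access virtualization, in which the application runs on a server and only a neutral description of its user interface crosses the network. The format and function names are invented for illustration; real products from Citrix, Microsoft, or VMware use far richer protocols.

```python
import json

# Toy model of access virtualization: the application runs server-side,
# and only a device-neutral description of its UI "crosses the wire".
# The JSON format and all names here are illustrative, not any vendor's
# actual protocol.

def render_ui(state):
    """Server side: produce a device-neutral description of the UI."""
    return json.dumps({
        "title": state["app"],
        "widgets": [{"type": "label", "text": line} for line in state["lines"]],
    })

def display(payload):
    """Client side: any device that can parse the neutral format can show the UI."""
    ui = json.loads(payload)
    out = [ui["title"]]
    out += [w["text"] for w in ui["widgets"] if w["type"] == "label"]
    return "\n".join(out)

state = {"app": "Remote CAD viewer", "lines": ["Drawing 1", "Layer: walls"]}
wire = render_ui(state)   # this string is what would travel over the network
print(display(wire))
```

Because the client only needs to parse and render the neutral format, the same payload could drive a tablet, smartphone, or thin client without the application ever leaving the server.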
Author: Dan Kusnetzky
It looks like the end for the PC. Unlike the dinosaurs, the PC will not simply vanish from the face of the Earth; PCs will just stop being replaced and gradually die out, surviving only on the desks of die-hard antiquarians and IT folk who like to tinker with the insides of anything that can be opened.
I can even date when this process began. It was the day, the 18th of August to be precise, when HP announced it had bought UK software firm Autonomy for £7.1bn, then added, almost as an afterthought, that it was considering selling its personal systems group, which includes the world’s biggest PC-making business, and that it would discontinue its webOS devices.
HP has some very complex problems to solve and has been through some bad times, but this decision is about more than surviving them: it is about facing the commoditisation of the PC squarely.
It is about realising that the major choice a consumer or a corporation faces now is not about hardware. The days when it was all about buying the correct IT infrastructure are long gone – today it is about recognising the realities of mobile computing.
Mobile computing isn’t about providing your staff with the best laptop, smartphone and tablet to keep them productive on the road – the hardware is now so reliable that the choice is often left to the users – something that would have been unthinkable a decade ago.
How on Earth would they be able to choose the most reliable laptop? Now a consumer can walk into any retail park and buy a tablet computer with no moving parts that will last until it is lost or discarded.
Many young people are doing much of their computing on their smartphones where the choice of hardware is, once again, almost irrelevant. Young people live their lives on the move from home to school to University to work and they want devices they don’t have to pack into a car to lug around.
They also use the Cloud instinctively – what is iTunes but a cloud service providing music? The popularity of web mail means few youngsters have ever configured a Microsoft Outlook client.
The only thing stopping most businesses moving to tablets is purely to do with their culture and environments. We are heading for devices that will run almost exclusively over the internet. I am writing this piece on a tablet connected to the internet using my virtual desktop. I don’t even know where my old laptop is now.
Most people are now happy to use email, banking and other services via a browser. While it took some time for people to become comfortable with banking online, they took to it eventually as they discovered it was safe. What we are seeing is the rapid expansion of available services, with more traditional PCs being replaced by the tablet form and laptops taking over the arena of the PC.
Most users will be happy with a screen and internet access, and the ability to buy or use applications that provide the services they need, without the need for bulky operating systems.
The benefits of Cloud Computing for businesses are now so compelling that any new business setting up would be very unlikely to provide its employees the internal IT infrastructure that was typical only a few years ago.
Why force upon yourself the overheads of PCs, which carry a high cost of ownership, need frequent repairs because of the number of moving parts, and require full-time staff to keep them up to date and running? No business person would do it unless they had compelling reasons.
The Cloud is also greener. Cloud data centres use virtualisation, which cuts down the number of servers needed, and power consumption falls accordingly. HP is merely following the example set by IBM when it dropped out of the PC hardware market to concentrate on services.
It will be interesting to see how Microsoft and the other PC-centric vendors adapt to the new world of the Cloud. For the moment the consumer (user) is in control, something the old PC departments of corporates fought fiercely against. But, like the dinosaurs, they lost the battle.
Many small-business owners may not realize that the cloud plays a big role in their business operations, and its importance is growing every day. I’m often asked, “What exactly is the cloud, and why does my company need it?” Simply put, the cloud hosts resources and applications that are accessed through the Internet, and it now offers small businesses access to powerful capabilities that once were only within reach of larger corporations.
From a small-business perspective, the cloud provides the opportunity to leverage outsourced hardware and scalable infrastructure, rather than requiring small businesses to make large in-house IT investments. Without the cloud, hardware and software purchases could significantly eat into profitability — as would the management and maintenance required to keep the technology running — not to mention the additional real estate investment required to house the hardware on-site. By investing in a cloud service, small businesses can manage technology in a cost-effective way while staying technologically competitive.
Always On, Always There
The cloud is increasing the effectiveness of small businesses in a number of ways. For example, companies are calling on efficient Web-based services and applications to manage such critical tasks as accounting, customer relationship management, document creation and communication.
One reason the cloud has worked so well for small business is that it’s always just a click away. The cloud is on-demand, available to employees anywhere that they can connect to the Internet. The cloud also provides extremely cost-effective storage, allowing businesses to easily store as much data as they need without purchasing any hardware. That’s why one of the most efficient ways for a business to use the cloud is to back up and store its critical documents.
Cloud-based files are always available to business owners. Some small businesses are backing up their entire databases the old-fashioned way, with external hard drives, USB drives, or even CDs. To a point, these manually driven external methods will get the job done, but relying on physical devices as the sole backup strategy still leaves a business vulnerable to data loss.
Here are several ways that cloud storage can simplify a company’s data-protection process:
Automatic Backup: Most small-business owners are faced with an exhausting work schedule, often finishing a significant business project at the end of a week and simply saving it to their computer’s hard drive. Once in a while, Monday morning arrives and they discover their computer won’t boot up, or it crashes while checking a weekend’s worth of emails. Now they’ve lost their business data and deliverables, just because they didn’t want to go through the hassle of manually backing it up before heading home for the weekend.
Unlike an external hard drive, USB drive or CD, a cloud-based service will automatically keep an up-to-date copy of each employee’s data securely backed up and readily accessible off-site. This automated service enables small business owners to focus on running a business, rather than running backup.
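As a rough sketch of what such an automated service does behind the scenes, the code below compares file digests against a manifest and copies only new or changed files. All names are invented for illustration, and the “cloud” here is just a local destination directory; a real service would also encrypt the data and upload it off-site.

```python
import hashlib
import shutil
from pathlib import Path

# Minimal sketch of automatic, incremental backup: only files whose
# contents changed since the last run are copied. A real cloud service
# would encrypt each file and upload it off-site; here the destination
# is a local directory so the example stays self-contained.

def file_digest(path: Path) -> str:
    """Content fingerprint used to detect changed files."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup(source: Path, dest: Path, manifest: dict) -> list:
    """Copy new or modified files from source to dest; return what was copied."""
    copied = []
    dest.mkdir(parents=True, exist_ok=True)
    for f in source.rglob("*"):
        if f.is_file():
            rel = f.relative_to(source)
            digest = file_digest(f)
            if manifest.get(str(rel)) != digest:   # new or modified since last run
                target = dest / rel
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(f, target)
                manifest[str(rel)] = digest
                copied.append(str(rel))
    return copied
```

Run on a schedule (or triggered by file-change events), a loop like this is what lets the service keep an up-to-date copy without the owner ever thinking about it: unchanged files cost nothing, and only the deltas move.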
Secure and Offsite: A big concern for new cloud-users is data security. Many cloud backup or storage companies encrypt every piece of data — just as banks do — before uploading to, or downloading from, the cloud.
Another perk of the cloud is protection from physical loss of data backed up to devices that are stored on-site. If an on-premises disaster like a fire, flood or theft strikes, small businesses that back up to the cloud can rest assured that their data is protected in a secure, offsite location.
Easy to Access and Restore: With remote access capabilities, never again will business owners leave the office only to realize while pulling into the driveway that they left an important file sitting on their computer desktop. Since their files already have been backed up automatically, they have the luxury of securely logging into their account from a smartphone or any other Internet-enabled device to access any file that they need.
In the event of a hard drive crash, cloud backup users can easily restore backed-up files to a new computer. This functionality enables small-business owners to restore the files they need immediately to keep their business up and running after a crash. They can then conduct a complete restore once the new computer is acquired.
Competitive Pricing: A number of quality options are available in the market, but one of the biggest deciding factors for small businesses to purchase cloud-based backup services is the price. Owners like to know what’s coming so they don’t have to deal with an unexpectedly high bill for an influx of use they didn’t anticipate.
Mozy, for example, provides pay-as-you-go backup services at a standard price per gigabyte (GB) used, paired with a monthly license fee for each computer attached to the account. Dropbox offers flat monthly rates for the entire business at capped GB amounts. Carbonite offers tiered usage and pricing structures with no per-computer licensing fees. Other major players, such as SOS Online Backup and Backblaze, also offer small-business-specific services with competitive rates.
Data storage is extremely important to every small business, but it shouldn’t be a stressful process. The cloud offers a simple yet invaluable service that can be the difference between disaster and disaster recovery.
A number of companies offer cloud solutions, most of which come with some form of free trial. When researching these products, ensure that the provider uses encryption technology, has earned a positive track record over an extended period of time, and offers 24/7, individualized customer service.
Interphase has launched a product portfolio of six desktop virtualization thin clients, small devices that do not store information but can create a desktop environment for users by relying on a remote host server. Companies are increasingly using cloud computing services to deliver their software from remote servers, rather than on-site servers managed by large information technology staffs.
Analysts say challenges abound for the company’s clouDevice portfolio, an untested batch of products in markets dominated by larger, more experienced players, such as Wyse Technology and Hewlett-Packard Co.
“It’s a bit of a David-and-Goliath challenge, but we’re going after it,” Interphase chief executive Greg Kalush said. Kalush said that despite the challenges, his company may have come up with a way to build market share relatively quickly.
How’s that? By keeping the clouDevice’s overhead and manufacturing costs low, and by specializing hardware design to match client needs, Interphase may be able to undercut its competition with lower prices, said HJ Li, Interphase’s senior director of product management.
Analysts say that while small tech companies will fare well when the private cloud computing market takes off, they are skeptical that undercutting the competition will bring Interphase the business it seeks.
“With Interphase being a relative unknown in the thin client space, cost [savings] can only go so far,” said Ian Song, a senior research analyst at the International Data Corp., an information technology research company.
“We really don’t know what some of their performance capabilities are,” Song said. “The portfolio is by no means comprehensive. It’s pretty good.”

Mark Margevicius, vice president and research director at Gartner Research, a company that tracks trends in information technology, said that as long as a company differentiates itself, there’s an opportunity for any vendor to succeed. But the technology Interphase must use to manufacture its devices is available to Wyse and HP, so it’s hard to see how Interphase can undercut its competition at all, Margevicius said.
Potential clients have been more than a little rattled by the recession. Kalush said some of the most appropriate customers for the clouDevice products are cash-strapped schools and local governments — particularly in Texas — that could use information technology to save money. For education, Interphase emphasizes cost-efficiency and more computer access for students.
“We could put a wireless client device in the hands of a student for $150 or less,” Kalush said.
For governments and enterprise businesses, desktop virtualization consolidates a company network to one source, making it easier for IT staffs to thwart computer viruses that would otherwise spread undetected, Kalush said.
Expanding the Business

Interphase, though small, had a strong first half of this year. The company increased its telecommunications revenue 60 percent in the second quarter, generating $6.2 million. Profit rose 82 percent in the second quarter over last year, to $3.2 million.
The company was founded in 1974 as an engineering consulting company, and it gradually started selling products in the early micro-controller industry. Micro-controllers are tiny computers built into products, such as microwaves, that usually perform just one function.
In 1999, Interphase entered the telecommunications market, selling hardware and signaling devices to military units and other clients. But three years ago, Interphase leaders decided the company needed to diversify, partially because the telecommunications market is unpredictable.
Li said he started looking into the information technology market, but at first, he had trouble finding a market that a company of Interphase’s size could enter effectively. The cloud computing market was an attractive bet, and analysts predicted plenty of growth, Li said.
Gartner Research expects worldwide cloud services revenue to grow about 100 percent to $148.8 billion by 2014. According to IDC, server sales resulting from cloud computing will grow 50 percent to $12.6 billion by 2014.
Ethernet Advantage

Bill Rust, a Gartner analyst who covers education information technology, said one of the clouDevice’s features, the ability to run using an Ethernet rather than a USB cable, makes Interphase an interesting vendor.
USB cable runs are limited to about 16 feet, a major problem for implementing desktop virtualization in schools; Ethernet doesn’t have that limitation and is cheaper.
“They may be one of our next Cool Vendors,” Rust said, referring to annual recognitions that Gartner gives to innovating companies.
Wi-Fi on the Way

The Ethernet clouDevice products are already available, and in the coming months Interphase will release a version of its devices that can run on Wi-Fi.
Susan Eustis, president of WinterGreen Research, said small companies that can be nimble and innovative will do well in the cloud computing market because large companies can’t cut prices and aren’t focused on building market share.
“Small companies aren’t hampered in this way,” Eustis said.
“After the little companies build the market, the big companies will gobble them up. They see who’s going to succeed.” However, small companies must be careful to develop products that are unique — they should avoid going head-to-head with large companies such as IBM, she said.
“That being said, there’s a lot of space in this cloud business to go places,” Eustis said.
With so much growth expected for private cloud computing services, Kalush said, it was hard to resist jumping in.
The market has “got a lot of positive attributes, and we thought, what the heck, this looks like a pretty cool market, let’s go see what we can do,” he said. “It’s one of those Hail Mary passes, but we feel pretty good that we’ve got something here.”
According to a recent report on Cloud trends, analyst house Gartner recommends that network managers test or emulate the performance of Cloud-based applications in all geographies where they plan to deploy. Organisations need to adopt a Cloud computing service that addresses the specific issues around latency, such as distance and congestion, and that provides proper monitoring.
Despite all the hype around the move to the Cloud over the past few years, latency remains the one issue that many organisations are still failing to get to grips with. This is somewhat surprising, as file access and application performance – or in short, how quickly information is delivered to the end user or how long it takes to download or open a file – is often high on an end user’s priority list. High latency has a range of causes. A common one is the physical distance between the office and the data centre. Others include congestion on the network as well as packet loss and windowing. While many organisations have explored the use of various application performance tools that are currently available on the market, it is often the physical issue of distance that needs addressing.
For example, many organisations face the challenge of locating their office in a different country from their data centre. If an organisation has an office in the UK and a data centre in Spain, the distance between the two can, for some applications, lead to slow file transfers and inefficient application performance. Selecting a data centre location in the same region as the offices can be critical for some applications, simply to address slow file downloads and poor application performance. Clearly, organisations should look to service providers that can supply multiple data centres across Europe to help with these needs.
When it comes to tackling high latency, it is not just the physical passage of light that causes issues but also the way signals are sent through networks. The key issues that conspire to create high latency are congestion, packet loss and windowing. Congestion affects latency because, when the amount of data to be sent down a connection is too large, the network node equipment holds some of it back in a queue while it sends the data it received first. Windowing arises from the way receipt of data is confirmed: the receiver has to acknowledge the first packet before the sender will send another, which means more delay and more congestion for every packet; if packets are lost, this gets worse, because replacements have to be sent. This queuing, confirming receipt and resending obviously slows the speed of travel from A to B.
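A back-of-the-envelope calculation shows why distance and windowing both matter. The figures below (fibre propagation speed, a roughly UK-to-Spain path length, and a classic 64 KB window) are illustrative assumptions, not measurements.

```python
# Rough numbers behind the latency discussion. The distance and window
# size are illustrative assumptions, not measurements of any real link.

SPEED_IN_FIBRE_KM_S = 200_000  # light in fibre travels at roughly 2/3 of c

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time; real RTTs are higher (routing, queuing)."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_S * 1000

def max_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Windowing bound: at most one full window can be in flight per round trip."""
    return window_bytes * 8 / (rtt_ms / 1000) / 1_000_000

rtt = round_trip_ms(1_500)  # assume ~1,500 km of fibre between office and data centre
print(f"RTT: {rtt:.1f} ms")
print(f"Cap with a 64 KB window: {max_throughput_mbps(65_535, rtt):.0f} Mbps")
```

Even before any congestion or packet loss, the window bound caps a 1,500 km path at a few tens of megabits per second per connection, which is why moving the data centre closer to the office, or tuning window sizes, has such an outsized effect on file transfer times.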
This often becomes the source of poor user experience after traditional desktop applications move into a Cloud environment, because the network connection to the Cloud service does not prioritise traffic appropriately. These applications, such as SharePoint, are often critical to the day-to-day running of a business, and users do not tolerate ten- to twenty-second delays each time they open, save or close a document. Yet they experience these delays because of congestion caused by applications such as Exchange, from which users are perhaps prepared to tolerate slightly slower response times.
Combine this competition for bandwidth between business applications with the increase in recreational online video streaming in the workplace – YouTube and BBC iPlayer, for example – and the challenge for IT teams managing application performance to end users becomes much harder. Traditional methods of WAN acceleration or network Quality of Service are not well suited to companies trying to prioritise Cloud-based services from public providers. As a result, when entering into contracts it is becoming more and more important for enterprises to consider Cloud service providers who can offer application performance guarantees as part of their service, measured and backed by stringent service level agreements (SLAs).
Once the issues of distance and latency have been addressed, service provider monitoring of end-user activity in the Cloud can help minimise bottlenecks or capacity shortages. For the IT team managing the performance and capacity of Cloud-provided services, more sophisticated monitoring tools can provide continuous analysis of service performance to ensure the delivery of the promised benefits of the Cloud to their business.
The good news is that many businesses are starting to make strides to overcome distance, congestion and the need for more granular monitoring. At the end of the day, the full benefits of the Cloud come from having a flexible, elastic computing environment that guarantees security as well as performance. This is now being achieved by companies using service providers to supply a full enterprise-quality Cloud infrastructure service, with performance measures that include the network.
Virtualization makes it possible to bring corporate desktops and applications to any mobile device that suits an employee’s fancy. This flies in the face of vendors that are positioning their tablet PCs as more “enterprise ready” than Apple’s iPad, the predominant device in desktop virtualization deployments.
HP, RIM and Cisco vociferously insist that their tablets are more than just iPad clones and come with much stronger security and management capabilities. Which is understandable since these tablet vendors are frantically exploring ways to differentiate themselves — and to scare organizations away from buying iPads.
But some virtualization experts feel the whole “enterprise ready” argument is mostly just marketing claptrap. Not only is the iPad suitable for enterprise desktop virtualization; with the exception of a smattering of Android tablets, it’s far and away the device of choice for business deployments.
“An ‘enterprise’ tablet doesn’t matter as much in a virtualized environment as it does in a traditional environment, where you’re running applications on the device,” said Richard Brumpton, director of the national virtualization practice at MTM Technologies, a Stamford, Conn.-based solution provider.
Enterprise desktop virtualization provides a consistent context and supportable environment that’s both secure and easily managed by IT, which leads in turn to lower support costs, according to Brumpton. “In a virtual environment where you’re keeping everything in the data center, it doesn’t matter what the security requirements are, beyond a certain set of requirements,” he said.
HP, which last week instituted a permanent $100 price cut on the HP TouchPad, is in the process of training its channel partners how to position the tablet in an increasingly crowded marketplace. It’s unclear if downplaying the iPad’s business suitability is part of this training, but this view is prevalent among HP partners.
“Apple often touts the number of apps they have on the App Store. But the bottom line is that CIOs don’t want employees using all kinds of apps with data all over the place,” said Paul Shiff, vice president at Hub Technical Services, a South Easton, Mass.-based solution provider. “CIOs want centralized management and monitoring, and you can’t get that from any other device but the HP TouchPad.”
HP was late to the tablet market but often cites its channel as a competitive advantage over Apple, and Shiff is out on the front lines working to slow the iPad’s progress. He’s currently holding meetings with education customers that have recently bought iPads and trying to get them to reconsider and buy HP TouchPads instead.
“We have a school customer that bought 1,000 iPads, and they’re now considering their next purchases. Maybe they will think differently next time,” Shiff said.
HP certainly has the channel numbers to exert an influence on IT buying decisions, but the iPad has shown itself to be so suitable for enterprise desktop virtualization projects — Apple last month said 86 percent of Fortune 500 companies are deploying or testing it — that it’ll probably take quite a while for alternatives to emerge.
MTM’s Brumpton believes that the iPad’s limited computing focus has actually been one big key to its success in virtual environments. “The iPad, for all its mystique, has a very narrowed-down focus, but Android is all over the place — it tries to be everything but doesn’t do anything well. I think we’re seeing tablets like the iPad emerging as something like a cloud client,” said Brumpton.
Enterprise desktop virtualization lets IT support back-end infrastructure and allow their users to choose whatever device they want. In light of this, HP and other tablet vendors aren’t just competing against Apple for market share — they’re trying to overcome the will of legions of corporate workers who’ve been given their choice of device and have selected the iPad.
“There are so many people buying iPads right now that even if organizations were to hand their employees another brand of tablet, they wouldn’t stop using iPads. We’re just too far down this path,” said Mike Strohl, president of Entisys, a Concord, Calif.-based virtualization VAR.
Over-provisioning is a nice way of saying you’re throwing money away. It can happen in a variety of forms, such as buying infrastructure that is better suited to a much larger company, planning for growth that doesn’t happen, or not doing your homework on what other technology you’ll need to support virtualization. But fear of wasteful spending shouldn’t stop you in your virtual tracks; rather, it should motivate informed, careful decisions.
Raj Dhingra, CEO of NComputing, believes 2011 is a turning point in desktop virtualization deployments among small and midsize businesses. Dhingra, who left Citrix to take the NComputing helm in April, also said the broader field of virtualization vendors has taken note: “Everybody sees there is a big opportunity there.”
As the number of viable virtual desktop infrastructure (VDI) options for SMBs increases, Dhingra recommends paying close attention to four key areas when making a decision. Doing so can help minimize the over-provisioning risk and ensure a real return on the investment.
1. Look for platforms specifically designed for SMBs. While a vendor’s ability to scale with the growth of your company is important, don’t let your daydreams overshadow your actual needs – starting small can provide a bigger ROI in a shorter period of time.
“Buy the shoe that fits rather than buying the shoe that’s two sizes bigger in hopes that you’re going to fit into it over time,” Dhingra said.
The most obvious place to look is the cost per seat: This often tops the $1,000 mark in enterprise platforms, which makes the total cost of ownership (TCO) and return on investment (ROI) case trickier for SMBs. “If it’s now costing you more than a PC, that’s your first red flag,” Dhingra said. He added that TCO/ROI analysis for a 100-seat deployment is not the same thing as a 100-seat proof of concept–with an expectation that several thousand seats will be added later.
It should be noted that for some SMBs, ROI isn’t just a matter of comparing virtual desktop versus traditional PC costs. At Infinity Sales Group, for example, both desktop support and power costs were major factors. For Silicon Valley Builders Group, mobility was the critical payoff in going virtual. In fact, the firm’s CIO noted in an interview that just comparing per-seat costs can be a dead-end: “It would be a hard sell. Virtualization is still something like $1,200 per user, versus a PC I can go buy at Fry’s for $500,” he said.
No matter your particular business case, cost-per-seat is obviously still important. The moral: Don’t pay for seats you don’t need.
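As a sketch of why the sticker price alone can mislead, the comparison below folds support and power costs into a multi-year per-seat figure, echoing the factors Infinity Sales Group weighed. Every number here is invented for illustration; substitute your own quotes.

```python
# Illustrative total-cost-of-ownership comparison per seat. All figures
# are made up for the sake of the arithmetic, not vendor pricing.

def tco_per_seat(upfront: int, yearly_support: int, yearly_power: int,
                 years: int = 3) -> int:
    """Upfront hardware/licensing plus recurring support and power costs."""
    return upfront + years * (yearly_support + yearly_power)

# A cheap PC with higher ongoing support and power costs...
pc = tco_per_seat(upfront=500, yearly_support=300, yearly_power=60)
# ...versus a pricier virtual seat with lower ongoing costs.
vdi = tco_per_seat(upfront=1200, yearly_support=100, yearly_power=10)

print(f"PC, 3-year TCO per seat:  ${pc}")   # $1580
print(f"VDI, 3-year TCO per seat: ${vdi}")  # $1530
```

With these (hypothetical) inputs, the $500 PC ends up costing more over three years than the $1,200 virtual seat; which way the comparison actually tips depends entirely on the support, power, and mobility factors specific to the business.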
2. Know your supporting infrastructure needs. Desktop virtualization doesn’t mean you’re leaving hardware behind. Make sure you have a complete understanding of the supporting pieces you need, both on the server or host side and the client side. For the former, this includes things like servers, storage, and networking equipment. On the client side, don’t forget to account for the actual devices–such as thin clients, for example–as well as your software needs.
Dhingra said not taking all the necessary components of VDI into account is a key budget pitfall for SMBs, particularly if the initial investment is based on an expectation of significant growth. It can also lead an organization to an infrastructure it’s ill equipped to manage.
“That means not only the capital to actually procure [VDI], but then do I have internal expertise within my company to actually deal with this and work with it?” Dhingra said.
3. How many vendors are you willing to work with? Another possible sign you’re headed down a path of over-provisioning: your desktop virtualization project requires components from multiple vendors. This is likely a bigger issue for the “S” in SMB. While a midmarket firm with, say, 750 employees has more resources to manage multi-vendor platforms, a 50-person company might not want the potential headaches. More importantly, it might not have enough IT resources to do so. “It becomes a systems integration project that is typically suited to a larger company,” Dhingra said.
4. How soon until you’re up and running? You can’t really start the ROI meter until your deployment is complete, right? For budget-constrained SMBs, a multi-month (or even year-plus) VDI project adds hidden costs – another form of over-provisioning – that can immediately dull the shine of potential savings. Moreover, smaller companies usually thrive on their speed and agility, and IT projects should be no different. Dhingra said IT pros at SMBs should factor in training and skills development here, too: If you lose two days at an off-site training, for example, that’s an expense – even if the event is “free.”