Virtualisation becomes a channel reality
Virtualisation may mean different things to different people, be it slicing up servers into virtual machines or bringing utility computing to fruition. Either way, it could turn out to be a goldmine for the channel, reveals Simon Meredith
To Richard Garsthagen, technical marketing manager for EMEA at VMware, ‘virtualisation’ means “full hardware computerised virtualisation”.
In simpler terms, this describes what his company’s software does: it slices a server up into ‘virtual machines’ so that, instead of deploying scores of servers, the customer needs only a handful.
What happens is that multiple instances of the operating system run simultaneously on the same physical machine. There are other types of virtualisation, but the VMware form is the one that has become most widely accepted.
Storage virtualisation (effectively the use of SANs) is already established, and software-based server virtualisation (the creation of virtual machines) has also now become a standard part of data-centre consolidation vocabulary.
“Virtualisation today means different things to different people,” according to Paul Dunford, director of channels EMEA at ClearCube Technology, which specialises in centralised computing systems.
“One thing in common, though, is that companies are looking for virtualisation to reduce costs and simplify infrastructure.”
The argument for virtualisation is powerful. Analyst Forrester Research, in its paper, Desktop Virtualisation Is The Future Of The Corporate PC, states that capacity utilisation generally hovers between eight and 15 per cent on Windows/Intel servers, and between 25 and 35 per cent on Unix and Linux servers.
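To see what those utilisation figures imply, the consolidation arithmetic can be sketched in a few lines of Python. This is purely illustrative and not drawn from Forrester’s paper: the workload count, the assumption of identically sized servers and the 60 per cent target utilisation are all invented for the example.

```python
# Back-of-the-envelope consolidation arithmetic (illustrative only).
import math

def hosts_needed(workloads: int, avg_utilisation: float,
                 target_utilisation: float = 0.60) -> int:
    """Estimate how many physical hosts are needed once each workload
    becomes a virtual machine. Assumes identically sized servers and
    that CPU is the only constraint, which it rarely is in practice."""
    total_demand = workloads * avg_utilisation      # aggregate load, in 'whole servers'
    return max(1, math.ceil(total_demand / target_utilisation))

# Twenty Windows/Intel servers idling at around 10 per cent utilisation...
print(hosts_needed(20, 0.10))   # -> 4 hosts, each running at roughly 50 per cent
```

Even with generous headroom, the same logic is what turns a rack of under-used boxes into a handful of well-used ones.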
According to Forrester’s Business Technographics research from last year, 29 per cent of firms were already using server virtualisation, and almost two-thirds of these were planning to increase their spending in that field in 2006.
Analyst IDC has produced similarly positive forecasts. It expects the number of “virtualised system images” (or virtual machines) to grow from 778,000 in 2004 to 5.1 million in 2009, a compound annual growth rate of 45.7 per cent.
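That growth rate can be checked against the figures in the text with the standard compound-growth formula; a quick sketch, assuming the 2004 and 2009 numbers span five compounding periods:

```python
# Checking IDC's quoted growth rate against the figures above.
start, end, years = 778_000, 5_100_000, 5    # virtual machines, 2004 to 2009

cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")    # -> 45.7%, in line with the rate IDC cites
```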
IDC also stated that in companies where virtualisation is already in use, about one-fifth of installed servers are virtualised. It expects this proportion to have risen to 50 per cent by the end of 2006.
IDC’s work suggests that uptake of server virtualisation is stronger in larger companies – especially in businesses with more than 500 employees. However, organisations of all sizes are experimenting, said Martin Hingley, vice-president of the European systems group at the research firm, although it is also a trend the industry should be a little wary of.
“About a quarter of all x86 servers bought worldwide in 2006 will host virtual machines,” he said. “It is an opportunity for users to get more from their servers. It has become such a strong trend that IT managers need a really good excuse not to consider it.”
However, there are also threats to the industry, Hingley noted. “For instance, if we improve the efficiency of server usage significantly we may sell less hardware. Also, the strong move towards virtualisation will slow down innovation in the underlying operating systems. It will mean we standardise on these and innovate in the virtualisation part of the stack.”
This effect is only likely to be felt in the long term, though. For now, there is plenty of room for growth and still time for resellers to get into this sector of the market – mainly with VMware, according to Rupert Green, a virtualisation specialist at corporate reseller Logicalis.
“Without a doubt, VMware is the most significant player in this space at the moment, yet it currently sits on less than six per cent of Intel servers,” he said.
“This means that there is still a huge opportunity both for VMware and firms such as Virtual Iron and Virtuozzo, and their channel partners.”
In a sense, the virtualisation trend is a return to the old mainframe model, according to Chris Gomersall, general manager EMEA at PolyServe, another of the key players in software-based virtualisation. We should expect the virtualised model to develop over the next few years, he added.
“We will see a return to larger machines effectively partitioned to run multiple applications,” Gomersall said. “Let’s not forget that the first virtual machine environment was the IBM mainframe, which was partitioned into virtual machines 20 years ago. The same techniques were applied to Unix machines. And the same thing will happen with large multi-processor Intel servers.”
Gomersall added that ultimately, the underpinning technology for any enterprise-class virtualisation is a shared data model that allows multiple machines (virtual or physical) to share data which itself is virtualised across the storage environment.
This could make virtualisation a veritable goldmine for resellers, Gomersall believes, because bigger, faster systems will be needed to accommodate all of the virtual machines.
“You can’t take a bunch of existing servers and polish them up,” he said. “The big new machines are already arriving. Some have even adopted a very different architecture to make them more suited to virtualised environments.”
For example, Egenera’s BladeFrame uses a processing area network (PAN) architecture to deliver ready-made data-centre virtualisation in a box. Diskless, ‘stateless’ blade servers are combined with a high-speed switch fabric and built-in virtualisation software to allocate servers dynamically to applications as required.
According to Niels Groenlund, European server business development manager at Fujitsu Siemens Computers (FSC), which markets the BladeFrame in Europe, this is particularly appealing to organisations in the finance and telecommunication sector that have previously employed banks of Risc-based servers to cope with large peaks in demand for processing power.
He said: “If an application needs power, you can increase the capacity, so it is very useful. You can do that today only with a limited number of systems, because the operating system does not support that functionality.”
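For illustration, the dynamic allocation Groenlund describes can be pictured as a shared pool of stateless blades handed out to whichever application is under load. The sketch below is a hypothetical Python model of that idea only; it is not Egenera’s or FSC’s actual software, and the class and application names are invented.

```python
# Hypothetical model of PAN-style dynamic allocation: stateless blades drawn
# from a shared pool and assigned to applications as demand rises.
from collections import defaultdict

class BladePool:
    def __init__(self, blade_ids):
        self.free = list(blade_ids)            # diskless, 'stateless' blades
        self.assigned = defaultdict(list)      # application -> blades in use

    def scale_up(self, app, extra):
        """Move blades from the shared pool to an application under load."""
        granted = [self.free.pop() for _ in range(min(extra, len(self.free)))]
        self.assigned[app].extend(granted)
        return granted

    def scale_down(self, app, count):
        """Hand idle blades back so other applications can use them."""
        for _ in range(min(count, len(self.assigned[app]))):
            self.free.append(self.assigned[app].pop())

pool = BladePool([f"blade-{n}" for n in range(1, 25)])
pool.scale_up("billing-app", 4)      # peak demand: borrow four blades
pool.scale_down("billing-app", 2)    # demand falls: return two to the pool
```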
Where it can be deployed, the BladeFrame can reduce the server count and associated administration costs and complexity by as much as 80 per cent, the firm has claimed. There are plenty of examples on the company’s web site.
Essentially, the BladeFrame is a very advanced virtualisation appliance. Increasingly, this is the way the technology is moving. Steve Clayton, head of mid-market technology at Microsoft, said it is something that customers increasingly expect to be part of the data-centre proposition now.
“Almost 100 per cent of customers I meet are either already using a virtualisation solution, or are seriously considering doing so,” he said.
“It has become technology that customers are not expecting to pay for, hence the decision to make our Virtual Server product free, and to ‘bake’ it into our next server operating system release.”
He added that virtualisation potentially increases server deployments, because it makes it easier to add another server to the estate and get it up and running quickly.
“Previously a customer may have thought about trying to add that application onto an already existing server,” he said.
As well as becoming an integral part of the operating environment, virtualisation will also be much more closely integrated with applications, Garsthagen claimed.
“Virtualisation is a completely disruptive technology,” he said. “It allows you to rethink entirely how you perform standard IT tasks and do them better and more effectively.
“Another key area is how people use applications. Software companies are now developing systems that are effectively virtual appliances.”
These virtualised editions will come ready to run, packaged with the operating system, as virtual machines become the norm, Garsthagen said.
There are already more than 300 packages available that use VMware’s technology, most of them database, development and security applications, as well as various Linux flavours. Oracle is the most notable partner here; Microsoft applications are not deployed in this way at present.
But this is going to be just another staging post on the journey towards a bigger concept of ‘utility computing’, according to Michael Hjalsted, marketing director for servers, systems and technology at Unisys.
He said: “Virtualisation is another step on the journey towards providing the IT infrastructure as a pool of resources, where IT capacity will be made available to clients based on the actual need at any given time. Solutions will be moved around to best fit the actual capacity requirement and clients will be billed based on the actual usage.”
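The pay-per-use model Hjalsted describes boils down to metering consumption and billing against it. A minimal sketch, with an invented tariff and invented usage figures, purely to show the shape of the idea:

```python
# Hypothetical usage-based billing: clients draw capacity from a shared pool
# and are charged for what they actually consume. All figures are invented.
RATE_PER_CPU_HOUR = 0.12    # assumed tariff

usage_log = [
    ("client-a", 310),      # CPU-hours consumed this month
    ("client-b", 1_250),
    ("client-a", 95),
]

bills = {}
for client, cpu_hours in usage_log:
    bills[client] = bills.get(client, 0) + cpu_hours * RATE_PER_CPU_HOUR

for client, amount in sorted(bills.items()):
    print(f"{client}: {amount:.2f}")
```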
The main opportunities for resellers will be wide-ranging, said David Day, vice-president of engineering at applications traffic management firm Zeus Technology.
“The benefits for the channel and resellers come from the opportunity for value-add deployment services, the ability to resell virtual appliances, and the ability to provide virtual machine-based evaluations and demos without the cost of evaluation hardware inventory,” he said.
“Virtualisation also potentially provides the infrastructure to facilitate the deployment of SOA [Service Oriented Architecture].”
Virtualisation projects are typically going to be highly complex, but the market is still developing, and resellers that are not afraid of tackling technical issues can probably still gain a foothold. The opportunities look set to be very significant.
Green said that it is only a slight exaggeration to say that large telcos and multinationals are beating a path to vendors’ doors for server virtualisation.
“It is now beginning to attract the attention of less risk-averse players in the mid-market,” he said. “The public sector, another notoriously risk-averse market, has huge potential too. As councils look to offer more and more services online, the need to support these with reliable back-office systems is driving IT departments to consider virtualisation.”
Green added that virtualisation is an easy sell. “The benefits of virtualisation generally, and server virtualisation specifically, are off the scale,” he said.
“We are talking about comfortably increasing server utilisation from less than 10 per cent to 50 or 60 per cent. We are working with customers that have reduced their rack space from 24 servers down to just six in hosted data centres, saving money and increasing reliability, since the servers are being used as a single capacity resource, rather than being run in isolation.”
The savings in manpower are huge as well, because fewer servers mean fewer bodies are required for hands-on server administration. In the end, that is what virtualisation comes down to: the savings customers can make on the bottom line, and they can be very considerable indeed.
Even so, some customers see virtualisation as too big a leap of faith, and others can be put off by the significant capital outlay they are compelled to make in order to make it work. It is important to have answers to questions in these areas before taking the concept to the customer, said Clayton. “They can be put off by the idea of putting all their eggs in one basket, so making sure that they have a graceful failover mechanism is vital, as a hardware failure can now potentially disrupt service from a number of servers.
“Another thing that can put them off is the cost of the hardware potentially required to use virtualisation effectively – for an organisation used to buying single-purpose servers, it can be off-putting to purchase a much more expensive piece of kit.”
Still, attitudes have softened in the past year, and if there was ever a time to take virtualisation to the market, it is now. “Twelve months ago there probably was some confusion about virtualisation and its potential, but that seems to have dissipated,” said Clayton. “Now there is maybe only one customer in 20 I speak to who is unaware of virtualisation and its benefits.”