The promise of utility computing
As with many other big IT ideas, it remains to be seen whether utility computing will deliver. Nick Booth investigates the potential for resellers to cash in either way.
All over the country, IT managers are being hauled over the coals for their profligate spending on IT. Where, their inquisitors are demanding, is the return on all that massive IT investment?
And the bean counters have a point. Money has been thrown at IT willy-nilly. It is time for a change in the way IT is bought and managed.
Consider, for example, the purchase of a customer relationship management (CRM) system. Typically, a company will run it on five servers, with three more acting as standby failover units. Those eight need to be mirrored as back-up, then another eight will be used for a development environment and another eight for a test centre.
In all, that amounts to 32 servers when only five are actually needed. Given that many applications only ever reach a peak of activity once a month, this represents a massive over-deployment of IT. And that is before you consider that many applications, CRM among them, have failed to deliver on their promises.
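To make that arithmetic explicit, here is a minimal sketch using the illustrative figures quoted above (examples only, not measured data):

```python
# Illustrative tally of the CRM example: how five production servers
# balloon into 32 deployed machines.
production = 5
standby_failover = 3

primary_estate = production + standby_failover   # 8 servers
mirrored_backup = primary_estate                  # another 8, mirrored as back-up
development = primary_estate                      # 8 more for the development environment
test_centre = primary_estate                      # 8 more for the test centre

total = primary_estate + mirrored_backup + development + test_centre
print(f"{total} servers deployed to do the work of {production}")  # 32 servers deployed to do the work of 5
```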
It is little wonder that many IT directors are having problems justifying any more expenditure. Unless they can demonstrate more efficient use of their IT systems, it will be hard to get a sign-off on any future purchases. This is why the idea of utility computing is suddenly being taken seriously.
Understanding the benefits
As with many IT concepts, there are several schools of thought on the form and purpose of utility computing, which makes it hard to give users a clear message about its benefits.
In today's economic climate the most attractive aspect of utility computing is its efficiency; companies will definitely get more for their money.
Whether this is because they will start to buy IT services from a service provider, or whether it is because their own IT infrastructure can be pooled and made into a more fluid system, is a moot point. The consensus is that utility computing is a great idea, in theory.
But then application service providers (ASPs) were also a great idea. Didn't that model promise to allow end-users to purchase only what they needed from a service provider, which would supply IT services in the same way we would expect to buy gas or electricity?
Peter Hindle, Hewlett Packard's (HP's) utility computing marketing manager, admitted that this is true, but insisted that there are major differences.
"HP has always had the vision that computing should be a utility. Unlike the ASPs, we're not pretending that we've arrived with the finished product yet. The other difference is that we are more focused on who we would offer utility computing to," he explained.
ASPs tried to target everyone, but utility computing will be aimed mainly at enterprises. Some of these, such as American Express, are trying out the idea, but mass acceptance is a few years off, according to Hindle.
Alex Tatham, vice president of global software distribution at Bell Microproducts, suggested that the ASP market was a stepping stone to utility computing. The software and storage sectors are falling in line with this model too.
"The day when software is bought as a utility isn't five years away, it's just around the corner," he said. "You only have to look at how Microsoft has been gearing itself up to move to annuity billing to understand where IT is going.
"As a distributor we are being asked to facilitate software distribution agreements that allow for monthly billing for software usage.
"So software is nearly a utility now. Services will get there soon, but the mistake ASPs made was in trying to jump to a full service model straight off. But all the strands of IT are coming together so that computing can be bought on a usage basis."
Storage area networks (SANs) are an embodiment of the utility computing ideal: storage is pooled into one shared resource, which can be allocated according to need. "What SANs taught people was that they need to use their resources more carefully," said Tatham. "Perhaps this will alert them to the need to use all IT resources better."
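The pooling principle Tatham describes can be sketched very simply. The following is a toy illustration only, assuming nothing about any particular SAN product; the class, names and capacity figures are invented for the example:

```python
# Toy sketch of pooled allocation: one shared capacity, carved up on
# demand instead of being pre-assigned to each department.
class StoragePool:
    def __init__(self, capacity_gb: int):
        self.capacity_gb = capacity_gb
        self.allocations = {}  # consumer name -> GB currently allocated

    def free_gb(self) -> int:
        return self.capacity_gb - sum(self.allocations.values())

    def allocate(self, consumer: str, gb: int) -> bool:
        """Grant the request only if free capacity remains in the shared pool."""
        if gb <= self.free_gb():
            self.allocations[consumer] = self.allocations.get(consumer, 0) + gb
            return True
        return False

    def release(self, consumer: str) -> None:
        """Hand a consumer's storage back to the pool for someone else to use."""
        self.allocations.pop(consumer, None)


pool = StoragePool(capacity_gb=1000)
pool.allocate("crm", 300)
pool.allocate("finance", 200)
print(pool.free_gb())  # 500GB still available to whoever needs it next
```

The point is simply that capacity is granted from one shared pot as it is needed, rather than sitting idle against a department's name.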
Hindle claims that HP has surveyed 200 major enterprises and found that the top three considerations among IT managers are "cost, cost and cost", followed closely by concerns about managing resources. Even so, there are still considerable objections to the sale of utility computing.
Aside from the fact that ASPs promised many of the same features - manageable costs, scalable IT solutions and adaptability - and failed, there are also many technical issues to be resolved.
The integration challenge
The challenge is to integrate all the different computing platforms so that, to the user, they appear as one computing resource that is constantly on tap and can cater for whatever demands are made on it.
In practice this means that a huge amount of integration and configuration must go on behind the scenes to ensure that all applications, storage, processors and operating systems behave as one fluid unit that can be instantly tailored.
"The aim is to have every resource allocated efficiently by 'virtualising' the physical layer from the applications," explained Peter O'Sullivan, director at service provider IOKO365. "When you consider how many different competing standards and interfaces there are at every level of computing, it is a massive undertaking."
O'Sullivan estimates that it will take a big corporation with a mixture of computing platforms between five and 10 years to reach the point at which all IT resources are 'virtualised'.
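In principle, 'virtualising' the physical layer means putting a broker between applications and machines, so that workloads are placed wherever capacity exists without the application ever naming a server. A minimal, hypothetical sketch follows; the class and server names are invented, not drawn from any vendor's product:

```python
# Hypothetical sketch: applications ask a broker for capacity and never
# see which physical box actually serves them.
from dataclasses import dataclass


@dataclass
class Server:
    name: str
    free_cpus: int


class VirtualPool:
    def __init__(self, servers):
        self.servers = servers

    def request(self, app: str, cpus: int) -> str:
        """Place the workload on any server with enough spare capacity."""
        for server in self.servers:
            if server.free_cpus >= cpus:
                server.free_cpus -= cpus
                return f"{app} placed on {server.name}"
        return f"{app} queued: no capacity free"


pool = VirtualPool([Server("box-a", 4), Server("box-b", 8)])
print(pool.request("crm-month-end", 6))  # lands on box-b, the only box with room
print(pool.request("payroll", 4))        # lands on box-a
```

The hard part, as O'Sullivan notes, is not the broker itself but making every platform, interface and standard underneath it behave as one pool.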
But that is not to say there are no opportunities today; it could mean 10 years' worth of consultation and integration work. If companies are no longer so ready to splash out on hardware and software, surely they will invest more in services instead. And services are where the channel's profit margins are these days.
"Our role is to guide corporates through the design dead ends and apply our experience of integrating systems that use different standards," said O'Sullivan. "It's a long tortuous path and, as with many new IT concepts, we may find we come out with something very different from what was envisaged originally."
In the meantime there is plenty for the channel to get involved in, not just in guiding users through the existing integration issues, but in helping them to make the right choices as new strategies are forced on them.
Web services, for example, will have an important role in utility computing, as the interfaces to these vast, fluid computing resources will need to be powerful and flexible. But users will need guidance on picking the standards that support the right web services for them, which will create further consultancy opportunities.
On the other hand, utility computing would be a lot easier if the underlying computing fabric came from one manufacturer. Granted, most big enterprises will want to make do with the systems they have already invested millions in and will strive to reshape them to fit the utility computing model.
But, according to Gartner, there are many enterprises that are considering outsourcing IT to a utility provider.
Of all the major IT manufacturers that have been developing utility computing systems, HP has a massive lead on the likes of Sun, IBM and the rest of the pack. Jim Cassell, Gartner's research vice president, pointed out that HP has an 18-month lead in 'self-managed systems' through its Utility Data Center offering.
Amanda Koenig, HP's infrastructure marketing manager, claims that the company is in talks with three global customers with a view to actual deployments.
Cassell said: "IBM was late in committing to heterogeneous management, but is now taking its Parallel Sysplex technologies, such as partitioning, into open systems. This is a must for IBM, as it has four other platforms besides the zSeries."
But he warned that there is a long way to go, and HP's lead could be neutralised over the next half-decade or so.
"One of the big emotional barriers to the take-up of utility computing is that no one can own the system. Organisations have to have enough complexity to be able to justify getting past that minimum entry cost, from 30 servers and up," he explained.
"In utility computing you cannot license to a particular server. That may require a move to usage-based pricing, something that many software vendors will resist."
The ideal solution?
That said, Cassell maintains that utility computing is an ideal solution for service providers because it allows them to "snap-in customers". But there are still a lot of unanswered questions about the model.
If IT is to be a utility that you can switch on and off, like gas or electricity, will corporations have exactly the same relationship as they have with utility providers?
Will you really be able to switch suppliers? Does that mean it will be sold by door-to-door cold-callers asking us to change our computing supplier? The impression is that, as with the ASP market, nobody has thought through the downside.
Robin Bloor, chief executive of Bloor Research, believes that utility computing is a market worth sticking with for the channel, if you can get in early and establish a lead. "Utility computing is little more than a theory at present, but ultimately, its reach will be far and wide," he said.
There is a huge opportunity for resellers in utility computing, according to Peter Roberts, vice president of partnerships at grid computing solution provider Platform Computing.
"They could be third-party utility computing service providers or provide hardware and software for firms, or be service providers that want to deploy utility computing. Resellers may even adopt a variation of utility computing where servers are rented to end-users that pay for what they use," he said.
"Sectors that have the greatest need for processing power, such as life sciences, financial services and engineering and manufacturing firms that have a large reliance on electronic design applications, provide the greatest opportunity."
THE MAIN WEB SERVICES STANDARDS
Well-Defined Service (WDS) is an XML document that provides non-operational information about a service, such as a description, its service category and the company that created it.
Network Accessible Service Specification Language (NASSL) is an XML-based language that defines what is required to interface with a service at execution time, allowing different web services to work together.
The Universal Description, Discovery and Integration (UDDI) registry allows providers to make service descriptions, product and service listings, and industry classifications available on the network.
WHAT IS UTILITY COMPUTING?
If computer resources could be pooled and allocated according to needs as they arise, there would be massive potential for efficiency savings. But how do we actually achieve true utility computing?
Utility computing consists of four key layers. At the very top is business process outsourcing, where an outside provider handles company accounts or human resources processes, for example.
Beneath that sits the ASP layer, delivering applications to businesses and removing the need for much of the daily IT grind. Underneath that lies a new realm of opportunity: the common data layer (CDL).
Pioneered by peer-to-peer file sharing company Napster, CDL is a prototype for providing a common infrastructure for sharing data between businesses.
Instead of each firm holding its own customer data, the data could be held in a common, possibly distributed, repository, with key IDs pointing the way to each company's information. This would cut down on the huge amount of data duplication that happens in the business world.
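A toy sketch of that key-ID idea, with entirely made-up records, might look like this:

```python
# Toy sketch of a common data layer: one shared customer record,
# referenced by key ID from each business rather than duplicated.
shared_pool = {
    "cust-0001": {"name": "A N Other Ltd", "postcode": "MK9 1AA"},
}

# Each business stores only the pointer plus its own private attributes.
retailer_view = {"customer_id": "cust-0001", "loyalty_points": 420}
insurer_view = {"customer_id": "cust-0001", "policy": "PX-88231"}

# Both resolve the same shared record instead of holding a duplicate copy.
for view in (retailer_view, insurer_view):
    record = shared_pool[view["customer_id"]]
    print(record["name"], view)
```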
Finally, there is the hardware tier which, in utility computing terms, is referred to as grid computing: a collection of disparate processors and other resources working in a co-ordinated fashion to provide computational power.
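In code terms, the grid idea amounts to farming independent chunks of work out to whatever processors are available. A minimal, hypothetical single-machine sketch:

```python
# Minimal sketch of the grid idea: spread independent chunks of work
# across however many processors happen to be available.
from multiprocessing import Pool


def simulate(job_id: int) -> int:
    """Stand-in for a compute-heavy task (risk model, rendering pass, etc.)."""
    return sum(i * i for i in range(100_000)) + job_id


if __name__ == "__main__":
    with Pool() as workers:  # one worker per available processor by default
        results = workers.map(simulate, range(8))
    print(f"Completed {len(results)} jobs across the pool")
```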
CONTACTS
IOKO365 (020) 7297 4660
www.ioko365.com
Platform Computing (01256) 370 500
www.platform.com
Hewlett Packard (01344) 365 706
www.hp.com/largeinfrastructure/utilitycomputing
Bloor Research (01908) 625 100
www.bloor-research.com
Bell Microproducts (020) 8286 5000
www.ideal.co.uk